JAG opened this thread on Mar 11, 2010 · 19 posts
bagginsbill posted Thu, 11 March 2010 at 12:23 PM
It's a faster way of anti-aliasing than super sampling the entire scene.
When you have a texture image with fine, regularly spaced details, such as parallel lines, and you render it up close so that the details are bigger than a rendered pixel, it looks fine. But if you move the object farther from the camera, the details become smaller than a pixel. Then it's a crapshoot whether any individual detail gets picked up by a given pixel. If you don't do something about this, you get aliasing, or worse, moiré patterns.
Try making a texture image with parallel alternating black and white lines. Make some large, others medium, and some really small ones. Put this on a prop and render it at various distances and angles. You'll see all sorts of ugliness. Suppose there's a spot where there are 7 parallel lines spanning 10 pixels in the render. Instead of smoothly varying shades of gray, you'll see white, white, black, white, white, white, black, etc.
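If you'd rather see it in numbers than in a render, here's a tiny Python sketch of the same experiment. Nothing here is Poser-specific - stripe_texture and the 7-texels-per-pixel spacing are just made-up stand-ins for the texture and the viewing distance:

```python
def stripe_texture(u, stripe_width=3):
    """Return 1.0 (white) or 0.0 (black) for texel index u."""
    return 1.0 if (u // stripe_width) % 2 == 0 else 0.0

# Point-sample the texture once per rendered pixel, every 7 texels,
# the way a distant, unfiltered render effectively does.
samples = [stripe_texture(u) for u in range(0, 70, 7)]
print(samples)
# Prints an irregular run of 1.0s and 0.0s (clumps of white and black) instead
# of the smooth mid-gray you'd expect from stripes far smaller than a pixel.
```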
One way to avoid this is to super-sample the scene - i.e. use a very small Min Shading Rate. But doing that costs time on the whole scene.
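In the same toy terms, super-sampling just means taking a bunch of samples inside each pixel and averaging them. The function names and the 16 sub-samples below are illustrative, not anything Poser exposes:

```python
def stripe_texture(u, stripe_width=3):
    return 1.0 if (u // stripe_width) % 2 == 0 else 0.0

def supersampled_pixel(pixel_index, texels_per_pixel=7, subsamples=16):
    """Average many point samples spread across the pixel's footprint."""
    start = pixel_index * texels_per_pixel
    return sum(stripe_texture(int(start + i * texels_per_pixel / subsamples))
               for i in range(subsamples)) / subsamples

print([round(supersampled_pixel(p), 2) for p in range(10)])
# Every pixel now lands near the correct mid-gray, but it took 16 texture
# evaluations per pixel - and a tiny Min Shading Rate pays that kind of cost
# across the entire scene, not just on this one texture.
```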
Texture filtering generates smaller versions of the texture for internal use by the renderer. Each smaller version has the details smoothed out - essentially blurred - so that sampling them far apart picks up the same effective values as super-sampling, but ONLY for this texture image. This greatly speeds up the render without producing aliasing. The renderer chooses which reduced-size version to use based on distance and angle to the camera, on the fly. Effectively, it has pre-calculated, in 2D instead of 3D, a bunch of super-sampled versions of the texture. 2D super-sampling is faster than 3D super-sampling, and since the renderer automatically switches between levels of super-sampling, it automatically adapts to the viewing distance and angle.
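Here's a rough sketch of that idea (it's essentially mip-mapping). The level-selection rule below is the common textbook one - I'm not claiming it's exactly the rule Poser uses internally:

```python
import math

def build_mip_chain(texture):
    """Given a 1-D list of texel values, return [full-res, half-res, quarter-res, ...],
    each level averaging (blurring) pairs of texels from the level above."""
    levels = [texture]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev) - 1, 2)])
    return levels

def filtered_lookup(levels, u, texels_per_pixel):
    """Pick the pre-blurred level whose texel spacing roughly matches the pixel footprint."""
    level = min(int(math.log2(max(texels_per_pixel, 1))), len(levels) - 1)
    mip = levels[level]
    index = min(int(u / (2 ** level)), len(mip) - 1)
    return mip[index]

# 64 texels of 2-texel-wide stripes, viewed so that each pixel covers 8 texels.
texture = [1.0 if (u // 2) % 2 == 0 else 0.0 for u in range(64)]
levels = build_mip_chain(texture)
print([filtered_lookup(levels, p * 8, 8) for p in range(8)])
# Every lookup returns 0.5: the pre-blurred level already holds the average,
# so one cheap 2-D lookup replaces many samples per pixel.
```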
However, I think Poser is a little too aggressive about which super-sampled image to use. We see it using a more blurred version than it needs to. I have never found a way to adjust this, and it happens to varying degrees depending on the size of the original texture image.
When you turn off texture filtering, you are giving up that super-sampling, which means very small features can be lost. But the features that are not lost are sharper. So you have to choose - sharper but inaccurate (perhaps even missing chunks of) details, or smoother, more accurate details. On eyelash transmaps, you really do want to use texture filtering. On skin, maybe not. Maybe you don't care if a mole disappears, as long as other larger features are still sharp.
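To see why a transmap with 1-texel-wide strands cares so much, here's the same kind of toy example. The "eyelash" column and the 8x8 pixel footprint are made up for illustration:

```python
def transmap(u, v, lash_column=5):
    """1.0 where the lash strand is, 0.0 elsewhere, on a hypothetical transmap."""
    return 1.0 if u == lash_column else 0.0

# Far from the camera, one rendered pixel covers an 8x8 block of texels.
block = [(u, v) for u in range(8) for v in range(8)]

point_sample = transmap(0, 0)  # unfiltered: one texel sampled, lash missed entirely
filtered = sum(transmap(u, v) for u, v in block) / len(block)  # filtered: 8/64 = 0.125 coverage

print(point_sample, filtered)
# Unfiltered, the lash vanishes; filtered, it survives as a faint, slightly
# soft strand - the "smoother but more accurate" side of the trade-off above.
```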
Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)