basicwiz opened this issue on Apr 07, 2009 · 88 posts
ice-boy posted Fri, 10 April 2009 at 5:48 AM
Quote -
As an aside, that (very informative) article that bb linked to started off by saying that the Shader Rate is our number one image quality knob. I tend to disagree with that. I think that our number one image quality control is render pixel dimensions. I did a test on Stonemason's Streets of the Med, rendering at 1200x1200 with a shading rate of 0.25, and then at 2400x2400 with a shading rate of 1.0. I did not time the renders exactly, but they took about the same time. I then reduced the larger image in Photoshop Elements to 1200x1200 (using bicubic resampling) and did a close up comparison. The larger render was far and away the better quality image, particularly in respect of anti-aliasing and more accurate reproduction of finer details such as the ivy leaves.
There are probably clear technical reasons why that should be the case, but this empirical evidence is more than enough for me to start doing larger renders in preference to reducing the shading rate.
.
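Just as a back-of-the-envelope check of why those two renders took about the same time: if you assume (and this is only my assumption, not anything from the Poser docs) that the shading work scales roughly with pixel area divided by the shading rate, the numbers come out identical:

    # rough model: shading work ~ pixel area / shading rate
    def shading_samples(width, height, shading_rate):
        return width * height / shading_rate

    print(shading_samples(1200, 1200, 0.25))  # 5,760,000
    print(shading_samples(2400, 2400, 1.0))   # 5,760,000 -- same shading workload

So under that assumption it makes sense that the bigger render at shading rate 1.0 costs about the same as the smaller one at 0.25.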
I just tried this myself.
I rendered an image at 600x400 with a shading rate of 1.
Then I rendered the same image at 1200x800, so twice the size in each dimension, and in Photoshop I reduced the bigger render back down to 600x400. Comparing the two, the downsampled render is clearly better quality.
Hmmm, I will need to do more tests now. A lower shading rate is better, but how much better?
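For reference, the resize step I did in Photoshop can also be sketched in Python with Pillow (bicubic resampling, same as the Photoshop resize; the file names are just placeholders):

    from PIL import Image  # Pillow

    # Reduce the double-size render back to 600x400 with bicubic resampling,
    # the same kind of resize done in Photoshop (file names are placeholders).
    big = Image.open("render_1200x800.png")
    small = big.resize((600, 400), Image.Resampling.BICUBIC)
    small.save("render_reduced_600x400.png")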
I almost always need to apply a little blur after rendering, because the render straight out of Poser is super sharp. Too sharp.
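The blur I apply is just a very light Gaussian blur, roughly like this sketch with Pillow (the radius is a guess, I tweak it by eye):

    from PIL import Image, ImageFilter  # Pillow

    # Take the edge off an over-sharp render with a very mild Gaussian blur.
    img = Image.open("render_600x400.png")
    softened = img.filter(ImageFilter.GaussianBlur(radius=0.5))
    softened.save("render_softened.png")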