kawecki opened this issue on Dec 18, 2006 · 50 posts
rigul64 posted Wed, 20 December 2006 at 10:05 AM
@**Angelouscuitry** The lighting information is gathered from the image's pixels, so it does affect the color of the objects, depending on the image used.
I'll try to explain this as simply as possible.
OK, in simple terms, an HDR image contains so much more lighting information than a standard digital image that it can mimic real-world lighting.
When you use an image to illuminate a scene instead of standard digital lights, this is referred to as Image Based Lighting (IBL). Now, you don't have to use an HDR image for IBL, but HDR images are used because of what I stated previously.
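To see why HDR images are preferred for IBL, here's a tiny sketch (the radiance numbers are made up for illustration). A standard 8-bit image clamps brightness to a 0–255 range, so the sun and a bright wall can end up stored as the same value, while an HDR image keeps floating-point radiance and preserves the real intensity ratio:

```python
def to_ldr(radiance):
    """Simulate storing a radiance value in an 8-bit LDR image pixel."""
    return min(int(radiance * 255), 255)

sun_radiance = 5000.0   # made-up radiance for a direct-sun pixel
wall_radiance = 0.8     # made-up radiance for a sunlit white wall

# In an LDR image, the sun clips to the same maximum value as plain white...
assert to_ldr(sun_radiance) == to_ldr(1.0) == 255

# ...but an HDR image stores the floats directly, so the sun stays
# thousands of times brighter — which is what lets IBL cast strong,
# realistic light and shadows into the scene.
print(sun_radiance / wall_radiance)  # intensity ratio preserved in HDR
```

That clipping is why lighting a scene with a plain JPEG tends to look flat compared to a true HDR image.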
So you have your scene, and it's enclosed in a sphere (a skydome) with the image mapped onto that skydome. Again in simple terms, the information stored in the image's pixels is read and then projected out onto the scene, much like the rays of the sun, and bounced around like the sun's rays, thus illuminating your scene. I would also like to say that it is common to use standard digital lights in conjunction with IBL; one reason I can think of right now is to help define the shadows in the scene.
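The skydome idea above can be sketched in code. This is a hypothetical, simplified example (not how Poser or any particular renderer implements it): treat every pixel of an equirectangular environment image as a tiny light source, compute its direction on the dome, weight its radiance by the angle to the surface and by the pixel's solid angle, and sum the contributions — roughly how the pixels are "looked at" and projected onto the scene:

```python
import math

def diffuse_irradiance(env, normal):
    """Sum light arriving at a surface with the given normal.

    env: a rows x cols grid of scalar radiance values (a toy HDR image
    in equirectangular mapping); normal: a unit (x, y, z) vector.
    """
    rows, cols = len(env), len(env[0])
    total = 0.0
    for r in range(rows):
        theta = math.pi * (r + 0.5) / rows           # polar angle from "up"
        sin_t = math.sin(theta)
        for c in range(cols):
            phi = 2 * math.pi * (c + 0.5) / cols     # azimuth around the dome
            # Direction from the scene out to this pixel on the dome.
            d = (sin_t * math.cos(phi), math.cos(theta), sin_t * math.sin(phi))
            cos_n = sum(a * b for a, b in zip(d, normal))
            if cos_n > 0:                            # light only hits the front
                # Solid angle covered by this pixel on the sphere.
                solid_angle = (math.pi / rows) * (2 * math.pi / cols) * sin_t
                total += env[r][c] * cos_n * solid_angle
    return total

# Toy 4x8 "environment": bright top half (sky), dark bottom half (ground).
env = [[2.0] * 8] * 2 + [[0.1] * 8] * 2
up, down = (0.0, 1.0, 0.0), (0.0, -1.0, 0.0)
# An upward-facing surface receives far more light than a downward one.
assert diffuse_irradiance(env, up) > diffuse_irradiance(env, down)
```

Real renderers sample the image much more cleverly than this brute-force loop, but the principle is the same: the image itself is the light.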
So this is a watered-down explanation, and I hope it helps your understanding of HDRI and IBL.
I have set up some examples using different HDRIs and also a JPEG.