kawecki opened this issue on Dec 18, 2006 · 50 posts
kawecki posted Wed, 20 December 2006 at 12:52 PM
Quote - So you have your scene and it's enclosed in a sphere (skydome), and the image is mapped to the skydome. So again in simple terms, the information stored in the image's pixels is looked at and then projected out onto the scene, much like the rays of the sun, and it is bounced around like the sun's rays, thus illuminating your scene.
This only works for objects that are a perfect, ideal mirror. The illumination of an object is defined by its surface normal, the incidence angle of each light, the angle of the camera, and the BRDF of its surface. None of this data is present in an HDRI image.
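To make the point concrete, here is a minimal sketch (my own illustration, not anyone's renderer) of a classic Phong-style shading computation. Every input named above appears explicitly: the normal, the light direction (incidence angle), and the view direction; the HDRI alone supplies none of them.

```python
import math

def phong_shade(normal, light_dir, view_dir, kd=0.7, ks=0.3, shininess=32):
    """Toy shading model: the result depends on the surface normal, the
    light's incidence angle, and the camera (view) direction -- data that
    an HDRI image by itself does not contain."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def normalize(v):
        n = math.sqrt(dot(v, v))
        return tuple(x / n for x in v)

    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)

    # Lambertian diffuse term: depends on the incidence angle of the light
    diffuse = kd * max(dot(n, l), 0.0)
    # Mirror reflection of the light about the normal
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    # Specular term: depends on the camera angle as well
    specular = ks * max(dot(r, v), 0.0) ** shininess
    return diffuse + specular

# Light straight overhead, camera looking straight down the normal:
print(phong_shade((0, 0, 1), (0, 0, 1), (0, 0, 1)))  # prints 1.0
```

Change the view direction and the specular term changes while the HDRI stays the same, which is exactly why image-based lighting alone cannot define the shading of a general surface.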
The images that you obtain can also be produced by environment mapping, which works in exactly the same way. The only difference is that in the first case you put the image on the skydome, and in the second case you put the texture on the surface of the object (mapped as a reflection of an imaginary mirrored skydome).
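Both techniques sample the image along the same reflected ray; they differ only in where the image lives. A minimal sketch of that shared lookup direction (my own illustration) is the standard reflection formula r = d - 2(d·n)n:

```python
def reflect(incident, normal):
    """Reflect an incoming ray direction about a unit surface normal.
    This reflected direction is what indexes the image, whether the
    image is on a skydome or baked onto the object as an environment map."""
    # r = d - 2 (d . n) n, with |normal| = 1
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

# A ray travelling down-and-right hits a floor whose normal points up (+y):
print(reflect((1, -1, 0), (0, 1, 0)))  # prints (1.0, 1.0, 0)
```

Since the same direction is used in both cases, a mirrored object lit by a skydome and one textured with the equivalent environment map look identical from a given viewpoint.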
The work of Paul Debevec is very interesting, but if you look more closely, it is directed toward the reconstruction of antique objects and architecture.
Stupidity also evolves!