



Subject: What is HDRI?


Philywebrider ( ) posted Sat, 09 October 2004 at 6:33 AM · edited Wed, 27 November 2024 at 2:03 AM

What is HDRI?


RubiconDigital ( ) posted Sat, 09 October 2004 at 6:48 AM

Attached Link: http://www.debevec.org/

In a nutshell, it stands for High Dynamic Range Images. They are images that contain more information than the usual 24 bits we are used to. HDR images can be used to illuminate a scene, using just the colour and brightness values in the image. This means you can realistically light a scene with no lights. There's more to it than that, but that's the bare bones basics of it. The link here is a good place to start. I had another good one, but the site seems to be down.


swfreeman ( ) posted Sat, 09 October 2004 at 6:50 AM

Ordinary digital images in RGB format only contain up to 16.7 million colours (8 bits per colour channel, or 24-bit; an 8-bit alpha channel brings the file to 32-bit). An HDRI (High Dynamic Range Image) can contain a helluva lot more. By taking a picture at several different exposures and combining them into one HDRI, you get an image with far more colour data than 24-bit: 16 bits per colour channel, making it a 48-bit image, or 64-bit if you add an alpha channel.

HDRI is used to light environments (quite an easy way to make a scene look somewhat realistic if you already have an HDRI) and for reflection. Real-life speculars are in fact reflections of the light source, and calculating that properly in 3D takes time. That's why we use specular shading models (Blinn, Phong and so on). An HDRI reflection, however, can create more realistic-looking speculars, since they are based on the HDRI's brightest spots rather than on point lights (ordinary omni, spot and directional lights).
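
That "combine several exposures" step is exactly what HDR assembly tools automate. A minimal sketch of the merge, assuming OpenCV is available; the file names and exposure times below are placeholders, not anything from this thread:

    import cv2
    import numpy as np

    # Hypothetical bracketed shots of the same scene (placeholder names and times).
    files = ["probe_minus2ev.jpg", "probe_0ev.jpg", "probe_plus2ev.jpg"]
    exposure_times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)

    images = [cv2.imread(f) for f in files]  # ordinary 8-bit-per-channel inputs

    # Recover the camera response curve, then merge into a floating-point radiance map.
    calibrate = cv2.createCalibrateDebevec()
    response = calibrate.process(images, exposure_times)
    merge = cv2.createMergeDebevec()
    hdr = merge.process(images, exposure_times, response)  # float32, values can exceed 1.0

    cv2.imwrite("probe.hdr", hdr)  # Radiance .hdr keeps the full brightness range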


swfreeman ( ) posted Sat, 09 October 2004 at 6:53 AM

[Attached image: file_133288.jpg]

The background is in fact a sphere surrounding the scene, with the HDRI texture applied, which illuminates the scene.


swfreeman ( ) posted Sat, 09 October 2004 at 6:55 AM

What is HDRI? HDRI stands for "High Dynamic Range Image". In short, it is a specialized photographic technique that allows one to capture the exact lighting conditions and reflections on-set. By using HDRI maps in 3D software, one is able to seamlessly integrate computer-generated objects into a live-action environment and achieve completely believable results. This is because the correct lighting, highlights, contact shadows and reflections are contained in the HDRI photographs captured on-set during the live-action shoot. The HDRI data can then be used in 3D to recreate the exact lighting conditions that were present in the 'real world' at a specific moment in time. Realism is thus far easier to achieve in 3D, and since one does not have to match the lighting conditions from scratch, HDRI dramatically cuts down on lighting time in production.

How does HDRI work? The on-set lighting conditions are captured with a fisheye lens (capable of taking 180-degree photographs). An HDRI photograph is unique in that it stores both colour and intensity values of the environment: the image integrates numerous exposures (camera stops) of the environment in one file at an extraordinarily high colour depth and image resolution. This allows one to realistically control the lighting of computer-generated objects in a real or 3D environment. In 'old school' post-production terms, the level of exposure control that HDRI offers in 3D can be compared to the flexibility of grading 35mm film stock in telecine.

How can we use HDRI in post-production? HDRI can be used in all the high-end 3D software packages today. Once the HDRI photographs are unwrapped and converted, they can be used in production, saving on 3D lighting and texturing time and ensuring realistic results for film projects and high-end television commercials.
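
The "unwrapped and converted" step usually means turning an angular-map light probe (the mirror-ball-style images Debevec distributes) into the latitude-longitude panorama most renderers expect. A rough sketch of that unwrap, assuming OpenCV for the .hdr file I/O, a square probe, and the convention that the image centre looks straight down the -Z axis; the file names are placeholders:

    import numpy as np
    import cv2

    probe = cv2.imread("grace_probe.hdr", cv2.IMREAD_UNCHANGED)  # angular map, square, float32
    size = probe.shape[0]
    out_w, out_h = 1024, 512

    xs, ys = np.meshgrid(np.arange(out_w), np.arange(out_h))

    # Direction for every output pixel: longitude runs left to right, latitude top to bottom.
    lon = (xs + 0.5) / out_w * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (ys + 0.5) / out_h * np.pi
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = -np.cos(lat) * np.cos(lon)

    # Angular-map lookup: radius on the probe is proportional to the angle away from -Z.
    angle = np.arccos(np.clip(-dz, -1.0, 1.0))
    flat = np.sqrt(dx * dx + dy * dy) + 1e-8
    u = dx / flat * (angle / np.pi)   # both in [-1, 1]
    v = dy / flat * (angle / np.pi)

    px = np.clip(((u + 1.0) * 0.5 * (size - 1)).astype(int), 0, size - 1)
    py = np.clip(((1.0 - (v + 1.0) * 0.5) * (size - 1)).astype(int), 0, size - 1)

    cv2.imwrite("grace_latlong.hdr", probe[py, px])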


RubiconDigital ( ) posted Sat, 09 October 2004 at 7:03 AM

Attached Link: http://www.highpoly3d.com/writer/tutorials/hdri/hdri.htm

Here's that link I was looking for. Actually, I think you'll find hdr images have a lot more than 16 bits per channel.
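
For the record, the common Radiance .hdr format doesn't store fixed 16-bit channels at all; it packs each pixel as three 8-bit mantissas plus a shared 8-bit exponent (RGBE), which decodes to floating-point radiance covering an enormous range. A small sketch of that encoding, just to show how a pixel well above display white survives the round trip:

    import numpy as np

    def float_to_rgbe(rgb):
        # The brightest channel sets the shared exponent; the others reuse its scale.
        brightest = max(rgb)
        if brightest < 1e-32:
            return (0, 0, 0, 0)
        mantissa, exponent = np.frexp(brightest)      # brightest = mantissa * 2**exponent
        scale = mantissa * 256.0 / brightest
        return (int(rgb[0] * scale), int(rgb[1] * scale), int(rgb[2] * scale),
                int(exponent) + 128)

    def rgbe_to_float(rgbe):
        r, g, b, e = rgbe
        if e == 0:
            return (0.0, 0.0, 0.0)
        scale = np.ldexp(1.0, e - 136)                # 2**(e - 128 - 8)
        return ((r + 0.5) * scale, (g + 0.5) * scale, (b + 0.5) * scale)

    # A pixel six times brighter than display white still comes back intact.
    packed = float_to_rgbe((6.0, 1.5, 0.2))
    print(packed, rgbe_to_float(packed))

The shared exponent is the trick: a file that looks 8-bit per channel on disk can hold intensities thousands of times above monitor white.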


Aeneas ( ) posted Sat, 09 October 2004 at 7:09 AM

On a monitor, nothing can be brighter than the lightest colour it can reproduce. Compared to the light of a flash, or the sun, this is nothing. Yet it is that range of light intensity that makes objects look the way they do in real life. This is why scientists developed a technique for adding "act-as-if" intensity to the lightness of an image so that it becomes more natural. Such an image is called a high dynamic range image, or HDRI. These images can be used in certain software to extend the dynamic range of a render. Unfortunately, not all software can make use of this technique.
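
Getting those stored intensities back onto an ordinary monitor means compressing them into the displayable range (tone mapping). A minimal sketch using Reinhard's simple global operator, assuming OpenCV and a placeholder .hdr file:

    import numpy as np
    import cv2

    hdr = cv2.imread("probe.hdr", cv2.IMREAD_UNCHANGED)  # linear float32, values can exceed 1.0

    # Reinhard's global operator squeezes unbounded radiance into [0, 1) for display,
    # then a rough 2.2 gamma approximates a typical monitor's response.
    ldr = hdr / (1.0 + hdr)
    ldr = np.power(ldr, 1.0 / 2.2)

    cv2.imwrite("probe_preview.png", np.clip(ldr * 255.0, 0, 255).astype(np.uint8))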

I have tried prudent planning long enough. From now I'll be mad. (Rumi)


AntoniaTiger ( ) posted Sat, 09 October 2004 at 9:28 AM

Sounds like there are two different things involved...

The brightness range: digging out my stuff on photography, 8 bits is a slightly wider range than the best film can record, and it's close to the range of brightness the eye can deal with at one time. It's convenient for computers, but it also works for what we can see. Trouble is, as Aeneas points out, reality has a higher brightness range. The photographer, cine or still, exposes the scene to get the usable brightness range to record what she's interested in. The human eye adapts. We don't see everything at once, but we can look at every part of the scene and see it. So eight bits per channel for the final image is quite all right, but a few extra bits while generating that image isn't a waste.

And we can do things which aren't so easy for the photographer. In effect, in the virtual world of CGI, we can do what used to be done in a darkroom, without the intermediate of a negative. The gamma curve, for instance: this is the relation between input and output, and in the darkroom it shows up in the contrast grade of the photographic paper. There are ways of developing the film to get similar changes in the film itself. In the digital world we don't risk losing, or spoiling, the film in the developing process. BTW, you do know that a print on paper can never show the brightness range available from a monitor or projected image?

Now, the part that puzzles me, which I can't relate to my experience of film photography, is this stuff about lighting. I'm getting some hints about it, but I'm not sure it's entirely relevant to what many of us do. Are we combining CGI with real images?
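
That "darkroom without the negative" idea is easy to show: with linear floating-point data you can re-expose and re-grade after the render, instead of baking one exposure into 8-bit output. A small sketch, again assuming OpenCV and a placeholder .hdr file:

    import numpy as np
    import cv2

    hdr = cv2.imread("kitchen_probe.hdr", cv2.IMREAD_UNCHANGED)  # linear float radiance

    def develop(image, stops, gamma):
        # Push or pull the exposure by whole stops, then apply a display gamma curve,
        # much like choosing a different print exposure and paper grade.
        exposed = image * (2.0 ** stops)
        clipped = np.clip(exposed, 0.0, 1.0)
        return (np.power(clipped, 1.0 / gamma) * 255.0).astype(np.uint8)

    # Two different "prints" from exactly the same data.
    cv2.imwrite("print_darker.png", develop(hdr, -1.0, 2.2))
    cv2.imwrite("print_brighter.png", develop(hdr, +2.0, 1.8))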


ockham ( ) posted Sat, 09 October 2004 at 12:52 PM

Is it something like compression in audio? That is, reshaping the color intensities in a logarithmic way to compensate for the logarithmic nature of the sensor?

My python page
My ShareCG freebies


SteveJax ( ) posted Sat, 09 October 2004 at 3:44 PM

How do you use HDR files in Poser? I tried following Stefan's example, loading a sphere and applying the HDR file as a ProbeLight node, but it wouldn't load.


RubiconDigital ( ) posted Sat, 09 October 2004 at 6:18 PM

Steve, it's impossible to use hdr images in Poser. Poser won't accept the .hdr image format for a start, and its somewhat rudimentary render engine is totally incapable of doing any radiosity calculations. You'll need software that can specifically calculate these types of solutions - LightWave, Max, Maya and the like, and now Carrara and Vue as well, apparently. There is a poor man's HDRI setup available at, er, RDNA I think, that attempts to simulate this type of thing, but it's still a fake. I'm not sure I understand that question, ockham :)


SteveJax ( ) posted Sat, 09 October 2004 at 6:35 PM · edited Sat, 09 October 2004 at 6:36 PM

Yeah, I figured out the Poorman's Poser workaround from Eric's tutorial link at Stephan's site, and you're right. It doesn't suffice.



ynsaen ( ) posted Sat, 09 October 2004 at 7:32 PM

Firefly is capable of IBL-based rendering using HDRI natively. The ProbeLight node is the current (sad) access to this capability. It does not accept .hdr files; it does accept IBL values created from .hdr files. Steve -- the script is run outside of Poser, in a command-line window. It creates a list of values that are then typed into the value parameters for the ProbeLight node. By hand, lol. This ProbeLight node is then copied to every surface material in the scene (copy-paste is useful). When doing so, it is generally better to plug it into the alternate diffuse channel. The PMHDRI setup is a fake -- it uses actual lights. To use that approach fully, one does not use any of the lights within Poser.
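
For anyone curious what such an external script actually computes, here is a minimal sketch of the idea: boil an HDR environment map down to a short list of light colours and directions that could then be typed into a light rig by hand. The file name, the 4x5 grid and the lack of solid-angle weighting are all simplifications of mine, not the behaviour of the actual script:

    import numpy as np
    import cv2

    env = cv2.imread("grace_latlong.hdr", cv2.IMREAD_UNCHANGED)  # lat-long map, float32 BGR
    h, w, _ = env.shape
    rows, cols = 4, 5                                            # 20 lights, like the fake setup

    for i in range(rows):
        for j in range(cols):
            # Average radiance over one cell of the panorama (no solid-angle weighting,
            # so cells near the poles count for more than they should).
            cell = env[i * h // rows:(i + 1) * h // rows,
                       j * w // cols:(j + 1) * w // cols]
            b, g, r = cell.reshape(-1, 3).mean(axis=0)

            # Direction at the cell centre: latitude from the top edge, longitude from the left.
            lat = np.pi / 2 - (i + 0.5) / rows * np.pi
            lon = (j + 0.5) / cols * 2 * np.pi - np.pi
            d = (np.cos(lat) * np.sin(lon), np.sin(lat), -np.cos(lat) * np.cos(lon))

            print(f"light {i * cols + j:2d}: rgb = ({r:.3f}, {g:.3f}, {b:.3f}), "
                  f"dir = ({d[0]:+.2f}, {d[1]:+.2f}, {d[2]:+.2f})")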

thou and I, my friend, can, in the most flunkey world, make, each of us, one non-flunkey, one hero, if we like: that will be two heroes to begin with. (Carlyle)


SteveJax ( ) posted Sat, 09 October 2004 at 8:11 PM

Now you tell me! O.o It only took me an hour to figure out that .hdr files were worthless in Poser, and the script file at Stephan's site doesn't like the HDR files I spent time downloading from http://athens.ict.usc.edu/Probes/. I did finally get the script to create the 20-light fake setup in a pre-existing PZ3 where I'd deleted the lights, and it turned out, er, um, well, OK, if you think a Battlestar Galactica Viper should be that color in real life. LOL!


flyerx ( ) posted Sun, 10 October 2004 at 6:13 PM

Attached Link: http://user.txcyber.com/~sgalls/

With Poser or DAZ Studio you can do a simulation of HDRI illumination using PoseRay.

PoseRay home:
http://user.txcyber.com/~sgalls/

later,

FlyerX

