Philywebrider opened this issue on Oct 09, 2004 · 15 posts
AntoniaTiger posted Sat, 09 October 2004 at 9:28 AM
Sounds like there are two different things involved...

The brightness range -- digging out my stuff on photography, 8 bits per channel covers a slightly wider range than the best film can record. It's close to the range of brightness the eye can deal with at any one moment. It's convenient for computers, but it also works for what we can see. Trouble is, as Aeneas points out, reality has a higher brightness range. The photographer, cine or still, exposes the scene so the usable brightness range records what she's interested in. The human eye adapts. We don't see everything at once, but we can look at every part of the scene and see it.

So eight bits per channel for the final image is quite alright. But a few extra bits while generating that image isn't a waste. And we can do things which aren't so easy for the photographer. In effect, in the virtual world of CGI, we can do what used to be done in a darkroom, without the intermediate of a negative. The gamma curve, for instance. This is the relation between input and output, and in the darkroom it shows up in the contrast grade of the photographic paper. There are ways of developing the film to get similar changes in the film itself. In the digital world we don't risk losing, or spoiling, the film in the developing process. BTW, you do know that a print on paper can never show the brightness range available from a monitor or projected image?

Now, the part that puzzles me, which I can't relate to my experience of film photography, is this stuff about lighting. I'm getting some hints about it, but I'm not sure it's entirely relevant to what many of us do. Are we combining CGI with real images?
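To make the gamma-curve point above concrete, here is a minimal sketch in Python with NumPy (my own illustration with made-up sample values, not anything from the thread): keep the working image in floating point while rendering, apply a gamma curve as the input-to-output relation, and only quantise to 8 bits per channel for the final image.

```python
import numpy as np

def gamma_encode(linear, gamma=2.2):
    """Map linear scene brightness to display values via a gamma curve."""
    clipped = np.clip(linear, 0.0, 1.0)      # brightness above 1.0 can't be shown; it clips to white
    return np.power(clipped, 1.0 / gamma)    # the input/output relation described in the post

# Hypothetical high-precision "scene" values; the 4.0 is brighter than the displayable range.
scene = np.array([0.001, 0.02, 0.18, 0.5, 1.0, 4.0], dtype=np.float64)

display = gamma_encode(scene)                            # still floating point at this stage
eight_bit = np.round(display * 255).astype(np.uint8)     # final 8-bit-per-channel image

print(eight_bit)   # [ 11  43 117 186 255 255] -- the over-bright value clips to 255
```

The extra precision only matters before the final quantisation step: adjusting exposure or the gamma value on the floating-point data loses nothing, whereas doing the same on the 8-bit result throws information away, much like reprinting from a negative versus copying a print.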