vincebagna opened this issue on May 04, 2010 · 14 posts
bagginsbill posted Tue, 04 May 2010 at 11:24 AM
Quote - Oh, and Bill, you talked some time ago about a sort of volumetric effect shader, something you applied to a sphere to get that sort of magical spell effect. Could you talk more about it? :)
I was actually trying to make a cloud generator for the EnvSphere, something that generates 100 miles of clouds. I did get something that was moderately correct, but I never got it perfect. I lost interest.
However, for a localized cloud (contained within a 50 foot or smaller prop) it worked pretty well.
It involves a lot of trigonometry.
Here's how I did it.
First I assumed the camera was at 0, 0, 0. This isn't true, but it's close enough to make it work OK.
The shader calculates the vector from the camera to the point being shaded. This gives me the altitude and a direction along which I can project the camera's line of sight out into space.
From the shaded point, I then project multiple coordinates along that view vector. At each of these points, I evaluate a cloud-type node, such as fBm or Fractal_Sum.
These values are blended together, from back to front, forming a sampling of how light rays from all those points would combine as they move towards the camera.
The more points I use, the better the approximation becomes. In the limit as the number of points approaches infinity, a perfect volumetric rendering of the 3D cloud is possible. But that isn't practical.
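Outside of Poser's node system, the idea is easier to show in ordinary code. Here's a minimal Python sketch of that back-to-front sampling; fbm_density is only a stand-in for the fBm/Fractal_Sum node, and the function names, sample count, and depth are illustrative, not what the actual shader uses:

```python
import math

def fbm_density(x, y, z):
    """Stand-in for Poser's fBm / Fractal_Sum node: returns a cloud
    density in [0, 1] at a 3D point. Any smooth noise would do."""
    return 0.5 + 0.5 * math.sin(0.31 * x) * math.sin(0.27 * y) * math.sin(0.23 * z)

def cloud_along_ray(shaded_point, num_samples=8, depth=50.0):
    """Sample density at several points along the camera-to-surface ray
    and blend them back to front, the way the shader approximates the
    volume. The camera is assumed to sit at the origin."""
    px, py, pz = shaded_point
    dist = math.sqrt(px * px + py * py + pz * pz)
    dx, dy, dz = px / dist, py / dist, pz / dist   # view direction

    accumulated = 0.0
    # Walk from the farthest sample toward the camera (back to front).
    for i in reversed(range(num_samples)):
        t = dist + depth * i / max(num_samples - 1, 1)
        density = fbm_density(dx * t, dy * t, dz * t)
        alpha = density / num_samples
        # "Over" blend: each nearer slab partially covers what lies behind it.
        accumulated = accumulated * (1.0 - alpha) + alpha
    return accumulated

# Example: a point on a cloud prop roughly 20 feet in front of the camera.
print(cloud_along_ray((5.0, 8.0, 17.0)))
```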
With only a finite number of sample points, I had to choose a spacing for them. In the case of the EnvSphere, I first used simulated concentric spheres and took the points where the view vector intersects those spheres. But the cloud layer is actually bounded by a lower and an upper altitude. Intersecting the spheres with these boundaries leaves many samples lying outside the cloud layer, wasted calculations that produce "nothing here".

So I attempted to slide the samples down so they always fall within the cloud layer. But as the view vector approaches horizontal, the spacing increases dramatically. Looking straight up, the samples might be 50 feet apart; looking close to the horizon, they would be miles apart. The result was that at certain angles the illusion of smoothly varying cloud density would disappear and the concentric spheres would become visible.
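To put rough numbers on that (a flat-slab approximation of my own, not the shader itself): the length of the ray's path through a layer of fixed thickness grows as 1 / sin(elevation), so the same sample count gets stretched over a vastly longer path near the horizon:

```python
import math

def slab_path_length(elevation_deg, thickness_ft=50.0):
    """Length of a view ray's path through a flat cloud slab of the given
    thickness, for a camera below the slab. Illustrative numbers only."""
    return thickness_ft / math.sin(math.radians(elevation_deg))

for deg in (90, 30, 5, 1, 0.25):
    print(f"{deg:>5} deg elevation: {slab_path_length(deg):>8.0f} ft inside the layer")
```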
I then tried various ways of skewing or bending the vector, but nothing worked right. Or I wasn't implementing them correctly. Either way, it became an overwhelmingly difficult problem to work on using only Poser nodes. If I could just write a straightforward program with variables and loops, I think I could do it. But just expressing everything with a constant set of nodes seems very difficult.
These sampling problems don't occur in the bounded cloudy sphere case. Uniform sampling is pretty good.
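For the contained prop, "uniform" just means evenly spaced samples between where the ray enters the bounding sphere and where it exits. A rough sketch of that, using standard ray-sphere intersection (again, not the literal node setup):

```python
import math

def ray_sphere_span(origin, direction, center, radius):
    """Distances along the ray (direction assumed normalized) where it
    enters and exits the sphere, or None if it misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    root = math.sqrt(disc)
    return (-b - root) / 2.0, (-b + root) / 2.0

def uniform_sample_distances(t_near, t_far, n=8):
    """Evenly spaced distances between entry and exit; the spacing stays
    the same whichever way the ray points, which is why this case behaves."""
    step = (t_far - t_near) / n
    return [t_near + step * (i + 0.5) for i in range(n)]

# Camera at the origin looking down +Z at a 50-foot cloud sphere 100 feet away.
span = ray_sphere_span((0, 0, 0), (0, 0, 1), (0, 0, 100), 25.0)
if span:
    print(uniform_sample_distances(*span))
```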