
Subject: Nodes for Dummies


RobynsVeil ( ) posted Thu, 28 May 2009 at 8:04 PM

You had mentioned at some point in the past I needed to combine functionality into um, functions, I guess. I was clueless on where to begin. As I look over your code, I can see more clearly why I was clueless: I was missing a lot of the major building blocks for modularization. As I am now painfully (embarrassingly) aware, my code reflects a profound lack of knowledge in Python programming. I'll get it, but it may take a bit.

This isn't just a script for a shader, Bill, it's a mini-course on Python programming. I hope you don't mind if I ask a few gazillion questions as to why and how and what is this doing and...

I've never seen anything so elegant... nevertheless, it will take some real concentration to even understand this code, let alone write anything along those lines or with that degree of sophistication.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


bagginsbill ( ) posted Thu, 28 May 2009 at 8:06 PM

Just what I thought you'd say. Ask away...


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


RobynsVeil ( ) posted Thu, 28 May 2009 at 9:30 PM

Thanks for that, Bill. Really going to try to keep my questions to one concept at a time...

Quote - def PM(x, lbl): return Add(x).labelled("PM:" + lbl)
PM makes a parameter node - takes a number and a label.

--Um, this "parameter node" is a math_function (Add) node?
--with all the goodies stubbed in, like what value it's going to accept (x) and "PM:" plus the label?

That init function... that's a new critter for me.
--Are we actually creating a new class, upon which objects can be based?
--So, when defining a class, that init thingie is pretty much required?
--This is not a constructor, just a class definition?
--And you define this class "CSB" as an object-type class with that parameter "object"?

I'll send this first, and puzzle some more over the rest. I like this language the more I see (understand) of it.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


RobynsVeil ( ) posted Thu, 28 May 2009 at 10:15 PM

Getting back to that init function, there's really quite a bit to it, isn't there?
You're passing 4 parameters:
item, color, shine, bump

One of those parameters (the first one) is going to define an object, and the other three parameters are actually going to be members of this object:
        item.color = color
        item.shine = shine
        item.bump = bump

With these three statements, you actually assign those parameters to the object as members of that object?

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


bagginsbill ( ) posted Thu, 28 May 2009 at 10:25 PM

Quote - Thanks for that, Bill. Really going to try to keep my questions to one concept at a time...

Quote - def PM(x, lbl): return Add(x).labelled("PM:" + lbl)
PM makes a parameter node - takes a number and a label.

--Um, this "parameter node" is a math_function (Add) node?

Yes. If you look in many of my shaders, you'll see nodes whose label is PM:something. These are parameter nodes - nodes I've called out so you know that this is a value you are meant to change. The label also tells my parmatic utility that it is a shader parameter, so people can manage parameter values on the figure using parmatic. They don't have to use VSS. Parmatic can create a dial on the figure for each of these parameter nodes, and no matter how many of them there are, users only have to spin one dial to change all the ones that share the same label, across many material zones. That's another story, not worth going into right now. It suffices to understand that a parameter node marks a point where I want the user to clearly see they can tweak a value or connect something into the shader. In most of my newer shaders, I automagically arrange all these down the left side of the material room so they can be found easily. I didn't show you how to do that in this demo. One thing at a time.

Quote -
--with all the goodies stubbed in, like what value it's going to accept (x) and "PM:" plus the label?

Yes - it says what it does. It creates a Math:Add node, plugs x into Value_1 and labels it "PM:" + label.

Quote - That init function... that's a new critter for me.
--Are we actually creating a new class, upon which objects can be based?

The "class CSB" statement is creating the new class. The __init__ is the constructor for the class, i.e. this is the procedure for building an instance of the class. There are many "plumbing" functions in Python class definitions like this - they all begin and end with double underscore. The __init__ function is just one of them. Another would be __add__, which is instructions on what to do if an instance of this class is used in an expression involving the + operator, as in a + b, where a is an instance of the class. There are tons of these and they are magical because of the name. Over time you can get to know them all. But the first and most important is the constructor method. The matmatic node classes are filled with these. That's why you can do stuff like Clouds() + Spots(). That's adding two nodes together, which requires the creation of yet another node to represent the sum.
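To make the double-underscore point concrete, here is a minimal, self-contained sketch (plain Python, nothing matmatic-specific; the Vec class is invented purely for illustration) showing __init__ building an instance and __add__ defining what + means for it:

    # Minimal illustration of "plumbing" methods: __init__ builds an instance,
    # __add__ says what a + b means when a is an instance of this class.
    class Vec(object):
        def __init__(self, x, y):      # constructor: runs when you call Vec(...)
            self.x = x
            self.y = y

        def __add__(self, other):      # invoked for: self + other
            return Vec(self.x + other.x, self.y + other.y)

    a = Vec(1, 2)
    b = Vec(3, 4)
    c = a + b                          # Python calls a.__add__(b)
    print(c.x, c.y)                    # 4 and 6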

Quote -
--So, when defining a class, that init thingie is pretty much required?

No it isn't required. You can inherit the constructor of the base class, and if there are no additional steps to build your particular type of object, then you don't have to implement your own constructor. However, if you have additional steps, or additional arguments to process that the base class doesn't already understand, then you must implement a constructor.
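A small sketch of that point, with made-up class names: the first subclass has no extra steps, so it simply inherits the base constructor; the second adds an argument, so it defines its own __init__ and hands the rest to the base class:

    class Node(object):
        def __init__(self, label):
            self.label = label

    class PlainNode(Node):
        pass                            # no extra steps: inherits Node.__init__ as-is

    class TintedNode(Node):
        def __init__(self, label, tint):
            Node.__init__(self, label)  # do the base-class construction first
            self.tint = tint            # then the additional step

    p = PlainNode("diffuse")            # works: uses the inherited constructor
    t = TintedNode("diffuse", "blue")   # works: uses its own constructor
    print(p.label, t.label, t.tint)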

Quote -
--This is not a constructor, just a class definition?

I don't know what you're referring to when you used the word "This". CSB is a class, and __init__ is a member function of the class whose job is to be the constructor for the class. CSB is not the constructor, __init__ is, and __init__ is not a class, it is the constructor for the CSB class.

Quote -
--And you define this class "CSB" as an object-type class with that parameter "object"?

Yes, the argument to a class declaration is the list of base classes from which your new class is derived, or inherits from. Since I had no need of any more interesting base class, I derived from the most basic class (or type) that Python has - the object class. This is the base class of everything and it's what you use when you have nothing in particular in mind to start with. All I wanted was a very simple type of object that has three attributes and one new method (mix), in addition to the methods that all objects have, because they are objects.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Thu, 28 May 2009 at 10:28 PM

Quote - Getting back to that init function, there's really quite a bit to it, isn't there?
You're passing 4 parameters:
item, color, shine, bump

One of those parameters (the first one) is going to define an object, and the other three parameters are actually going to be members of this object:
        item.color = color
        item.shine = shine
        item.bump = bump

With these three statements, you actually assign those parameters to the object as members of that object?

Correct. All member functions of a class are instructions on how to do something with instances of that class. The __init__ function is the special member function describing what to do with a brand new one, and what arguments are necessary in order to create one.

The constructor is invoked by a syntax that appears to call the class name as if it were a function.

CSB(1, 2, 3)

Is actually creating a new instance of the CSB class. This is handled by the internal plumbing of the Python engine. Let's call that newcsb. Then the internal plumbing of the Python engine calls newcsb.__init__(1, 2, 3).

In the implementation of __init__, I copy the 3 arguments into attributes of the object, thus they become members of the object. Until I did that, the instance did not actually have any attributes by those names.
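Roughly what the Python plumbing does with CSB(1, 2, 3), shown with a cut-down stand-in class (plain numbers rather than shader nodes, purely for illustration):

    class CSB(object):                              # stand-in: stores plain numbers
        def __init__(item, color, shine, bump):     # "item" is the new instance, usually spelled "self"
            item.color = color
            item.shine = shine
            item.bump = bump

    newcsb = CSB(1, 2, 3)    # Python builds an empty instance, then calls newcsb.__init__(1, 2, 3)
    print(newcsb.color, newcsb.shine, newcsb.bump)  # 1 2 3 - the attributes exist only after __init__ ran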


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Fri, 29 May 2009 at 9:31 AM

just a quick note to say i have to look at this closely tonight or over the weekend.  but i am following, and appreciate the lesson.



RobynsVeil ( ) posted Fri, 29 May 2009 at 7:45 PM

Quote - def CSBSurface(item):

Here's the meat. And no, I haven't seen formulas like this anywhere. I spent ages trying to get my head around your reflectionValue Blender node.  Your accompanying explanation pretty much covered what was happening here but questions remain.

There are two levels of inquiry here. One: Python (Poser) programming. Creating classes is a new wrinkle, so I'll have to play with that for a bit to make sure I get it. You've covered that bit really well.
The other, more elusive aspect is your formulas... and for now I'm just going to accept them as is, without any discussion. It's like my first class in nursing, when the instructor asked "Any questions?" and we looked at each other in bewilderment... no one had a clue, no one understood enough to even pose any questions.

I can ask programming questions. Formulas? Not yet. They are obviously the product of years and thousands of hours of experimentation and a profound knowledge of mathematics as it applies to (poser) functions.

This doesn't mean I don't want to know. I just don't want to ask for information I'm not ready to process yet.

= Blend((1 - .18 * EdgeBlend(1, 0, 1)) ** 30.8, 1, .03)
is not
= Diffuse(color, .85 * (1 - specular))
in terms of degree of complexity. I can get my head around what thought process led to the .85 value for diffuse: trial and error? and the (1 - specular) is brilliant and yet the logic is quite easy to follow.
That Blend for reflectionValue, though... I would never have thought to plug in an EdgeBlend into the input_1 channel. And upping that whole value to the power of 30.8???

What's equally important to me was having a more macro look at this script in terms of anti-gamma -> process -> gamma. I'll admit, Bill, I look at all these formulas with an ulterior motive: to apply the information you've provided in this shader to gamma-correcting - heck, or just plain-ol' correcting - existing shaders.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


kobaltkween ( ) posted Mon, 01 June 2009 at 12:40 AM · edited Mon, 01 June 2009 at 12:41 AM

ok, i think  i have a quick question.

i have your sRGB material for the artistic lens as an example of an end product.
i have your incorrect (according to you, i wouldn't have spotted the problem) sRGB Matmatic functions

Quote - def IF(test, tv, fv):
    return Blend(fv, tv, test)

def SRGB(x):
    return IF(x <= .0031308, 12.92 * x, 1.055 * (x ** (1/2.4)) - .055)

def ASRGB(x):
    return IF(x <= .04045, x / 12.92, ((x + .055) / 1.055) ** 2.4)

so now i'm just playing with how to make a switch and apply different functions.  i know the node chain i want to make, but i can't seem to get there.  just to play and try and understand switches, i've tried

Quote -
def combineCorrect(test, f1, f2):
    return Add(test * f1, (1 - test) * f2)

def testSwitch(x):
    return combineCorrect(x <= 0.5, -0.25 * x, x * 1.75)

so what i'm not getting when i look at the material i make is actual numbers for 0.25 and 1.75.  they end up as colors in a color math node instead of numbers feeding into a color math node.  so i'm wondering how i should express it to get a color math node multiplying the input of x by the output of a math node holding a numeric value, rather than by a color swatch.
 



kobaltkween ( ) posted Mon, 01 June 2009 at 12:45 AM

oh, and just to say, i know i may be coming at it completely wrong, so feel free to correct me.



bagginsbill ( ) posted Mon, 01 June 2009 at 1:48 AM · edited Mon, 01 June 2009 at 1:50 AM

Any place you use a number in combination with a color, the number gets promoted to a color by repeating that value in all three color components. Any place a color is needed and you use a number, such as .5, it just turns that into Color(.5, .5, .5).

Well after studying how Poser works, I discovered that it accepts hypercolors for color parameters. What this means is, instead of building a separate math node, putting 1.75 in it, and plugging that into a white input, I just put Color(1.75, 1.75, 1.75). It turns into the same thing, without having an additional do-nothing node just to hold a number.

We cannot enter or even correctly edit numbers like this in the material room. They display incorrectly in the parameter value. But they are really there.

If it bothers you to have numbers as hyper-colors in the material that are impossible to enter by hand, or you would like to be able to manipulate these numbers while in the material room, just wrap them in a call to Add, as in

return combineCorrect(x <= Add(.5), Add(-.25) * x, x * Add(1.75))

Calling Add is asking matmatic to explicitly create a Math:Add node, with Value_1 being what you pass. You can pass Value_2 as well, but we don't need a Value_2 and it defaults to 0. In fact we don't need a Math:Add at all, but Poser does not have a SimpleNumber node.

But this isn't buying you anything really, unless you plan to change the numbers in the material room.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Mon, 01 June 2009 at 2:21 AM

gotcha!  thanks so much.  i was thinking that's what i was observing, but it's hard to tell.  especially when i'm trying to do it so i can see what i'm doing.

i'm still having difficulties getting it right.  right now i have:

Quote -
def IF(test, tv, fv):
    return (test * tv) + ((1 - test) * fv)

def sRGB(x):
    return IF(x <= .0031308, 12.92 * x, (1.055 * (x ** (1/2.4))) - .055)

def asRGB(x):
    return IF(x <= (12.92 * 0.0031308), x / 12.92, ((x + .055) / 1.055) ** 2.4)

and i'm anti-correcting the color on the way in, and correcting the finished product on the way out. i'm sure i'm still getting it wrong, but i'm not sure how to get it right.
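For anyone following along without Poser open, here is a plain-Python version of those curves (no nodes; lowercase function names just to avoid clashing with the matmatic versions above) that can be used to sanity-check values, including a round-trip test:

    def srgb(x):
        # Linear [0,1] -> sRGB-encoded [0,1].
        if x <= 0.0031308:
            return 12.92 * x
        return 1.055 * (x ** (1 / 2.4)) - 0.055

    def asrgb(x):
        # sRGB-encoded [0,1] -> linear [0,1] (the inverse of srgb).
        if x <= 0.04045:
            return x / 12.92
        return ((x + 0.055) / 1.055) ** 2.4

    # Round-trip check: encoding then decoding should give back the original value.
    for v in (0.0, 0.001, 0.18, 0.5, 1.0):
        print(v, srgb(v), asrgb(srgb(v)))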



bagginsbill ( ) posted Mon, 01 June 2009 at 2:42 AM

The test looks right. And if I published those other formulas, they're right, too. Heheh.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Mon, 01 June 2009 at 2:48 AM

I've been playing with a new method, one that doesn't require any anti-whatever on the way in.

I used something like this in my AMUCFS shader, but I didn't really understand the math very well at the time; I did it in a more complicated way, and I didn't really explore how important it is.

You can try experimenting with this.

Build a shader adding together all the elements of diffuse, specular, reflection, and so on as you'd do for a GC shader, but skip all the incoming anti-GC stuff.

At the end, given your assembled linear color, c, you use:

s.Alternate_Diffuse = c * (HSV(c, 1, 0, 1) ** -.4)

Wacky, huh? 
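To see what that expression does to actual numbers, here is a plain-Python sketch. It treats HSV(c, 1, 0, 1) as "the max component of c" (a reading confirmed further down the thread), and the helper name is made up. The point is that every channel gets the same multiplier, so the channel ratios - hue and saturation - are untouched while the luminance is lifted:

    def hsv_exp_tonemap(rgb, exponent=-0.4):
        # Scale all channels by max(rgb) ** exponent (a sketch of c * max(c) ** -.4).
        m = max(rgb)
        if m == 0:
            return rgb                      # black stays black
        k = m ** exponent                   # the same multiplier for every channel
        return tuple(ch * k for ch in rgb)

    c = (0.75, 0.5, 0.25)
    out = hsv_exp_tonemap(c)
    print(out)                              # brighter, but the 3:2:1 channel ratios survive
    print(out[0] / out[2], c[0] / c[2])     # both ratios are 3.0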


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


IsaoShi ( ) posted Mon, 01 June 2009 at 4:27 AM

OMGoodness. Do you have sticky-out hair too?

"If I were a shadow, I know I wouldn't like to be half of what I should be."
Mr Otsuka, the old black tomcat in Kafka on the Shore (Haruki Murakami)


RobynsVeil ( ) posted Mon, 01 June 2009 at 6:32 AM

Sheesh, everyone moves so fast... I'm still trying to get my head around your "Mix" script you posted a page or so back, so please excuse me if I bring the thread back to that.

I'm still digesting the functionality of the script... this bit:

class CSB(object):
    def __init__(item, color, shine, bump):
        item.color = color
        item.shine = shine
        item.bump = bump
    def mix(a, b, mask):
        return CSB(
            Blend(a.color, b.color, mask),
            Blend(a.shine, b.shine, mask),
            Blend(a.bump, b.bump, mask)
        )

It is, after all, the key to the whole thing. A function referring to itself is mind-boggling. Can you point me to a remedial Python class programming site that discusses how this is supposed to work? Recursion - to my knowledge - isn't even allowed in VBA, so I never had to get my head around it. Obviously everyone else gets this, so I don't wanna hold anyone else up, but I need to be able to visualize (conceptualize) what the heck is going on.

--So, when you call CSB (the class), you create a new instance of CSB which has a Mix() function in it.

Quote - All member functions of a class are instructions on how to do something with instances of that class.

mix and init are not something to reference anywhere from outside the function.... their purpose is to define functionality to the class itself. "This is what you're made of " and "this is how you will do with things you're given."

Sheesh, I'm not even making sense to me.

I've written a lot of functions in VBA which did fairly (I thought) clever things and were reusable because they assumed nothing - did parameter-validity checking and all that, but jeez, this is an entirely new level of coding. It must have something to do with the fact that we're defining a class, not a function, which is a completely different marsupial.

For instance, I look at a place down the script where you call - instantiate? - CSB():
  paint = CSB(AGC(Color(.2, .5, 1)), 1, 0)
and then I look at the CSB thingie and try to sort out where the values are being passed to.
Looks to me like we're really calling Mix(x,y,z) since it accepts 3 parameters. So, breaking down this paint = thingie, we're sending CSB or Mix() 3 values:
--anti-gammaed Color(.2, .5, 1) -> Mix(a, b, mask)
--1 -> Mix(a, b, mask)
--0 -> Mix(a, b, mask)

Three Blender nodes are returned: colour, specular (shine) and bump.
The AGC'd Color(.2, .5, 1) is Blended (Input_1) with WHITE (Input_2) and there is no mask (meaning the Blending value will be the default, which is 0.5). That's the colour bit that is returned.
-- Is an identical Blender node for shine generated here?
-- And bump?

Before my mind explodes, I'll let you decide which one of these are worth answering.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


kobaltkween ( ) posted Mon, 01 June 2009 at 9:14 AM

i think mix is a method of the class CSB.  it's not so much referencing itself as referencing its constructor, __init__.   and all the constructor does, really, is make an object and assign it the main properties we need to reference to define a surface (as stated above).  so it's basically saying:  class CSB takes an object and defines its color, shine and bump properties.  the CSB mix method lets you assign a mix of colors to an item's color, a mix of shines to an object's shine, and a mix of bumps to an object's bump.  so if you keep creating CSB instances, you can keep mixing mixes.

bb- that is so cool!  so pardon, but why does that work? 



kobaltkween ( ) posted Mon, 01 June 2009 at 9:41 AM · edited Mon, 01 June 2009 at 9:47 AM

oh, and thanks so much for responding in the dead of night!  i knew the equations were right, and even understood based on the equations i found.  i was really figuring i was doing the end test poorly.

i can see now what you mean about quick testing.  is there a quick way to reload the materials folder listing?  that's actually what's slowing me down most.  i should be able to test the new handling right quick when i get home.

oh, and just as a comment to others:  i was getting all sorts of feedback that was confusing me at first because i had the MatmaticDemos folder full of Matmatic scripts i'd downloaded from forums and such.  i know it's not supposed to process any but the latest, but it actually made debugging more difficult for me.  i moved all the old stuff to my development runtime where i keep most of bagginsbill's shaders, utilities and props.  i think if you're not working on a script at the time, it's kind of helpful to put it (and the materials it generated) out of the way for a while.

i have to say, i'm finding this really fun.  would it be a good idea to turn your example into a Matmatic tutorial?  because the big thing i feel like the Matmatic documentation is missing is a step by step guide from the equivalent of "Hello World" to something more complex.  i know i'm kind of following that path myself (with that awesome Mix script, thanks so much!)

i'm thinking something like: start with how to make a basic matte surface, change the color, change the shininess, add bump, add displacement, add transparency, add correction, add conservation of energy, add that reflection trick (just to ask, i don't understand why the fresnel approximation isn't applied to the specular too?), add AO.  then maybe different types of surfaces: stone, metals (i know you at least change the color of specular and reflection, but i'm guessing you change your Fresnel approximation, too), SSS type materials, translucent and transparent materials, etc.

edited to add:  oh, and i'm thinking of a format of full code on the left with changes highlighted, render on the right, and explanatory text below it.  sound good?



bagginsbill ( ) posted Mon, 01 June 2009 at 10:39 AM · edited Mon, 01 June 2009 at 10:42 AM

Quote - A function referring to itself is mind-boggling

Stop right there! The class CSB is not a function! It is a class.

Read these two summaries carefully. Many of the sentences are similar, indicating that classes and functions have a lot in common, perhaps causing your confusion. Study the differences carefully.

function: An encapsulation of instructions to follow to perform some action automatically. It does so by combining multiple program statements into a single unit. The most common way (but not the only way) functions are created is via the "def" keyword. The def keyword requires a name by which the function will be known for future use. To use the function, i.e. to perform the action it describes, you type its name followed by a pair of parentheses. (There are other ways, too.) Functions can have inputs (arguments) that are used when following the instructions described in the function. When calling a function, it may have results (return values, there can be 0, 1, 2, any number of returned values) that are returned to the caller after it finishes executing. We can do quite a lot without functions, but we'd find ourselves saying the same things over and over with slightly different inputs.

class: An encapsulation of instructions to follow to build objects with certain capabilities automatically. It does so by combining multiple attribute and function declarations into a single unit. The most common way (but not the only way) classes are created is via the "class" keyword. The class keyword requires a name by which the class will be known for future use. To use the class, i.e. to build an instance of the kind of object it describes, you type its name followed by a pair of parentheses. (There are other ways, too) Classes can have inputs (arguments) that are used when following the instructions described in the class. When calling a class, it always has one result (return value - the newly created object) that is returned to the caller after it finishes executing. We can do quite a lot without classes, but we'd find ourselves saying the same things over and over with slightly different inputs.
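A side-by-side sketch of those two summaries (toy names, nothing Poser-specific): both are defined once and both are "called" with parentheses, but calling the function runs its instructions and returns whatever it chooses, while calling the class always returns a newly built object:

    # A function: encapsulated instructions, used by calling it with ().
    def double(x):
        return 2 * x

    # A class: encapsulated instructions for building objects, also used by calling it with ().
    class Counter(object):
        def __init__(self, start):
            self.value = start

    print(double(21))        # 42 - whatever the function chose to return
    c = Counter(10)          # the class call always returns a new Counter instance
    print(c.value)           # 10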

In my script, the class CSB encapsulates instructions on how to build a certain type of object that I need. When you "call" the class, as in

paint = CSB(AGC(Color(.2, .5, 1)), 1, 0)

you are not invoking the mix method. You are invoking the class. The class internally builds a new object, then calls its __init__ method WITH THOSE ARGUMENTS to initialize the new object.

So in that example, the function __init__ gets called with the arguments

item = the new CSB instance that is under construction because you called the class
color = AGC(Color(.2, .5, 1))
shine = 1
bump = 0

I wonder if you think that the mix function is part of the __init__ function. It is not. The last line of instructions in the __init__ function is the assignment item.bump = bump. The __init__ function then exits, returning to the caller, which is inside the guts of Python.

The mix method is called later on the instances we created.

For example:

skin.mix(paint, paintMask)

In my script, skin and paint were created by the CSB class, and those objects "know" all the same things about what to do with themselves. They are identical in every way, except that they have different attributes, because I called the CSB class with different arguments when I made them.

So when we write

skin.mix

we are actually asking the skin object to look up its definition of mix. When the class built the skin object, it stored a little info in the object on how to find the class again, in case the object needs help with doing things. In this case, the skin object doesn't actually know how to mix (we never assigned a mix attribute to it) so it will go back to the class and ask for help. The class will give it its own copy of the mix function, with the skin bound automatically to the first argument of the function. This is called a "bound function". Until we invoked the mix on the skin, mix only existed in the class itself, as an "unbound function".

The mix function takes three inputs or arguments: a, b, mask. We just established that skin.mix created a bound function where a=skin automatically. Now all that remains is to fill in b and mask. Which is clearly what I did:

skin.mix(paint, paintMask)

so:

CSB's version of "mix" gets called, bound to skin, i.e. a = skin, and the other two inputs come from my argument list, so
a = skin
b = paint
mask = paintMask

Once these connections are made, the statements encapsulated inside the mix function are executed.

There is only one statement in there.

return CSB( ... )

This is not magical in any way. It is not recursion. We are calling the class CSB from inside a bound method of the object skin, while executing the method mix.

As always, calling the class creates a new object. The inputs (arguments) passed are used to qualify how to build this new object. In that sense, mix is quite straightforward. When you mix two CSB objects, you're trying to make a new CSB object whose attributes are a mixture of the two things being mixed. That's what it does - builds a new CSB object, and passes blended values for each of the new object's attributes.
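To see that whole sequence run outside Poser, here is a stand-in version - plain numbers and a simple weighted average in place of matmatic's Blend nodes, an assumption made purely so the sketch is self-contained. It shows that skin.mix(paint, mask) and CSB.mix(skin, paint, mask) are the same call, with skin bound to the first argument, and that the result is just another CSB built by calling the class again:

    def blend(a, b, mask):
        # Stand-in for matmatic's Blend: a linear mix of a and b controlled by mask.
        return a * (1 - mask) + b * mask

    class CSB(object):
        def __init__(item, color, shine, bump):
            item.color = color
            item.shine = shine
            item.bump = bump

        def mix(a, b, mask):
            # Calling the class again builds a brand-new CSB - no recursion involved.
            return CSB(blend(a.color, b.color, mask),
                       blend(a.shine, b.shine, mask),
                       blend(a.bump, b.bump, mask))

    skin = CSB(0.8, 1.0, 0.2)
    paint = CSB(0.2, 1.0, 0.0)

    mixed1 = skin.mix(paint, 0.5)           # bound call: a = skin automatically
    mixed2 = CSB.mix(skin, paint, 0.5)      # the same call spelled out explicitly
    print(mixed1.color, mixed2.color)       # both 0.5 - identical results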

Another analogy to classes is that of a factory. When you call the class, it's like you visit the factory and place an order for a new item. During the visit, you customize the item they build for you by giving them input on what attributes you'd like your new object to have. When you visit, someone takes your order. This is Python guts. The order taker doesn't actually know how to build your item - she just sends the order to a worker who actually knows what steps to follow. That worker is the __init__ function. The worker builds you a new item and sends it down the chute, where it lands in front of you and you leave.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Mon, 01 June 2009 at 10:49 AM

By far, the best tutorial is the one on the Python website.

http://docs.python.org/tutorial/index.html

Chapter 9 is about classes.


Print it out and read it over coffee.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


nruddock ( ) posted Mon, 01 June 2009 at 11:06 AM

Quote - At the end, given your assemblied linear color, c, you use:

s.Alternate_Diffuse = c * (HSV(c, 1, 0, 1) ** -.4)

Wacky, huh? 

The curve for this is very similar to a gamma correction curve.
From WolframAlpha :-
Above formula -> Graph[x/x^0.4,{x,0,1}]
GC(2.2) -> Graph[x^(1/2.2),{x,0,1}]


bagginsbill ( ) posted Mon, 01 June 2009 at 11:24 AM

Right! But it doesn't have that rapid rise from 0 that cobaltdream doesn't like. That's what causes terminators to look too sharp.

Also, while the curves are similar, this preserves hue and saturation! So you don't need to anti-gamma first. It kind of figures out what you'd get if you had done anti-gamma first, and produces similar results without doing so.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


nruddock ( ) posted Mon, 01 June 2009 at 12:12 PM

I'll mention that the exponent that gives an equivalent curve to gamma correction is 0.54545454 -> Graph[x*x^-0.54545454,{x,0,1}]

as x*x^-0.54545454 === x^0.45454545 === x^(1/2.2)

takes off math head and puts it away for a rainy day
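A one-line numeric check of that identity (any x in (0, 1] will do):

    x = 0.3
    print(x * x ** -0.54545454, x ** 0.45454545, x ** (1 / 2.2))   # all about 0.579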


nruddock ( ) posted Mon, 01 June 2009 at 12:47 PM

One more observation: after a look at the HSV<->RGB conversion definitions, the HSV node appears to be redundant, because when S == 0.0 you get a greyscale out, meaning that the correction is based on the V component [== max(R, G, B)] rather than each of R, G, and B being corrected individually (which is what you'd get just using a Color_Math node alone for the Pow).

Using a Math_Functions node for the Pow gives the same effect as the combination of HSV and Color_Math nodes.


bagginsbill ( ) posted Mon, 01 June 2009 at 1:24 PM · edited Mon, 01 June 2009 at 1:31 PM

nr:

I used HSV on purpose, with S = 0, specifically because max(R, G, B) is exactly what I want, but there's no simpler way to do that.

I could use Max(Max(Comp(0, c), Comp(1,c)), Comp(2, c)) instead, but the results would be identical. I'm not sure which is faster.

As for the Color_Pow, you're right it could be a Pow instead, I was being lazy.

Instead of

c * (HSV(c, 1, 0, 1) ** -.4)

we can avoid color arithmetic by letting matmatic know that the HSV tuple is not needed.

c * (HSV(c, 1, 0, 1).asNumber() ** -.4)

So, yes, using a Math_Functions node for the Pow gives the same effect, but not without the HSV.

Regular GC raises each color component independently, based on the value of that component alone. I'm trying to raise them synchronized, based on the value of whichever component is brightest. That's different. The idea here is to choose a new luminance, based on the brightest luminance component using a normal GC curve. Calculate the increase in luminance - dividing the new luminance by the old luminance. Then scale the original color up by that ratio.

In steps:

L = HSV(c, 1, 0, 1).asNumber() # get the basis luminance

Letting g stand for the gamma value, we'd use this to calculate a new luminance:

L2 = L ** (1/g)

We want the ratio of new over old to scale the original color up in each component.

c * L2 / L

Let's expand L2 / L:

(L ** (1/g)) / L

Rewrite the denominator as an exponent and multiply instead of divide:

(L ** (1/g)) * (L ** -1)

Combine the exponents:

L ** (1/g - 1)

So the exponent we want is 1/g - 1. For a GC of 2.2, we'd use 1/2.2-1 = -.545454 just as you said. But I was trying to make it less aggressive, so I used g=1.65 instead of 2.2, which gives -.393939, or roughly -.4.

L ** -.4

Plugging it back we get:

c * L2 / L = c * (L ** -.4) = c * (HSV(c, 1, 0, 1).asNumber() ** -.4)

I specifically did not use g = 2.2 because I was trying to avoid the nasty sharp edge at 0. Of course the exponent is adjustable - I just threw -.4 out as an example.

We might use other definitions for L, such as a weighted sum of the three components based on how you convert things to black and white. For example, we could use the black-and-white-TV luminance rule:

L = (Color(.3, .59, .11) * c).asNumber()
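Those steps translate into a short plain-Python sketch (treating the HSV trick simply as max(R, G, B), per the explanation above, and leaving g adjustable so g = 2.2 and the gentler g = 1.65 can be compared; the weighted-luminance option is included as well). The function name and defaults are just for illustration:

    def tonemap(rgb, g=1.65, weights=None):
        # Scale rgb by L ** (1/g - 1), where L is the basis luminance.
        # By default L = max(R, G, B) (what the HSV node with S=0 produces);
        # pass weights=(0.3, 0.59, 0.11) to use the black-and-white-TV rule instead.
        if weights is None:
            L = max(rgb)
        else:
            L = sum(w * ch for w, ch in zip(weights, rgb))
        if L == 0:
            return rgb
        k = L ** (1.0 / g - 1.0)        # g=2.2 -> -0.5454..., g=1.65 -> about -0.394
        return tuple(ch * k for ch in rgb)

    c = (0.75, 0.5, 0.25)
    print(tonemap(c, g=1.65))                      # the gentler curve discussed above
    print(tonemap(c, g=2.2))                       # the max component matches plain GC(2.2)
    print(tonemap(c, weights=(0.3, 0.59, 0.11)))   # alternative luminance definition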


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


IsaoShi ( ) posted Mon, 01 June 2009 at 4:25 PM

RV or BB or anyone... two quick questions, svp.

I'm learning Matmatic, and I've just written my first Matmatic script, which creates my Adjustable Background Gradient shader (q.v.). It's only a simple 10 node shader, but I'm quite proud of it for a first effort!

Anyway, I've included Parmatic nodes for the vertical position, spread and Intensity of the background gradient. Is there any way to tell Matmatic to put these nodes in certain (x,y) positions in the node view?

Also, is there a way to tell Parmatic what sequence to present the parameter dials in? It seems to choose a sequence based on nothing in particular, and won't change it no matter what I do.

Thanks
Izi

"If I were a shadow, I know I wouldn't like to be half of what I should be."
Mr Otsuka, the old black tomcat in Kafka on the Shore (Haruki Murakami)


RobynsVeil ( ) posted Mon, 01 June 2009 at 5:19 PM

Thank you, BagginsBill, for taking the time to create this remedial introduction into class behaviour using a real-world model for me. It was very clearly done, and easy to understand. I've followed your suggestion and have printed out Chapter 9 of the Python 2.6.2 Manual (the one on classes), as well as your explanation.

It's going to have to be a long coffee break.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


kobaltkween ( ) posted Mon, 01 June 2009 at 8:58 PM

file_432103.png

so this is my result with using (from left to right) GC, sRGB, and the new power method.  same properties, infinite lights, my very bright setup.   i have no clue what i've done wrong to produce the color shift, but i thought i'd post.



IsaoShi ( ) posted Tue, 02 June 2009 at 4:11 AM

Interesting. I couldn't understand why the HSV formula would not result in an unwanted change in colour. But I don't really understand why it would, either!

What focal length lens did you use for this render? I only ask because the views of the spheres are at quite different angles, made clear by the difference in shadow size and the position of the right-hand specular highlight. For a better side-by-side comparison of shading I would be inclined to use a much longer focal length.

Obviously, this makes no difference to us seeing the colour shift on the HSV corrected sphere.

:O)

"If I were a shadow, I know I wouldn't like to be half of what I should be."
Mr Otsuka, the old black tomcat in Kafka on the Shore (Haruki Murakami)


RobynsVeil ( ) posted Tue, 02 June 2009 at 6:48 AM

I carefully studied this script today between patients and during my tea break... one thing became apparent: the order in which you call CSB and create your materials is important - this is looking beyond your CSBSurface(item) function.
This:
skinAndPaint = skin.mix(...)
uses this object:
skin = CSB(...)
Indeed, the order of creation of all of those materials has considerable significance.

I had a ton of questions today as I was reading - lots of questions answered, too! - but by the time I got home tonight and had my daughter's over-cooked pasta (her heart's in the right place - she just doesn't get the concept of al dente) my brain had turned to mush.

So, tomorrow, I'm going to try to implement this... proof's in the pudding. Those questions will most likely resurrect then...

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


bagginsbill ( ) posted Tue, 02 June 2009 at 8:26 AM · edited Tue, 02 June 2009 at 8:28 AM

Quote - so this is my result with using (from left to right) GC, sRGB, and the new power method.  same properties, infinite lights, my very bright setup.   i have no clue what i've done wrong to produce the color shift, but i thought i'd post.

Did you have a blue IBL in the lighting?

GC and sRGB tend to take fractional values and bring them closer to 1. The ratio of the new value versus the old value is smallest when the value is already close to 1 and largest when the old value is near 0.

Knowing that, consider the color RGB .75, .5, .25. These color components are in the ratio of 3:2:1. What happens when you apply GC(2.2) to that color?

You get .88, .73, .53. The ratios are now roughly 1.65:1.37:1.

Roughly speaking, the sensation we call hue is due to which components are strongest and second strongest. There are six bands of hue possible: Rg, Rb, Gr, Gb, Br, and Bg. This color is in the Rg (Red, green) band. Now the particular hue within the band is due to the particular ratio of the brightest to middle value. So the new color has a somewhat different hue than the old color.

Roughly speaking, the sensation we call saturation is due to the relationship between the strongest and the weakest component, not ratio exactly, but we can still understand how saturation is being influenced by comparing ratios. We started with 3:1 and ended with 1.65 to 1. Clearly this is a reduction of saturation.

So, GC shifts hue, decreases saturation, and increases luminance. We compensate for this by applying anti-GC to the incoming colors first, which does the opposite in a way that attempts to balance the hue and saturation effect, while not balancing the luminance effect. The net effect is to increase luminance with a minimal shift in hue and saturation.
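A quick numeric confirmation of those figures (plain Python, nothing Poser-specific):

    def gc(x):
        return x ** (1 / 2.2)          # plain per-channel gamma correction

    before = (0.75, 0.5, 0.25)         # component ratios 3 : 2 : 1
    after = tuple(gc(ch) for ch in before)

    print(after)                                       # about (0.88, 0.73, 0.53)
    print(before[0] / before[2], after[0] / after[2])  # saturation-ish ratio: 3.0 -> about 1.65
    print(before[0] / before[1], after[0] / after[1])  # hue-ish ratio:        1.5 -> about 1.20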

The problem with GC is if you do not anti-GC the incoming material to balance the HS effects, you end up with a washed out appearance. Everything moves closer to white. Pastel (weak saturation, high luminance) colors become white, and colors with strong saturation and medium luminance become pastel. So it is really not useful to apply GC alone as a final processing step.

The new technique I showed attempts to preserve hue and saturation, only altering luminance. And since it does, if there is any slight amount of blue in your lighting, and you're testing with a white prop, the blue will tend to disappear after GC, while not so in the "exponential" technique.

By the way, the technique is similar to (or maybe exactly the same as) the tone mapping technique called HSV Exponential Tone Mapping or something like that. I can't find any clear CG-community approved exact mathematical definitions. I can only find verbal descriptions of these techniques in other products, like VRay and Kerkythea. In all cases they talk about adjusting luminance while preserving hue and saturation. It is from those little clues and the words HSV and Exponential that I guessed at how to do it.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Tue, 02 June 2009 at 8:47 AM

but i'm anti-correcting my input. 



kobaltkween ( ) posted Tue, 02 June 2009 at 9:02 AM

hmmmm.  so... maybe the problem is that i'm anti-correcting my input, which is just the base color (no effects), but then correcting the whole of everything, which would include colors from the lights.  that might explain why my weakest VSS image was a low-light environment with colored spots.



bagginsbill ( ) posted Tue, 02 June 2009 at 9:02 AM · edited Tue, 02 June 2009 at 9:03 AM

file_432130.txt

Quote - RV or BB or anyone... two quick questions, svp.

I'm learning Matmatic, and I've just written my first Matmatic script, which creates my Adjustable Background Gradient shader (q.v.). It's only a simple 10 node shader, but I'm quite proud of it for a first effort!

Anyway, I've included Parmatic nodes for the vertical position, spread and Intensity of the background gradient. Is there any way to tell Matmatic to put these nodes in certain (x,y) positions in the node view?

Also, is there a way to tell Parmatic what sequence to present the parameter dials in? It seems to choose a sequence based on nothing in particular, and won't change it no matter what I do.

Thanks
Izi

You can explicitly set the coordinates of a node by setting its pos attribute to a tuple containing x,y coordinates.

For example:

color = SimpleColor(RED).labelled("Choose Color")
color.pos =120, 20

However, other nodes may automatically be placed in the same area. Matmatic chooses positions for nodes automatically based on when it first sees them, starting from the surface node and moving outward. It lays them out in columns using a tricky algorithm.

Lately I've been moving the surface over to the right a couple columns. This makes matmatic lay out other nodes to the right of that, leaving the left side empty. Then I position my parameter nodes on the left side.

I do this using a little hack right in the script, where I make a list of parameter nodes and call this hackpos function. It's a bit involved, but easier than manual placement.

I've attached an example script that contains the hackpos function and its helper functions, as well as a simple example.

In the script, I only make one material. If you are making multiple materials, be sure you create a new parameter list (or two of them if you're doing the two-column layout) for each material.

As for the parmatic parameter order - I didn't provide a way to control that. I think it just adds them in the order they are encountered in the file. If it is really important to order them, you can do so by making a pz2 file that creates parameter groups and specifies the order of the parameters. I did this for the AMUCFS.

I never continued development of parmatic, because I plan for VSS to supersede that whole idea. With VSS Pro, you will see all your material parameters in a nice GUI, including the color ones, and we won't have to deal with building parameter dials on the prop or figure.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Tue, 02 June 2009 at 9:04 AM

file_432132.jpg

Here is the generated node layout.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Tue, 02 June 2009 at 9:08 AM

Quote - but i'm anti-correcting my input. 

But isn't your input white? Remember the colors black and white are not affected by any of these techniques. There is no such thing as the anti-corrected color of white - it is white.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Tue, 02 June 2009 at 11:03 AM · edited Tue, 02 June 2009 at 11:10 AM

right.  i realize that, but that sounds like an indicator of the problem, not the problem itself.

so i have white in, and anti-correcting does nothing.  but even if i were working with a color, it wouldn't matter.  because the problem with the former GC and sRGB methods is that the anti-correcting doesn't affect the color of the light on the surface, but the correcting does.   again, this totally explains a lot of the problems i was having with my low-light, colored light image.  i couldn't figure out why i was losing so much saturation.

so, just guessing, but it sounds like the correct comparison would be to take your new technique and compare it with the other corrections/luminances?



bagginsbill ( ) posted Tue, 02 June 2009 at 11:17 AM

The correct comparison would involve this, I think.

You lost saturation of the light colors, because you didn't anti-correct the light colors! In particular, if you're using an IBL image and the IBL image is a photo, then the colors from that photo have been de-saturated into sRGB space. The true color components are NOT what you're using for lighting.

When using GC, you should use linear light colors, not sRGB light colors. Hahah. So anti-GC the desired light color in the light shader, when you're using GC.

This is what I do when you make an IBL probe using my genIBL tool. That tool automatically does anti-GC when generating the probe, so you don't have to anti-GC it. But if you're using a probe made by somebody else, such as a photo of a mirror ball, that has to be brought back to linear color space. Otherwise, you're not using the correct representation of the lighting captured by the mirror ball.

This means you won't be able to compare in one render. You'll have to render GC things separately because you need the lights to be different.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Tue, 02 June 2009 at 12:47 PM · edited Tue, 02 June 2009 at 12:50 PM

i'm probably not thinking about this right, but that sounds kind of needlessly complex. i mean, it's not just the colors you're anti-correcting in your lights.  it's their luminance, too.  which means you're anti-correcting the light to be darker, then applying anti-correction to the surface input to make it darker, then correcting the surface to make it lighter.  i mean, you could try to do something with then correcting the light's color, but then you're back making the color wrong.  so you'd have to do something wacky with the lights to correct the color but keep the luminance.

it seems more sensible just to keep the lights in correct plain colors/photos.  i mean, an IBL doesn't have any other calculations it's making about its colors and luminance, right? and the other types of lights are pretty much the same, right?  no transforms within the base?  like, i could see needing to correct the inverse square falloff output, but that's just luminance.

and what you do with the GenIBL might explain the problems i had when using Synthetic's iterative rendering techniques with your GenIBL.  i kept having to tweak the image to make it brighter, and couldn't figure out why i was losing so much luminance with each iteration.  now i know.

it's much simpler just to keep the hue and saturation shift out of the shader, and use your technique to affect only the luminance.  is there any reason this would be less accurate?  is there a hue and saturation shift i actually want to preserve?



IsaoShi ( ) posted Tue, 02 June 2009 at 1:26 PM

Thanks for the answers re Matmatic and Parmatic, bb.
And for the detailed explanation about the HSV correction - it begins to make a bit of sense.

But I still don't get the bit about multiplying an RGB tuple ( c ) by an HSV tuple ( HSV(c,...) ). Presumably, Poser Python knows that one of these is an 'apple' fruit and the other one is an 'orange' fruit, and it converts one of them before multiplying them together to determine how much fruitiness we have in total.
Hope you see what I mean...

"If I were a shadow, I know I wouldn't like to be half of what I should be."
Mr Otsuka, the old black tomcat in Kafka on the Shore (Haruki Murakami)


IsaoShi ( ) posted Tue, 02 June 2009 at 2:28 PM · edited Tue, 02 June 2009 at 2:29 PM

Sorry, ignore last post... I read through it all again and the light came on.

"If I were a shadow, I know I wouldn't like to be half of what I should be."
Mr Otsuka, the old black tomcat in Kafka on the Shore (Haruki Murakami)


bagginsbill ( ) posted Tue, 02 June 2009 at 2:38 PM · edited Tue, 02 June 2009 at 2:42 PM

Quote - i'm probably not thinking about this right, but that sounds kind of needlessly complex. i mean, it's not just the colors you're anti-correcting in your lights.  it's their luminance, too. 

Yes that's true and it's also the right thing to do. Suppose you photograph (with a mirror ball) walls and a ceiling and the ceiling is RGB(240, 240, 240) and twice the luminance of the walls. Do you think the walls are RGB(120, 120, 120) in the photo? Because they won't be. In the photo, which is sRGB corrected, those half-bright walls will be RGB(175, 175, 175). In sRGB color space, 175 is half of 240. If you use that for lighting, the amount of light from the walls will be stronger than it is supposed to be, and your IBL will have incorrect ratios of lighting from above versus from the sides. If you anti-GC that image, the ceiling will become 223 and the walls become 111.  This is why IBL lighting seems to lack contrast. The ratio of 240/175 is only 1.37, but the lighting ratio is supposed to be 2. That's a huge difference. Huge.

Similarly, if we were looking at a wall color where the amount of red is 240 and the green and blue is 175, that represents a saturation ratio of 2 in linear color space, but converted to sRGB the ratio is only 1.37. That means the light from that area is not as red as it should be, because you're using an sRGB value as if it were a linear value, and that means the luminance is wrong and the hue is wrong and the saturation is wrong.
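The arithmetic behind those numbers, as a small sketch (using the simple 2.2 power rather than the exact sRGB curve, which is what the figures above appear to assume):

    def to_linear(v8):                    # 8-bit sRGB-ish value -> linear, on a 0..255 scale
        return 255.0 * (v8 / 255.0) ** 2.2

    def to_srgb(lin):                     # linear 0..255 -> 8-bit sRGB-ish value
        return 255.0 * (lin / 255.0) ** (1 / 2.2)

    ceiling = 240.0
    walls_linear = to_linear(ceiling) / 2.0       # walls have half the ceiling's luminance
    print(int(to_srgb(walls_linear)))             # about 175: what the photo stores for the walls
    print(int(to_linear(ceiling)), int(walls_linear))  # about 223 and 111: the anti-GC'd values
    print(ceiling / 175.0)                        # about 1.37 - the bogus ratio if the photo is used as linear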

Perhaps you're not aware how lighting works. Given any point on a surface, a loop is performed. For each light, the loop calculates an effective RGB value for that light source, regardless of whether it is IBL or spotlight or infinite or point. Each of those uses a different technique for finding that value, but it's a simple lookup and a little math. Based on angles, the color you entered (or came from the photo in an IBL) and the intensity multiplier, this value for illumination is arrived at by linear multiplication of several factors. In the case of an infinite light, one of the factors is the cosine of the angle between the normal and the light vector. At 0 degrees (straight into the surface) a light will deliver its full intensity. At all other angles, the fraction will be less than 1. At 60 degrees it is exactly half. In any case, if we let that value be called L, then the calculated linear diffuse color of the surface is Diffuse_Color * Diffuse_Value * L. I'll shorten that to C for Diffuse_Color and V for Diffuse_Value, so the product is C * V * L.

Now think about what happens if  you're doing that multiplication, but L is an sRGB value, not a linear value. You're multiplying apples and oranges now. Gamma correcting that will result in the product of those things raised to the 1/2.2 power.

(C * V * L ) ^ (1/2.2)

re-arranging we get:

(C * V) ^ (1/2.2) * (L ^ (1/2.2)).

But remember that L is already gamma corrected, which means you're gamma correcting L without first having anti-gamma corrected it. This results in double gamma correction. The net effect is too much light, undersaturated, and with the wrong hue.
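A tiny numeric illustration of that double correction (plain Python; the surface color C and value V are folded into a white surface so only L matters):

    linear_L = 0.25                        # the light's true (linear) intensity at this point
    srgb_L = linear_L ** (1 / 2.2)         # what a photo/IBL actually stores: about 0.53

    correct = linear_L ** (1 / 2.2)        # GC applied once, to the linear value: about 0.53
    double = srgb_L ** (1 / 2.2)           # GC applied to the already-encoded value: about 0.75

    print(srgb_L, correct, double)         # the doubly corrected result is far too bright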

Quote - which means you're anti-correcting the light to be darker, then applying anti-correction to the surface input to make it darker, then correcting the surface to make it lighter. 

Sorry but that is the math. It's a simple result of this rule for exponentiation.

(AB)^P = (A^P) * (B^P)

The power of a product is the product of the powers.

This means that if you're going to gamma correct a product, you have to anti-gamma correct each factor in that product.

Quote - i mean, you could try to do something with then correcting the light's color, but then you're back making the color wrong.  so you'd have to do something wacky with the lights to correct the color but keep the luminance.

No you're not making the color wrong. It was wrong to begin with. The colors in a photograph look right, but the numbers the photo contains are not linear values. It is wrong to use them as if they were linear. The results will be wrong.

Quote - it seems more sensible just to keep the lights in correct plain colors/photos.  i mean, an IBL doesn't have any other calculations it's making about it's colors and luminance, right?

Actually it has a huge calculation right at the start. It does a calculation called convolution. This calculation is to pre-calculate the value of L for every possible direction a surface could be facing. When you do this calculation with sRGB values, you're getting nonsense. That calculation has to be done with linear values. Otherwise, you're adding things together that are not representing the true RGB values as numbers. The resulting sum is way off.

Quote -
and the other types of lights are pretty much the same, right?  no transforms within the base?  like, i could see needing to correct the inverse square falloff output, but that's just luminance.

No it isn't just luminance. Each value of the original color drops by an equal ratio for any given distance. But if you calculate a ratio of the starting color based on the sRGB values, then the saturation changes as you move farther from the light. If the light is gray, no big deal. But if the light actually has a visible hue, then the inverse square falloff calculates completely wrong colors as you get farther and farther from the light. Also, as I said earlier, even an infinite light modulates the illumination color L with the angle to the light. So at 60 degrees, the value of L should be half what it was at 0 degrees. And the components within L should keep their same ratios. If you start with an sRGB color and you vary the luminance with viewing angle, or distance, or anything at all that represents lighting, then there is a multiplication involved, and the only sensible way to do it is with linear values.

Quote - and what you do with the GenIBL might explain the problems i had when using Synthetic's iterative rendering techniques with your GenIBL.  i kept having to tweak the image to make it brighter, and couldn't figure out why i was losing so much luminance with each iteration.  now i know.

I'm not sure what you did, but there are many other ways you could end up with it getting darker on each iteration. Also, that technique is of very limited use. It is not an accurate substitute for GI. When you have four walls, ceiling, and floor, with one point light inside, you calculate an IBL value for the ceiling area. Now that gets used for the entire floor, not just the center or the corner, but the entire floor, because all of the floor is pointing straight up. IBL only produces graduated lighting for curved things. For large flat things, it's completely and totally incorrect, meaningless gobbledygook.

Quote - it's much simpler just to keep the hue and saturation shift out of the shader, and use your technique to affect only the luminance.  is there any reason this would be less accurate?  is there a hue and saturation shift i actually want to preserve?

All of the above.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Tue, 02 June 2009 at 2:59 PM · edited Tue, 02 June 2009 at 3:12 PM

(Attached image: file_432181.jpg)

Here is a demonstration using the inverse square falloff pointlight.

I put two point lights 80 inches above the ground here.

The ground has a diffuse-only GC shader (it raises the output of diffuse to the power 1/2.2). The color is white. The diffuse value is 1.

So directly under the light, at 80 inches, the falloff multiplier is exactly 1. Farther from the light, it goes down.

The light on the left has the sRGB color in it, as usual. I used RGB(255, 128, 64). This is supposed to be a strongly saturated orange, and that's how it looks on the color picker. You may think that 255/64 (the saturation ratio) is not strong, but remember that the 64 is actually much less than 64 in linear terms: it works out to 255 * (64/255) ** 2.2, which is a linear 12. The saturation ratio is actually 21 to 1, even though the sRGB numbers are only 4 to 1.

The light on the right has the anti-gamma-corrected color in it: each channel of RGB(255, 128, 64) raised to the power 2.2 (on a 0-1 scale), which works out to roughly RGB(255, 56, 12).

Which one is lighting my ground with the color I picked on the color picker?
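
A quick check of the numbers above in plain Python, using the simplified 2.2 power curve:

    GAMMA = 2.2

    def to_linear_255(c):                      # 0-255 sRGB channel -> 0-255 linear
        return 255.0 * (c / 255.0) ** GAMMA

    r, g, b = (to_linear_255(c) for c in (255, 128, 64))
    print(round(r), round(g), round(b))        # 255 56 12  (linear values)
    print(255 / 64)                            # 4.0        (sRGB red-to-blue ratio)
    print(round(r / b))                        # 21         (linear red-to-blue ratio)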


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


bagginsbill ( ) posted Tue, 02 June 2009 at 3:07 PM

(Attached image: file_432182.jpg)

Here they are now as spotlights. I added some balls.

The luminance changes are correct in both cases because we're using the correct ISF and GC shaders on all props. But the starting color is just wrong on the first light. It is not the color I selected in the color picker.

You have to anti-gamma correct the color of your light or that's not the color you'll get.
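
A minimal sketch, in plain Python rather than the Poser or matmatic API, of what the two lights deliver through that GC ground shader: the anti-corrected light reproduces the picked color; the uncorrected one washes it out.

    GAMMA = 2.2

    def anti_gc(c):                            # 0-255 picker channel -> linear 0-1
        return (c / 255.0) ** GAMMA

    def gc_255(v):                             # linear 0-1 -> 0-255 display value
        return round(255.0 * v ** (1.0 / GAMMA))

    picked  = (255, 128, 64)
    falloff = 1.0                              # directly under the light
    diffuse = 1.0                              # white ground, diffuse value 1

    # Light 1 (wrong): the picker numbers are fed to the renderer as if linear.
    light1 = [c / 255.0 for c in picked]
    # Light 2 (right): each channel anti-gamma-corrected before it goes in.
    light2 = [anti_gc(c) for c in picked]

    render1 = [gc_255(c * falloff * diffuse) for c in light1]
    render2 = [gc_255(c * falloff * diffuse) for c in light2]

    print(render1)   # [255, 186, 136] -- a pale, washed-out orange
    print(render2)   # [255, 128, 64]  -- exactly the color that was picked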


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


kobaltkween ( ) posted Tue, 02 June 2009 at 3:58 PM

wow.  i didn't need all that, but thank you.  all you had to say was a) there is a hue and saturation shift, and b) no, the lights' output needs to be linear because (if i'm interpreting what you said properly, please correct me if i'm wrong) you need linear info for the interaction with nodes on the surfaces.   makes perfect sense.  the part i was missing was the information about how lights work internally, which is seriously interesting and i thank you for sharing it, but it would have made sense just as a black box of equations.

that said, you're basically making an argument for the simplified correction not working, either.  because if it won't work to just do it at the end for GC or sRGB correction, why would it work to do the same thing with your new equation?  i mean, if i'm not misunderstanding the equation, translating it to a GC equivalent would just mean changing the value of the exponent.  i interpreted your statements as saying that the new equation, which doesn't affect hue and saturation, was at least a fair approximation of anti-correcting first, if not more correct.  am i misunderstanding?

just given how similar the curves are, i'm not sure i see why it would be so different at higher values to use the new equation vs. either sRGB or GC.

as for iterative rendering, what you seem to be saying, and please correct me if i'm wrong, is that non-uniform IBL doesn't affect a flat surface realistically.  which affects just about every use of IBL, not just iterative rendering techniques.  i mean, great, if you're just using entirely photo backgrounds.  but not great if you need to light anything with a flat ground and non-uniform light.  which is a whole lot of uses of IBL.  i can't speak for anyone else, but that sounds like a bigger problem than the iterative issue.

i would say that if you think light correction is such a big deal, you should be much more vocal about it.  i would guess that almost no one following your advice is correcting the lights.  i've never seen anyone mention it, at least.  and, if i'm understanding properly, you're talking about just anti-correcting the lights.  that is, you need the output of the light's base node to be linear, not corrected.  it seems like a really drastic change to the lighting.   you don't really work artistically, so that shift wouldn't be a big deal to you.  but as an artist who needs to eyeball to achieve what i'm visualizing, i'm going to have to think hard about what this means in terms of how i should execute lighting.  that is, how what i see in the material room and maybe use in parameters matches what i envision.  it might mean i shouldn't change much.  but i'm not so sure of that.

oh!  just to ask because i'm at work and don't have Matmatic documentation here, are lights' base node elements controllable in Matmatic?  could you show a quick example of how to make a simple light material?

oh, and a more general question: can we ever distribute materials based on what we're learning here (and in other threads)?  i mean, proper accreditation goes without saying.  i'm really less worried about commercial than free, but it would probably be good to clarify in general.  it's just kind of weird to be caught between thinking, "wow, every freebie and product should have these features," and, "i can't distribute this because it's not really mine."



RobynsVeil ( ) posted Tue, 02 June 2009 at 4:05 PM

Quote - oh, and a more general question: can we ever distribute materials based on what we're learning here (and in other threads)?  i mean, proper accreditation goes without saying.  i'm really less worried about commercial than free, but it would probably be good to clarify in general.  it's just kind of weird to be caught between thinking, "wow, every freebie and product should have these features," and, "i can't distribute this because it's not really mine."

This is a burning question to me as well. I suppose we could either offer more opinions here or could take it up with Those-That-Know in the Copyright Laws and Ethics Forum:
www.renderosity.com/mod/forumpro/showforum.php
It would be nice to have a definitive, authoritative answer on this.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


kobaltkween ( ) posted Tue, 02 June 2009 at 4:21 PM

oh, i didn't mean in general in a legal sense.  i meant in general in a bagginsbill's choice sense. 



bagginsbill ( ) posted Tue, 02 June 2009 at 4:49 PM · edited Tue, 02 June 2009 at 4:50 PM

I've said before: for anything I write in a forum thread that is a shader, or a script that generates a shader, the answer is easy. You do whatever you want with it. Even sell it to noobs who don't know better, who don't realize they could just read what I said here.

However, if you manage to make some serious cake by selling noobs my free stuff, I'd like a cut. What I mean by that is not including it in a larger product, but just taking a posting of mine and selling it as is. Then I want a cut.


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


RobynsVeil ( ) posted Tue, 02 June 2009 at 4:56 PM

I think that's only fair, Bill. Since I haven't made even dollar one on any of what I've put together, the point is a bit moot, but I'll put aside a Bill Fund to salt away a percentage of what I do make. When I start seeing some income from this, I'll be asking for your details... if that's okay.

Monterey/Mint21.x/Win10 - Blender3.x - PP11.3(cm) - Musescore3.6.2

Wir sind gewohnt, daß die Menschen verhöhnen was sie nicht verstehen
[it is clear that humans have contempt for that which they do not understand] 

Metaphor of Chooks


ice-boy ( ) posted Tue, 02 June 2009 at 5:05 PM

we have to anti GC lights?

