An added complication (as if learning quaternion maths from scratch wasn't enough) is that I'm using scaled scenes, which instantly de-normalise (i.e. make non-unitary) the transformation matrices and quaternions; these have to be normalised (essentially divided by their magnitude) before the embedded angles can be extracted. The best reference I've found so far (from a Google "quaternion" search) is: http://skal.planet-d.net/demo/matrixfaq.htm Has anyone played with this stuff to the point where they can usefully share their knowledge?
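For anyone following along, the normalisation step really is just a division by the quaternion's magnitude. A minimal sketch in generic Python, not Poser-specific (the (w, x, y, z) component ordering is an assumption to check against whatever Poser returns):

```python
import math

def normalise_quaternion(q):
    """Divide a quaternion (w, x, y, z) by its magnitude so that scene
    scaling no longer distorts the rotation it encodes."""
    w, x, y, z = q
    mag = math.sqrt(w * w + x * x + y * y + z * z)
    return (w / mag, x / mag, y / mag, z / mag)

def rotation_angle(q):
    """Rotation angle (radians) encoded by a possibly non-unit quaternion."""
    w = normalise_quaternion(q)[0]
    return 2.0 * math.acos(max(-1.0, min(1.0, w)))  # clamp for float safety

# A 90-degree rotation about Y whose quaternion has been scaled by 2
# (i.e. de-normalised, as happens in a scaled scene):
q_scaled = (2.0 * math.cos(math.pi / 4), 0.0, 2.0 * math.sin(math.pi / 4), 0.0)
print(rotation_angle(q_scaled))  # ~pi/2 once normalised
```

The clamp on `w` matters in practice: accumulated float error can push the normalised component a hair outside [-1, 1] and make `acos` raise a domain error.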
Verbosity: Profusely promulgating Graham's number epics of complete and utter verbiage by the metric monkey barrel.
I hate it when someone's only reply is "No", but no, I haven't found the answer. However, I did want to express an interest in the question.
I want to understand this for what I believe might be a separate issue, though I don't know enough about it to even be sure. What I want is a REAL "point at" function, so that objects continually face the camera without rotating about their new 'z' axis as they track the viewer.
If this is unrelated and perhaps belongs in a different thread, my apologies; just let me know. I am "hanging on by my fingernails" here when it comes to "local space" and "world space" and "quaternion" and stuff like that.
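For what it's worth, the usual way to get a roll-free "point at" is to build the orientation from the forward vector plus a fixed world-up reference, rather than accumulating rotations. A sketch in plain Python, assuming a y-up world; the degenerate case of looking straight up or down (forward parallel to up) is not handled:

```python
import math

def normalise(v):
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def look_at_matrix(obj_pos, target_pos, world_up=(0.0, 1.0, 0.0)):
    """3x3 orientation whose local z axis points from obj_pos at
    target_pos, while keeping local y as close to world_up as possible,
    so the object tracks the target without rolling about its z axis."""
    z = normalise(tuple(t - o for t, o in zip(target_pos, obj_pos)))
    x = normalise(cross(world_up, z))  # local x, perpendicular to up and forward
    y = cross(z, x)                    # recomputed local y (already unit length)
    return (x, y, z)                   # rows are the local axes in world space
```

The rows would still have to be converted back into Euler angles in the actor's own rotation order before being written to the rotation dials, which is a separate step.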
I made 'A's and 'B's in Calculus, but it was more than 20 years ago! (:-D)
===Underdog===
My python page
My ShareCG freebies
underdog, my feeling is that the quaternion, which encodes a rotation angle about an axis (four components in all), is exactly what you would want to manipulate to counteract the "rolling" effect, provided the quaternion axis is coincident with the object-origin-to-camera (or point-at target) vector. You might have to experiment and/or do lots of research to understand the relationship between the quaternion rotation angle theta and the world orientation of the camera. I'm not entirely convinced it could be as easy as resetting theta to zero without touching the axis (in this specific case), but that shouldn't be too hard to test.
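The "reset theta about the point-at axis" experiment can be phrased as a swing-twist decomposition: split the quaternion into the rotation purely about the chosen axis (the twist, i.e. the roll) and the remainder (the swing), then discard the twist. A sketch, assuming a unit quaternion in (w, x, y, z) order and a unit axis whose projection onto the rotation is nonzero:

```python
import math

def swing_twist(q, axis):
    """Split unit quaternion q = (w, x, y, z) into swing * twist, where
    twist is the rotation purely about the unit 3-vector `axis`.
    Replacing twist with the identity removes the roll about that axis.
    Assumes the projection below is nonzero (no pure-swing input)."""
    w, x, y, z = q
    # Project the quaternion's vector part onto the axis:
    d = x * axis[0] + y * axis[1] + z * axis[2]
    tw, tx, ty, tz = w, d * axis[0], d * axis[1], d * axis[2]
    mag = math.sqrt(tw * tw + tx * tx + ty * ty + tz * tz)
    twist = (tw / mag, tx / mag, ty / mag, tz / mag)
    # swing = q * conjugate(twist):
    cw, cx, cy, cz = twist[0], -twist[1], -twist[2], -twist[3]
    swing = (w * cw - x * cx - y * cy - z * cz,
             w * cx + x * cw + y * cz - z * cy,
             w * cy - x * cz + y * cw + z * cx,
             w * cz + x * cy - y * cx + z * cw)
    return swing, twist
```

For a rotation that is entirely about the chosen axis, the twist comes back equal to the input and the swing collapses to the identity quaternion (1, 0, 0, 0), which is an easy sanity check.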
I used to think I was smart. Now I am not so sure. I am so far out of my depth on this that I think I will just smile and act like I understood some of this. I had 3 semesters of engineering calculus in college, but that was 20 years ago and I have done nothing with it since. I am pretty sure I remember hearing about a "theta" at one time. Hmmm. Anyway, I will get out of the way now. Good luck with your gravity based morphs.
hey ockham! Is this the same kind of problem you solved with your Jiggles script? I seem to recall the target MTs (or other parts) were influenced by the movement (including acceleration) of a "host" body part in absolute space. But I'm not sure if this is the kind of thing gwhicks is after. Anyway, tossing this out in case it rings any bells, strikes any chords, etc, etc. cheers... TG
Tguyus, I looked at ockham's Jiggles script (though I can't run it, being on a Mac, dagnabbit). What I'm looking for here is an instantaneous measurement of the gravity vector's angle relative to the default orientation of the body part or prop. That doesn't require calculating first- or second-order derivatives for velocity and acceleration, or twang/damped oscillations. Specifically, I need to determine, by observation, which axis of rotation of an actor relative to world coordinates should influence a specific morph target. Imagine a static hair prop with morphs for side-side and front-back swing/sway. The front-back swing needs to be controlled by the head bend (x-rotation) if the rest of the figure is in its default orientation. But if you turn the figure 90 degrees left or right, which rotation-axis components do you use to determine the head's attitude relative to the default position? And if you then lie the figure on its side (still facing left or right), the head's axis (origin-to-endpoint vector) may lie in the global YZ plane with a global X rotation of roughly +/- 90 degrees, yet no front-back swing of the hair would be expected; side-side would be involved instead.
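The situation gwhicks describes can be worked the other way round: instead of interpreting the rotation channels directly, transform the world "down" vector into the body part's local frame and read off which local axes gravity acts along. A sketch, assuming `worldmat3` is the normalised 3x3 rotation part of actor.WorldMatrix() in row-major order (that ordering is exactly the undocumented bit, so treat it as an assumption to verify):

```python
def world_down_in_local(worldmat3):
    """Rotate world 'down' (0, -1, 0) into the part's local frame by
    applying the transpose of the rotation (its inverse, for a pure
    rotation).  The components of the result say how strongly gravity
    acts along each local axis, i.e. which swing morph to drive."""
    down = (0.0, -1.0, 0.0)
    return tuple(sum(worldmat3[r][c] * down[r] for r in range(3))
                 for c in range(3))

# Example: figure rolled 90 degrees about world z.  Gravity now acts
# along local -x (drive the side-side morph), not local -y:
rolled = ((0.0, -1.0, 0.0),
          (1.0,  0.0, 0.0),
          (0.0,  0.0, 1.0))
print(world_down_in_local(rolled))  # close to (-1.0, 0.0, 0.0)
```

In the default pose the same call returns (0, -1, 0), so the z-component of the result maps naturally to front-back swing and the x-component to side-side, regardless of how the whole figure has been turned or laid down.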
When developing Jiggles I tried to solve this problem but couldn't! So I just limited gravity to up-down motion. The deeper problem for a generalized script is that there's no good way to tell which morph serves which purpose, so the user will have to cut and try so much that the script doesn't save any labor. But if you're going to match the script to one hair prop with a known set of morphs, you can do the experimentation once and hard-wire the relationships.

Possible idea: the last SR of Poser 5 has a new function that could help. Each facet of the mesh can give you a WorldNormal. So you could find in advance a "key facet" for each adjustable section of the hair, and its WorldNormal when the section is maximally affected by gravity. Then during action, you can check its WorldNormal at each moment and adjust the morphs oppositely, to bring that facet back toward its "down" position.
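To make the key-facet idea concrete: store the facet's WorldNormal in the fully-drooped pose, then at each frame measure how far the current normal has swung away from it and feed that into the morph dial. The per-facet WorldNormal call itself is as ockham describes it; everything below is an illustrative sketch, not Poser API:

```python
import math

def gravity_morph_value(current_normal, reference_down_normal):
    """0.0 when the key facet's normal matches its stored fully-drooped
    direction, rising to 1.0 when it is 90 degrees away.  The caller
    scales (and possibly clamps) this into the actual morph dial value."""
    dot = sum(a * b for a, b in zip(current_normal, reference_down_normal))
    mags = (math.sqrt(sum(a * a for a in current_normal)) *
            math.sqrt(sum(b * b for b in reference_down_normal)))
    # Clamp before acos to survive floating-point rounding:
    return math.acos(max(-1.0, min(1.0, dot / mags))) / (math.pi / 2.0)
```

The angle-based measure behaves better than the raw dot product near the fully-drooped position, where the dot product flattens out and gives almost no signal to adjust against.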
My python page
My ShareCG freebies
Thanks ockham, that's a great start! I've posted a modified version in your latest Jiggles thread, with multi-frame influence of the six most appropriate gravity-influenced breast morphs controlled by the chest morph values, while analysing the collar morphs for their effect. Pity the analysis takes so long per frame. Since you're saving analyses to files, maybe the analysis of the average influence vector for each morph/body-part combo could be converted back to the body part's local space and saved to a file. The appropriate morph-vector files could then be read and converted to world space for each frame's morph setting, speeding up the whole process.
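The save-and-restore round trip amounts to multiplying by the rotation matrix one way and by its transpose the other (valid once the matrix has been normalised to a pure rotation). A sketch with hypothetical helper names, again assuming a row-major 3x3:

```python
def to_local(rot3, v):
    """World -> local: multiply by the transpose of the rotation.
    This is the direction used when saving the analysed influence
    vector to a file in the body part's own frame."""
    return tuple(sum(rot3[r][c] * v[r] for r in range(3)) for c in range(3))

def to_world(rot3, v):
    """Local -> world: multiply by the rotation itself.  This is the
    direction used when re-applying the saved vector at each frame."""
    return tuple(sum(rot3[r][c] * v[c] for c in range(3)) for r in range(3))

# Round trip: a vector saved in local space comes back unchanged after
# converting to world space with the same (pure) rotation.
```

The per-frame cost then drops to one matrix-vector multiply per morph, instead of re-running the whole vertex analysis.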
Has anyone attempted to make use of (let alone understand) the actor local and world quaternions exposed in the Python interface? By way of explanation: I'm investigating ways to identify the angular change in attitude of body parts and props with respect to the Poser coordinate system, in order to automatically control morph targets to simulate gravity effects. I'm being stymied so far by A) lack of Poser-relevant documentation (i.e. how the actor.WorldMatrix() tuple is ordered), B) synaptic disaffection due to a severe throat infection, and C) a suspicion that there is no Python interface to the actor rotation ordering (XYZ or YZX, etc.) necessary to correctly interpret the quaternion coordinate systems. My university training in linear algebra and spherical astronomy is just too long ago to make the appropriate path obvious [ah, the perils of age and a serious sinus headache :-( ]
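For the record, the matrixfaq page's trace method converts a rotation matrix to a quaternion, and the scaled-scene problem can be handled by normalising each column first. A sketch, assuming the 3x3 rotation block is available in row-major order (the very WorldMatrix() ordering question that needs verifying) and that the trace is greater than -1; the other branches of the FAQ method are omitted for brevity:

```python
import math

def matrix_to_quaternion(m):
    """Unit quaternion (w, x, y, z) from a 3x3 matrix whose columns may
    carry scale: normalise each column, then apply the trace method.
    Only the trace > -1 branch is shown; see the matrix/quaternion FAQ
    for the numerically robust handling of the other cases."""
    # Strip scale column by column:
    cols = []
    for c in range(3):
        mag = math.sqrt(sum(m[r][c] ** 2 for r in range(3)))
        cols.append([m[r][c] / mag for r in range(3)])
    r = [[cols[c][row] for c in range(3)] for row in range(3)]  # back to rows
    w = math.sqrt(max(0.0, 1.0 + r[0][0] + r[1][1] + r[2][2])) / 2.0
    return (w,
            (r[2][1] - r[1][2]) / (4.0 * w),
            (r[0][2] - r[2][0]) / (4.0 * w),
            (r[1][0] - r[0][1]) / (4.0 * w))
```

Note this only recovers the rotation as a whole; turning the quaternion back into dial values would still need the actor's Euler rotation order, which is the part with no obvious Python interface.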
My ShareCG Stuff