
Renderosity Forums / Poser Python Scripting





Subject: Moving morphs between different figures


Spanki ( ) posted Mon, 15 January 2007 at 3:52 PM · edited Mon, 15 January 2007 at 3:54 PM

Just to clarify, while the point intersection problem is much better now, I've still seen it with other sample meshes, but yes, the edge intersection issue is more of a problem currently.

The problem with that code I have commented out (and the link to the code you showed me) is that they were originally written for 2D operations.  In 2D, a cross-product produces a single value, but in 3D, it produces a vector.  I just haven't looked to see if that code will be needed yet, so I didn't bother trying to fix it.
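For anyone following the thread, that 2D-vs-3D difference is easy to demonstrate. This sketch uses NumPy (Numeric's modern successor) purely for illustration; it is not code from the script:

```python
import numpy as np

# In 2D, the "cross product" is a single scalar (the signed area of the
# parallelogram the two vectors span):
def cross2d(a, b):
    return a[0] * b[1] - a[1] * b[0]

# In 3D, the cross product is itself a vector, perpendicular to both inputs:
a3 = np.array([1.0, 0.0, 0.0])
b3 = np.array([0.0, 1.0, 0.0])

print(cross2d([1.0, 0.0], [0.0, 1.0]))  # 1.0
print(np.cross(a3, b3))                 # [0. 0. 1.]
```

So 2D intersection code that treats the cross result as one number needs rethinking before it can be reused in 3D.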

The weighting code will need parts of that (it needs to get the intersection of ray->edge), but the rest should be easy.

As for things you could work on... hmmm... one thing that is not necessarily apparent on simple props, but that I think is still going to need to be done for more complex meshes, is what I mentioned earlier (it might be back a page or so now): allowing for 'multiple' hits and then taking the best (closest-distance) one.

Currently, on things like that head, there are rays cast out of the ears/nose/lips at odd angles that might well get a 'hit' on a cheek somewhere within the testing region.  And since we stop looking once we get a hit, it just depends on which polygon gets tested first.

So we're either going to have to:

  1. sort the test polygons for every vertex
  2. allow for more than one hit, track the distance to each, and then take the best one
  3. only test polygons within some range of the vertex

...option 3 might be the easiest/fastest approach, but does make some assumptions about how the meshes correlate to each other.  Anything we do here is likely to tack on some processing time, but I'm not sure how that can be helped.
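Option 2 in rough form might look something like the following. The `closest_hit` name and the hit-list layout are purely illustrative, not from the actual script:

```python
import math

def closest_hit(vertex, hits):
    """Collect every (polygon_index, intersection_point) pair the ray test
    produced, then keep only the nearest intersection to the vertex.
    The tuple layout here is a guess for illustration purposes."""
    best_poly, best_dist = -1, float('inf')
    for poly_index, point in hits:
        d = math.dist(vertex, point)
        if d < best_dist:
            best_poly, best_dist = poly_index, d
    return best_poly, best_dist

# A ray that grazes a cheek at distance 2.2 but properly hits a lip at 0.1:
poly, dist = closest_hit((0.0, 0.0, 0.0),
                         [(7, (0.0, 0.0, 2.2)), (3, (0.1, 0.0, 0.0))])
print(poly, dist)  # 3 0.1
```

With this approach the order in which polygons are tested no longer matters; only the distances do.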

More divisions might also help, except that that conflicts with the current 'longest-edge-length-as-region-expansion' implementation at some point (regions can't be any smaller than plus or minus that value).  At the same time, the smaller you make the regions, the more likely you are to run into that region-spanning-polygon issue to start with - catch 22 :).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Mon, 15 January 2007 at 4:02 PM

"...option 3 might be the easiest/fastest approach,"

But might not be the best approach... this could still pick up false hits.  The best 'assumption' to make about the correlation between meshes is probably that the 'closest' hit is the best one, which gets us back to method 2 or 1.  I think I'd favor 2.



Cage ( ) posted Mon, 15 January 2007 at 10:59 PM

file_365701.doc

Okay.  I made a number of changes, which are listed at the bottom of the script.  Most of them pertain to implementing Numeric.  I've put Numeric.equal in to check your vertex matching case.  It seems to work nicely, but you have a better background to judge that sort of thing.  The normals check I put in to compare vertex and triangle normals seems to help speed things up, and I did use a version of your number two option, above.  The RAM loss seems to be less extreme now.  I think a lot of that comes from not going through the edge bounds check as often.  I think we may be better off if we avoid that as much as possible.  So the vertex check breaks the poly loop before the bounds check.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Spanki ( ) posted Mon, 15 January 2007 at 11:30 PM

Thanks.

Hmm.. I've actually got some questions/comments about several of those changes, but I'm busy with another project at the moment.  I'll try to test this some before I hit the sack, or tomorrow.

Just a quick observation for now... 

  • Converted longestedge to use Numeric array check

...and increased the computation time.  The edgelengths() routine was not being called in the previous script and now it's only being called to build an array that the longestedge() routine scans through.  The way I had it before, the array (and storage needed to store it) was not allocated at all.

In other words, the current version uses all the time the old version did to scan through determining edge lengths, but now, instead of being done, it also:

a) allocates memory to store the list.
b) scans through that list (using the Numeric function) again
c) just frees it (we hope) afterwards

...methinks you may have overthought that one 😄.
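The difference being described can be sketched like this (the edge data is made up, just to show the two shapes of the computation):

```python
import math

# Three edges of a thin triangle, as (start, end) vertex pairs:
edges = [((0, 0, 0), (1, 0, 0)), ((1, 0, 0), (1, 2, 0)), ((1, 2, 0), (0, 0, 0))]

# One-pass version: track the max as you go; no intermediate list is allocated.
longest = max(math.dist(a, b) for a, b in edges)

# Build-then-scan version: allocates a list of every edge length, then scans
# it again - same answer, but extra memory plus an extra pass.
lengths = [math.dist(a, b) for a, b in edges]
longest_too = max(lengths)

print(longest == longest_too)  # True
```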



Cage ( ) posted Mon, 15 January 2007 at 11:39 PM

Actually, I thought about asking about the usefulness of that one and forgot to make a note to myself about it.  Hmm.  I consider the normals test in linecast to be more important....

I guess the question is whether we'll need the edgelengths later in the process.  If we don't, then what you say is completely correct.  Either way is fine with me.

I'm not wed to any changes I've made, and I don't claim any of them are definite improvements.



Spanki ( ) posted Mon, 15 January 2007 at 11:42 PM

...at first glance, I'm also concerned about this addition at the top of the loop in linecast()...

if Numeric.dot(vnorms[vi],pnorms[pdx]) < 0.9: continue #Compare the normals

...I don't think that's a valid thing to do.  If you look right below there, you'll see that this same dot_product is already being computed as the ndota variable.  And then there's already a test below that...

if ndota > 0.0:
    do the bounds check stuff here

...so we already have what I believe to be the correct test, but you've also excluded the range of values from > 0.0 through < 0.9, so the only time you even get to the code below there is if the value is >= 0.9

I'd be a little surprised if this is working correctly, but I'll test it when I get a chance.



Spanki ( ) posted Mon, 15 January 2007 at 11:43 PM

Heh.. cross-posted... is that the normals test you were referring to?



Cage ( ) posted Mon, 15 January 2007 at 11:45 PM · edited Mon, 15 January 2007 at 11:47 PM

That's the one.  I have no idea whether that's valid or not.  I am, as I point out, math-stupid.  :-P

Yet it seems to work, and things are faster....

Hmm.



Spanki ( ) posted Mon, 15 January 2007 at 11:48 PM

It's not at all surprising that it speeds things up - if it turns out to be valid, then great! 😄



Spanki ( ) posted Mon, 15 January 2007 at 11:58 PM · edited Tue, 16 January 2007 at 12:02 AM

..Just to clarify a bit... what that dot product between normals does is tell you something about the angle between them...

1.0 - they are exactly the same direction vector
0.0 - the two vectors are 90deg apart (perpendicular to each other)
-1.0 - the vectors point in opposite directions.

...so in this case, we have two direction vectors (normals).  The first is the normal of the ray being cast and the second is the normal of the polygon we're trying to hit.

Since the polygon normal is perpendicular to the polygon itself, if the result of the dot is 1.0, then the ray and the polygon are perpendicular (pretty good chance of a hit).  If the result is 0.0, then the ray and the polygon are parallel, so they'd not intersect (unless the ray happened to be on the same plane).  Anything less than 0.0 means that the ray would hit the wrong face of the polygon.

So... with your >= 0.9 check, you're excluding all the other angles between parallel and nearly perpendicular.  It could be that your test-case is working, but I'm pretty sure that's not going to work with different sets of meshes.
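Those reference values are easy to sanity-check with a few unit vectors (NumPy shown here instead of Numeric, purely for illustration):

```python
import numpy as np

ray = np.array([0.0, 0.0, 1.0])  # direction of the cast ray

print(np.dot(ray, [0.0, 0.0, 1.0]))   #  1.0 - same direction
print(np.dot(ray, [1.0, 0.0, 0.0]))   #  0.0 - 90deg apart (perpendicular)
print(np.dot(ray, [0.0, 0.0, -1.0]))  # -1.0 - opposite directions

# A 0.9 cutoff only keeps directions within about 26 degrees of alignment:
print(np.degrees(np.arccos(0.9)))     # ~25.8
```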



Cage ( ) posted Tue, 16 January 2007 at 12:18 AM · edited Tue, 16 January 2007 at 12:22 AM

I think I misunderstood.  A page or two ago, I thought you mentioned using a normals comparison to screen out bad in-bounds positives.  I got the idea that this should be run before the in-bounds check, to actually screen out the positives before we waste time on them.  Foolishly, I didn't realize I was duplicating ndota.  D'oh!

But by raising the bar on ndota and removing the break in the loop when we find a match, I've gotten speed gains and reduced RAM loss without any apparent loss in matches.  But the question of whether it really does present a problem for generating correct matches might be something you'd be better at determining.

In theory, is there a definite need to check bounds BEFORE we determine whether the triangle needs to be rejected because of mis-matched normals?  That's where I'm apparently confused....

Hmm, hmm....

Another cross-post.

I see.  I'm like Homer Simpson in the Mr. X episode, when he's trying to teach himself to use the computer.  Hmm.  I knew the dot returned an angle, but I think I mixed up the parallel and perpendicular.  So... how would one "raise the bar" on the ndota normals check???



Spanki ( ) posted Tue, 16 January 2007 at 12:39 AM

Yeah, I was confused about the check as well, until I thought about it some and realized why a value of 0.0 meant 'parallel', when what it should mean is 90deg.  The answer to that is that we're testing the normal of the plane and not the plane angle (and the normal is perpendicular to the plane).

Anyway, I actually just changed that test in the last version I sent you... it used to just test for 0.0, but now it tests for negative values as well, so it's now throwing out polygons that face the wrong direction (before getting to the bounds check).  I think that's as far as we can raise that bar.

If you test with 2 cubes, I think you'll find that the 0.9 test you have in there right now will cause no matches to occur at any corner.
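That prediction about cube corners can be checked numerically. On a welded cube, a corner vertex normal is the re-normalized average of the three face normals meeting there, so its dot with any one of those face normals is only 1/sqrt(3), about 0.577, well under 0.9 (NumPy used here for illustration only):

```python
import numpy as np

# The three face normals that meet at one corner of an axis-aligned cube:
faces = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

# Welded corner vertex normal: average of the adjacent face normals,
# re-normalized to unit length.
vnorm = faces.sum(axis=0)
vnorm /= np.linalg.norm(vnorm)

# Its dot with each face normal comes out to 1/sqrt(3) ~ 0.577...
print(np.dot(faces, vnorm))
# ...so an 'ndota >= 0.9' style test rejects every polygon at the corner.
print((np.dot(faces, vnorm) < 0.9).all())  # True
```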



Spanki ( ) posted Tue, 16 January 2007 at 12:45 AM

...arguably, extremely steep angles could maybe be thrown out too (unless you're working with 2 knife blades or something), but due to the unknown nature of the relative shapes of meshes, it might not be safe to raise that above 0.0 (anything beyond parallel is a potential for a hit).



Cage ( ) posted Tue, 16 January 2007 at 1:16 AM

Aha.  I have failed in my assignment, then, as I understood it.  :(  My apologies.

But it tested well with my cubes and spheres and heads and bodysuits.  I guess this is the trouble with my necessary approach to such things.  I lack the theory and experience to really know what's happening, and I rely on test results which aren't necessarily telling me what I think they are.  Hmm.  

So that "update" may actually contain almost nothing of worth.  Perhaps I should re-read this thread to try to get a handle on things.

Hmm. 



Cage ( ) posted Tue, 16 January 2007 at 1:57 AM

Ah.  The cube prop with which I tested was split at the hard edges, so it worked.  I tested with a welded cube, and the test failed, as you predicted.  Feh.

It's a drag, being a bit slow.

Here's my question, then: is there any feasible way to compare the vertex normals and the triangle normals before we launch into testing every triangle?  To see if they match up within a tolerable range somehow?



Spanki ( ) posted Tue, 16 January 2007 at 3:15 AM

Uhm... that "if ndota > 0.0: # normal is not parallel to plane" line is doing just that.  I don't know of any way to test each vertex normal against each polygon normal except to loop through them and do it - which is what is being done 😄.

Having said that, I'm sure there are other approaches that would speed things up.  Let's take a step or two back...

  • The region-spanning polygon problem.

This is only a problem in the case where it happens to be the polygon that we need for some vertex to test against.  In all other cases, it's just adding to the number of polygons to test against.

  • The false/multiple hit problem.

This is only a problem based on the order polygons are tested in and the more polygons we test, the more likely we are to find false hits.

  • The vertex normal and polygon normal should be facing similar directions problem (make sure we're not hitting (or even testing) a polygon that's facing the wrong direction).

This used to be a problem, but should now be fixed (as of the previous script).

So... in general:

a) we want to minimize the polygons being tested against.
b) And preferably starting with the closest polygons.

Once we get to hi-res meshes, the typical situation will be that the matching polygon will actually be found in a fairly confined space near the vertex - within only a few polygons, in fact.

I wish I had a working knowledge of binary trees, because it seems like that would be a good approach - I've just never worked with them before (odd as that seems).  In lieu of that, another approach would be to create multiple sets of regions...

Let's assume that the largest region we should have to deal with is the 2-division one we're using now.  The idea is basically to create that one (pretty much just how it's being created right now, with the longest-edge-length inflation/padding).  But then create a 4-division region set as well (with little or no inflation).

The linecast() code would start with the 4-division set and use the current 'multiple hits, find closest' code in place.  Then if there's no hit, it jumps up into the 2-division set and either just takes the first hit it finds, or does the multi-hit-closest again.

By starting with a smaller set of polygons, we accomplish a better solution for both a and b goals.  We only jump up into the larger set if we fail at the smaller set.  Note that this also reduces the chances of false hits (we're starting our tests with the closest, most likely to be correct polygons) and gets rid of the longest-edge-length overhead until/unless it's necessary.

A couple other notes/observations about implementing this...

  • many of the 4-division regions may be 'empty' (inside the mesh somewhere), but that doesn't hurt anything.

  • if you had to jump up to the 2-div set, you'd probably want to exclude the polygons that you already tested in the 4-div pass... you could either make a 2-div sized table and mark them off as you test, or just check for existence in the appropriate 4-div region list before (re)testing the 2-div polys.

  • if you're concerned with memory for the extra tables, you could even just make the 4-div ones to start with.  When done, if the xpolys list has any misses in it, delete the 4-div set and make a 2-div set and run through that to just pick up the misses.



Spanki ( ) posted Tue, 16 January 2007 at 10:10 AM

Attached Link: http://www.mpi-sb.mpg.de/~blanz/

Looks like we need this guy on our team :).



Cage ( ) posted Tue, 16 January 2007 at 3:51 PM

Two ideas.

  1. Set ndota high, but keep a pre-processed list of "hard edges" in the source geometry, where the polygons join at sharper angles.  Check a poly against the hard edges list to determine whether to use the higher ndota settings or lower ones.  Perhaps this could be varied to allow some graduated handling depending on the "sharpness" of the edge join.

  2. Change the loop structure with the polygon loop.  After finding "point", create a bounding box around point and check polygons within that box.  If no match is found within the box, have some expanded backup handling.

Probably both bad ideas, right?  I'm really trying to puzzle out a solution, here....



Spanki ( ) posted Tue, 16 January 2007 at 4:57 PM

I don't think #1 would buy you much.  I suggested trying a cube as an extreme (yet common) example, just to illustrate the problem.  The larger problem is that we don't know ahead of time what relationship the two meshes have to one another.  If it's two of the same mesh, then you could do something like that (then again, you wouldn't need this script if it was two of the same mesh :)).  But if it's two different meshes, what you need to know is how the angles of the polygons on one mesh relate to the vertex normals of the other mesh - something we won't know until we look.

I'm not sure I follow you on #2... the spot in the code where 'point' is determined is... specific to the particular polygon being tested (which, you may not have seen this in action yet, but the point may be 32 yards off in space away from the meshes).

Did you see my 2-div/4-div suggestion above?



Cage ( ) posted Tue, 16 January 2007 at 5:38 PM · edited Tue, 16 January 2007 at 5:41 PM

"Did you see my 2-div/4-div suggestion above?"

I did see it.  If it's feasible, I like it.  I can't determine whether it's feasible.  Do you think it's something worth starting?

My number two wouldn't work as described.  The basic idea would be to create a separate bounding box region around the vertex or the point of intersection, to try to screen the polys we actually check for that vertex.  I'm not sure whether my suggestion would be any better than trying to sort polygons by distance for every vertex.  We'd have to fill the bounding box each time....

Hmm.  Hmm, hmm.  Well, I wrote code to find neighbor polygons and hard edges, anyway.  :-P  Maybe we'll need it some day.

If you are suggesting trying the 2-div/4-div, what can I do toward getting it together?  And what, if anything, can be kept of the mess I made yesterday?  Do you think the use of Numeric is worth keeping?  Obviously the changes to longestedge and ndota (and presumably the break in the linecast loop) need to be changed back.  I'm not sure what I'm using as the WIP script, now.

"Looks like we need this guy on our team :)."

Hoofa.  Ut.

Ut!



Spanki ( ) posted Tue, 16 January 2007 at 6:13 PM

I was hoping to get a chance to play around with your latest changes today, but another job has me occupied.  So far, I haven't made any edits to the last thing you posted, so whatever you have is the latest.

I think the multi-div thing is feasible.  All you'd really need to do to try it out in rough form is...

  1. change numdivs to 4 and don't do the inflate
  2. At the end of linecast() where you loop through the xpolys list... you know which vertices didn't get a match (if any), so we can use that list (with a similar "if xpolys[p] == -1" test) or build another list from that.
  3. rebuild the region lists with numdivs at 2 and do the inflate this time (might as well)
  4. maybe make a copy of linecast() and strip out everything you don't need for the second pass (it only needs to check vertices that failed on the first pass).

In theory, the 4-div first pass should be 4 times as fast as the current 2-div pass (it's checking potentially 4 times fewer polys per vertex), but that will depend a lot on the layout of the mesh and the relative polygon size.  If the 4-div pass can pick up the majority of the hits, it should be a nice overall speed-up, even though we're making two passes.  I.e., if the 4-div pass can catch 1250 hits and only misses a dozen or so, then the 2-div pass is only run on the dozen misses.
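Those four steps might be wired together roughly like this. Every name here (`two_pass_cast`, `build_regions`, the `-1` miss marker) is a stand-in for the script's real routines, not actual code from it:

```python
def two_pass_cast(verts, polys, linecast, build_regions):
    """Rough two-pass driver: a fine 4-division pass first, then a coarse,
    inflated 2-division pass over just the misses.  'linecast' and
    'build_regions' are hypothetical stand-ins, and -1 is assumed to mark
    a vertex with no match (as in the "xpolys[p] == -1" test)."""
    regions = build_regions(polys, numdivs=4, inflate=False)           # step 1
    xpolys = linecast(verts, polys, regions)                           # 4-div pass
    misses = [i for i, p in enumerate(xpolys) if p == -1]              # step 2
    if misses:
        regions = build_regions(polys, numdivs=2, inflate=True)        # step 3
        refill = linecast([verts[i] for i in misses], polys, regions)  # step 4
        for i, p in zip(misses, refill):
            xpolys[i] = p
    return xpolys
```

The second pass only ever sees the vertices the first pass failed on, which is where the hoped-for speed-up comes from.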



Spanki ( ) posted Tue, 16 January 2007 at 6:19 PM

As an aside... if you get a second linecast() type routine written to only fill in the blanks, you could try out your hi/lo ndota test as well (try it set high, and set it back to > 0 for the second pass).



Spanki ( ) posted Tue, 16 January 2007 at 6:22 PM

Actually, what I'd do is split the region-building/freeing and the reporting code off from the looping code of linecast().  Then just pass linecast() the vertex list to check, and the region/normals/etc lists.  Have linecast() return the xpolys list to indicate what it found.



Cage ( ) posted Tue, 16 January 2007 at 10:03 PM

So split up linecast to re-use the loop portion.  Sounds good.

The high ndota settings really were fast and it at least looked like it returned complete and correct matches for all the meshes except the steep-edges cube.  I assume something like that could be useful somehow, even if only when a user specifies the option, knowing that the meshes would work.  In JoePublic's case, for instance, the meshes would be adequately similar and this would be faster going both uphill and down, if my tests were showing me what I think they were.

But I'm perseverating on a point which is non-essential.  :)

I'll test the multi-div idea.  I'm trying to let it settle in my mind, this time, so I don't go off half-cocked again.  :-P



Spanki ( ) posted Tue, 16 January 2007 at 10:42 PM

Of course it was fast - you were culling out 90% of possible valid angles :).

Maybe make a version that just prints out the ndota value when a match is found (or found below 0.5 or whatever).  Then load up V3 and V1 and let it run while you sleep (hehe).  When you get up in the morning, see how many were below that threshold - you might be surprised (or, I will - either way is fine 😄).



Cage ( ) posted Tue, 16 January 2007 at 11:52 PM · edited Tue, 16 January 2007 at 11:55 PM

file_365814.doc

This one runs two passes, the first with the high threshold, the second with the regular threshold.  The first pass runs fast and seems to catch most cases in my tests.  The second pass catches those remaining hard edges.  I haven't tested this extensively, but it looks like it returns the best of both worlds between my results of last night and the regular results we've been seeing.  That is, assuming it's returning "true" matches....

I set this up as you (seem to have) suggested, before testing the more complicated multi-div idea.  I have no idea whether these are encouraging results, or not, however.  I seem to be wrong quite often.  :-P

I'm still foolishly hoping a comparatively simple idea may have merits....

It also sets up conditions to use either your longestedge loop or the edgelengths array to find the longest edge.  If it emerges that we do need the edgelengths array for something else, that may actually be useful.

And it contains the semi-developed hard edges check.



Spanki ( ) posted Wed, 17 January 2007 at 12:22 AM

Cool.  I'll try to take a little time to play with that tonight.  It's not necessarily what I'd label 'foolish', btw.  It's a nice speed-up if it works.  I'm just reluctant to rely on it until it's had more testing with real-world cases.  The multi-div approach will also cull out huge portions of polygon tests (on hi-res/small-poly meshes, at least).



Spanki ( ) posted Wed, 17 January 2007 at 1:30 AM

Hmmm.. this looks suspect:

if point_distance(match,v) <= point_distance(point,v):

...I think the logic around that test might be wrong.



Cage ( ) posted Wed, 17 January 2007 at 1:33 AM

"...I think the logic around that test might be wrong."

I wouldn't be a bit surprised if it were....



Spanki ( ) posted Wed, 17 January 2007 at 11:16 PM

file_365944.doc

Ok, I've made some changes.  I am in the process of working on the weighting code, but I wanted to get this update to you before I go tearing things apart :).  Don't worry about optimizing anything inside the linecast_loop() or check_bounds() routines until I figure out how the weighting code needs to integrate.

BTW, I ran this on the head again (about 1250 verts and polys) and the time went down from ~90 mins to ~30 mins.  I then disabled 'testing' and it dropped to ~18 mins.  So we've made improvements, but I'm not too concerned either way until we get the basic functionality working as expected (reliably).



Cage ( ) posted Wed, 17 January 2007 at 11:25 PM

Err... does that mean my loopy effort with the higher setting for ndota actually helps somehow?  Umm.

I was testing earlier, and I note that the organization of the data at the start can be quite slow with heavy meshes.  I'm not sure what can be done about that.  I loaded the V1 and V3 heads, and waited fifteen minutes.  With V3 as the source (polygon processing), the bounding boxes weren't even created in testing mode after fifteen minutes.  With V1 as source, the bounding boxes were created after a couple of minutes, but then Poser put me on hold again while processing V3 for only the vertices.  So I'm a bit distressed, wondering if any of this will even be able to handle the meshes which are ultimately intended to be processed....

18 minutes?  Still rather long, but better.  I wish I had some idea how to optimize all of this.  I wish I had some idea whether Python will even be fast enough to cooperate, in the end.

"Don't Panic," I tell myself....



Spanki ( ) posted Wed, 17 January 2007 at 11:38 PM

Speaking of the weighting code...

Just to recap,

Source Mesh - the mesh containing the morphs we want to transfer...
Target Mesh - the mesh that will have the morphs created in it.

...so far, what we've been doing is determining the surface (triangle) of the source mesh that correlates to the vertices in the target mesh.  The plan is to come up with a 'weighting' value (0.0 - 1.0) for each of the vertices of that triangle.

I'll have to go back several pages to refresh my own memory on 'how' the weighting is computed 😄, but the results of that will be the correlation data.

I think what we want to end up with for correlation data is - for every vertex in the target mesh, a triplet of vertex indices (integer vertex indices that reference the 3 points of the triangle in the source mesh) with corresponding triplet of weights (floats, one per index).

I'm still a bit fuzzy on this python stuff, so I'm not clear if we should (or can, even) have a multi-dimensional array with ints/floats mixed, or just simplify it by using 2 arrays.  Using 2 separate arrays is easy enough to deal with in the code, but I'm not sure how it affects reading/writing to a flatfile... opinions?
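The two-parallel-array layout discussed above could be sketched roughly like this (a minimal sketch; all names here are hypothetical, not taken from the actual script):

```python
# Hypothetical sketch of the two-parallel-array correlation-data layout.
# For each target-mesh vertex: three source-mesh vertex indices (ints)
# and three matching weights (floats).

corr_indices = []   # one [i0, i1, i2] triplet per target vertex
corr_weights = []   # one [w0, w1, w2] triplet per target vertex

def store_correlation(tri_indices, tri_weights):
    """Append one target vertex's hit data; weights should sum to ~1.0."""
    corr_indices.append(list(tri_indices))
    corr_weights.append(list(tri_weights))

def save_flatfile(path):
    """Write both arrays to a simple text flatfile, one vertex per line."""
    with open(path, 'w') as f:
        for idx, wgt in zip(corr_indices, corr_weights):
            f.write('%d %d %d %f %f %f\n' % (tuple(idx) + tuple(wgt)))

# Example: target vertex correlated to source triangle (12, 13, 20)
store_correlation((12, 13, 20), (0.5, 0.25, 0.25))
```

Keeping the ints and floats in separate arrays sidesteps the mixed-type question entirely, and writing them side by side per line keeps the flatfile trivially parseable.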

[I'm still not convinced that reusable correlation data can be generated, btw... but I don't have any problem working towards that goal 😄 ].



Cage ( ) posted Wed, 17 January 2007 at 11:46 PM

It seems to me that we'll need a point on each edge of each triangle, too, which is where I expected to use the edgelengths array.  Does your min_line_distance make that unnecessary?

A Numeric array will not allow the mixing of ints and floats, as far as I can tell.  A homemade array using lists would permit it.

I'm not sure what you mean by "flatfile"....

A triplet of weights?  Now I'm confused.  I thought we just wanted one weight value per index, which would be used to adjust the delta for the vertex.  I'm used to thinking of weights as I've encountered them for bending in Blender, so I may have misunderstood you at the outset.

I'll have to understand what you're planning a bit better before I can grasp what might or might not work....
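For what it's worth, a "triplet of weights" in this sense sounds like barycentric coordinates: one weight per triangle corner, summing to 1.0, so each corner's morph delta contributes proportionally.  A minimal sketch of the standard formula (not necessarily how the script will end up computing it):

```python
# Barycentric weights of point p inside triangle (a, b, c) in 3D.
# Standard formula; illustrative only -- the actual script may differ.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def barycentric_weights(p, a, b, c):
    """Return (u, v, w) weights for corners a, b, c; u + v + w == 1.0."""
    v0, v1, v2 = sub(b, a), sub(c, a), sub(p, a)
    d00, d01, d11 = dot(v0, v0), dot(v0, v1), dot(v1, v1)
    d20, d21 = dot(v2, v0), dot(v2, v1)
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    u = 1.0 - v - w
    return u, v, w
```

A vertex sitting exactly on corner a would come out as (1.0, 0.0, 0.0), which matches the "one weight per index" intuition: each of the triangle's three vertices gets its own weight.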



Spanki ( ) posted Wed, 17 January 2007 at 11:48 PM

cross-posted...

Yeah, as I mentioned, I'm really not overly concerned with optimizing it until we're getting reliable/valid results.  It doesn't do any good to tear apart and chop up code to speed things up if the results aren't valid.  Hell, if the generated correlation data IS generally reusable, then I don't care if it takes 8 hours (while I sleep).  Once we have the data, the morph-generation part of it will literally take just seconds.

And yes, your loopy idea was a good one :).  You'll also get faster results by changing numdivs to 4 instead of 2 (my results above were with 4-div regions).



Cage ( ) posted Wed, 17 January 2007 at 11:52 PM

Numeric is supposed to be faster for math functions, and I wonder if converting the new min_line_distance to Numeric might help at all.  Dot product, vector subtraction, and division are all available in Numeric....

So what about your multi-div concept?  Should that be implemented as well, or is it now unnecessary?



Spanki ( ) posted Thu, 18 January 2007 at 12:02 AM

Each vertex in the target mesh will have data about which surface (triangle) it intersected.  The code does this currently.

The next step is to:

  1. Store a list of the vertices that make up that triangle
  2. Compute and store a weight value for each vertex of that triangle (each of those vertices in #1).

...this information will be in memory, but we could also write it out to a text-file, if it turns out that it's valid (ie, if it's valid with the worldspace flag set to 0, then it only ever needs to be computed once, for these two meshes.  If you have to use worldspace and manually scale/position the meshes to make the data valid for some morph area (like nose morphs), then the data has to be computed multiple times for these two meshes).

So, given the data listed in #1/#2 above, we can now transfer morphs from the source mesh to the target mesh...

for each vertex in the target mesh
    lookup the 3 vertex indices from #1
    for each of those indices
        if morph delta exists for this vertex in source mesh (sourcedelta), for this morph
            delta for this vertex in target mesh += sourcedelta * weight_for_this_vertex
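The pseudocode above might render into Python roughly like this (a hedged sketch; corr_indices, corr_weights, and source_deltas are hypothetical names for the correlation data and the per-morph delta lookup, not names from the actual script):

```python
# Hypothetical Python rendering of the transfer loop above.
# corr_indices / corr_weights hold the per-target-vertex correlation
# data; source_deltas maps a source vertex index to its (dx, dy, dz)
# morph delta (a missing entry means no delta for that vertex).

def transfer_morph(corr_indices, corr_weights, source_deltas):
    target_deltas = []
    for indices, weights in zip(corr_indices, corr_weights):
        dx = dy = dz = 0.0
        for idx, w in zip(indices, weights):
            if idx in source_deltas:        # delta exists for this vertex
                sdx, sdy, sdz = source_deltas[idx]
                dx += sdx * w
                dy += sdy * w
                dz += sdz * w
        target_deltas.append((dx, dy, dz))
    return target_deltas
```

This is why the morph-generation step is so fast once the correlation data exists: it's just one weighted sum per target vertex, with no geometry tests at all.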
     



Spanki ( ) posted Thu, 18 January 2007 at 12:09 AM

The multi-div idea should still help, but differently from what I first recommended...  instead of doing a 4-div then a 2-div, what we need is a 4-div without inflate_by_longest_edge and then a 4-div with inflate_by_longest_edge.

It occurred to me that we don't really need to jump up to the next div size, we just need to include the inflate for the second pass - that way we're still only checking a small number of polys in the second pass, but we make it big enough to account for the region-spanning issue.  I just used 4-div as an example (it's reasonable), but I'd leave that as a user-defined thing... it might need to be 3 or 2, or conversely 5 or higher might even work (the higher, the faster).  Just use the same value for both passes, but don't add the inflate until the second pass.
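The two-pass idea might be sketched like this (hypothetical names; in the real script the inflate amount would presumably come from the mesh's longest edge length):

```python
# Hypothetical sketch of the two-pass "inflated regions" idea: same
# numdivs both passes, but the second pass grows each region box so
# polys spanning region boundaries aren't missed.

def region_box(bounds_min, bounds_max, numdivs, ix, iy, iz, inflate=0.0):
    """Return (lo, hi) corners of one grid cell, optionally inflated."""
    size = [(bounds_max[a] - bounds_min[a]) / numdivs for a in range(3)]
    cell = (ix, iy, iz)
    lo = [bounds_min[a] + cell[a] * size[a] - inflate for a in range(3)]
    hi = [bounds_min[a] + (cell[a] + 1) * size[a] + inflate for a in range(3)]
    return lo, hi

# First pass: tight cells.  Second pass: same cells, inflated (here by
# a made-up longest-edge value of 0.25).
lo1, hi1 = region_box((0, 0, 0), (4, 4, 4), 4, 0, 0, 0)
lo2, hi2 = region_box((0, 0, 0), (4, 4, 4), 4, 0, 0, 0, inflate=0.25)
```

The point of reusing the same div count is that the second pass still only tests the small set of polys near each cell, just with a safety margin added.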



Cage ( ) posted Thu, 18 January 2007 at 12:10 AM

Can't you just add the index to the array as a float, then convert it back to an integer when it's referenced later?  Presumably this would be better than making a separate array just due to the numeric type.

"...then the data has to be computed multiple times for these two meshes)."

Can you explain?  (I seem to be getting dumber as we go; sorry.)



Spanki ( ) posted Thu, 18 January 2007 at 12:31 AM · edited Thu, 18 January 2007 at 12:34 AM

Hehe... first, yeah - we could just store the index as a float if there's some way to convert it back before using it (?), or if it doesn't matter.

re: re-computing the data multiple times for these two meshes.

This is probably best explained by... remember how you started implementing this in your earlier scripts... you found the closest vertex and stored that in a file (your correlation data).  Once you had the file, you didn't have to recompute it anymore.

Then I came along, and my contention is that it will be difficult to generate 'good' correlation data (using local space), due to differences in the shapes of the meshes.

If part of the neck is included on the head of one mesh, but it's cut sharply at the chin on the other, then the tips of their noses might be 2 inches off, let alone just dissimilar.  Even when they are 'generally similar', we haven't really done anything (programmatically) about distinguishing between how the upper and lower lips might be lined up on the two meshes.  Teeth, eyes, noses, ears, even chins can all have these issues.

Since my early posts, I've changed my position on this issue to the extent that, due to the nature of organic morphs, you don't have to exactly line up every feature to get decent results, but you probably still have to line up the meshes differently if you're doing ear morphs vs. lip morphs (obviously it depends on the two meshes in question).

So - if you have to manually line up the meshes, you'd use 'worldspace' and re-compute the correlation data after lining them up.

If you're transferring morphs from hi-res V3 to lo-res V3, then they probably line up well enough that you can do the correlation data in local space and only have to do it once.  I don't recall offhand how similar the V1 and V3 head shapes are, so I can't comment on how well that works for them.



Cage ( ) posted Thu, 18 January 2007 at 12:42 AM

When the index is called up from the array, it can be converted using int(x).  It may have to be converted to float before adding it, using float(x), but that may not be necessary, or it may depend on how you go about adding it.  If you add a list of integers to a float array, that may work out differently than adding a single integer.  Or perhaps not.  Humm.
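The int(x)/float(x) point can be illustrated with the stdlib array module standing in for a Numeric float array (both hold a single element type, so indices have to ride along as floats; the layout here is hypothetical):

```python
# Illustrating the float-storage idea: a single-typed float array can
# carry vertex indices, as long as they're cast back with int() on read.
from array import array

# One row of correlation data: 3 vertex indices + 3 weights, all stored
# as doubles because the array holds only one element type.
row = array('d', [12, 13, 20, 0.5, 0.25, 0.25])

indices = [int(x) for x in row[:3]]   # cast back to int before indexing
weights = list(row[3:])
```

Since vertex indices are small integers, the float round-trip is lossless here (doubles represent integers exactly up to 2**53), so the single-array approach is safe; it's mostly a question of which layout is easier to read and write.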

I follow you on lining up the meshes.  It's more or less what I've been thinking would be handled by having the user position bounding boxes to specify the locations of the features.  The reason I like that idea is that the restricted run for a specific feature could then be limited to comparing the verts and polys within those positioned boxes.  My other thought on lining up the meshes would be to not merge the bounding boxes at the start, and to determine the offset between the centers of the two boxes to align them for the calculations.  The feature bounding boxes would presumably also require an offset comparison....

I'm wondering how many of the calculations could be converted to use Numeric.  It might speed things if the vector math functions were re-written to use the Numeric module.  I'm not certain, however.  I'd have to test it to see if it makes a noticeable difference.  For all I know, invoking Numeric may introduce overhead which would offset the benefits of using it.... 

And the adjusted (and now mis-named) multi-div approach can't really be worked in until you're sure what needs to happen with linecast(), I assume.



Spanki ( ) posted Thu, 18 January 2007 at 12:42 AM

As an aside... we might be mis-communicating due to your goals/needs/expectations vs. mine as well.  For example, you might only be interested in V1->V3, and you already have some idea of how well they line up.  My particular interest is in transferring figure morphs to clothing (which I think this application may be well suited for), but if we can get the head stuff working, I have uses for that as well.

I'm sure you've noticed by now that every time I send you the file, the 'worldspace' flag is set.  Every time you send me the file, it's cleared :).  Not a big deal - at least we're testing different aspects of it.



Cage ( ) posted Thu, 18 January 2007 at 12:47 AM

"I'm sure you've noticed by now that every time I send you the file, the 'worldspace' flag is set.  Every time you send me the file, it's cleared :).  Not a big deal - at least we're testing different aspects of it."

It always makes me panic when my tests fail because of that.  I don't like to line up the props while it runs, because I watch the progress in the preview window.  (I also like to watch disk defragmenters running; Cage is a bit odd.)

I'll try to start setting it back for you.  I think more options are better than fewer.  If we can accommodate both the localspace and the worldspace, it is good.



Spanki ( ) posted Thu, 18 January 2007 at 12:58 AM

The user-supplied bounding-box thing might be useful (I hadn't really thought about that much) - I have some questions about that, but I'd have to hear more about it first (for example, do you mean having them supply ONE set of boxes, surrounding some feature they are interested in? Or having them supply separate boxes labeled ears/nose/lips etc for the same pass?  If the former, then that might be useful as a culling method, but you run the risk of not including some morphed vertices.  If the latter, then I'd have some (many) other concerns).

re: Numeric

To tell the truth, I have no idea.  At some point, I need to go find the docs for it.  It may be faster, but personally I kinda like 'seeing' the code that does the calculations - at least until I'm convinced it's working correctly :).  Numeric may have some optimal methods of accessing the arrays and/or refined algorithms, but it can't do anything magic, so I hadn't been too concerned with it yet.

Let's change the name to 'inflated-regions approach' 😄.  And yes, if you felt like it, you could go ahead and work that in (most of that would be done from inside linecast() and shouldn't really impact linecast_loop()).



Spanki ( ) posted Thu, 18 January 2007 at 1:03 AM

No need to worry about setting it back :)... I generally look over the file before I run it to see what's changed anyway, so I can set it up.  Speaking of which, I just realized that I probably left the longest_edge thing commented out up in source_lists() - sorry.  I was trying some runs with that just hard-coded.



Spanki ( ) posted Thu, 18 January 2007 at 1:07 AM

Also, speaking of variables... have we ruled out any of them yet?  I personally don't see much use for the 'usecenter' code.  If I had my druthers, I'd prune all that out just to clean things up some.  We could also get rid of the 'mixboxes' variable unless the toggle is still needed in some cases in the local space runs (?).  For worldspace, it should always be true, so you could use that.



Spanki ( ) posted Thu, 18 January 2007 at 1:11 AM · edited Thu, 18 January 2007 at 1:12 AM

Heh.. also, btw, once you've read my various lengthy comments in the file, feel free to delete them where appropriate, to clean things up.



Cage ( ) posted Thu, 18 January 2007 at 1:20 AM · edited Thu, 18 January 2007 at 1:23 AM

Usecenter could probably go, although it hasn't even been tested yet.  :-P  Mixboxes would be an option for localspace runs, yes.  I like to keep options open, to the point of cluttering things a bit....  (I learn, gradually, to trim, however; I made no plans to try to introduce any kind of x-symmetry function after it proved a hassle in the earlier vert-to-vert version....)  I haven't been using it, but mixboxes was sort of central to one of the ideas I mentioned above.

The idea behind the Numeric.equal code was really that I was wondering if your vector_equal might run better before sending anything to the bounds check.  My early tests suggested that the edge bounds check was the source of most of the RAM leaking, and I've tended to want to avoid it as much as possible, since then....

Numeric runs off the compiled numpy.dll, so apparently there are speed gains to be had there.  All of this would be faster in a compiled language, I assume.  I'll hold off on Numeric tweaks, then.  It's become my new toy, after having been initially frightened of it.  :)

The response editor is mangling my posts again.  My apologies for any garbled bits.



Spanki ( ) posted Thu, 18 January 2007 at 1:49 AM

Ok.  X-symmetry is an interesting idea, but may be something to look at later.  No problem on mixboxes, but I'd change the code so that if worldspace is set, then the mixboxes code has to take place.

On the Numeric.equal()... that in particular doesn't seem to be working, but yes, we could change it to use a distance calculation of some sort.  As I mentioned down in the comments in check_bounds(), though, I'm leery about leaving that test in there at all, at least until I figure out the best place to integrate the weighting code.  I'm not sure what might or might not be causing leaks, but I don't see anything particularly suspicious in check_bounds() (I wish this was all in C++! - I hate these damned typeless, interpreted languages! 😄 ).

True, Numeric is compiled, so that might speed things up.



Cage ( ) posted Thu, 18 January 2007 at 1:59 AM

I'll change the defaults on mixboxes, then.  I'll also remove usecenter and comments, where appropriate.  And I'll add the inflate-regions bit.  I'll try to clean up as much of my mess as I can.  I am not a very orderly coder.  Fight the spaghetti.  cough 

The worst problem with x-symmetry was that it tended to confuse vertices with identical coordinates.  I couldn't figure out how to avoid that....

But the vectors_equal can't really be run as effectively before the edge bounds check?  We send it to the check to break the check early.  I've just wondered if it would make as much sense to use it to avoid even sending to the edge bounds function.  I have no idea what's going on with Numeric.equal.  I probably implemented it badly.  There may be a more appropriate Numeric function....



Spanki ( ) posted Thu, 18 January 2007 at 2:22 AM

"Let it go man - let it go!"  Hehe... yes - I agree 😄 - wherever possible, we will do whatever culling we can to help speed up the process (it might be there when the code comes back next time).  But, just to explain a little better why I'm reluctant to do that (before) now...

It's possible that the check_bounds() routine already has (or will have) information that I need for the weighting code, so (in order to make this run as fast as possible), it might be preferable to insert the weighting code right into there.  If it got by-passed, then we'd have to duplicate the code in multiple places, or re-generate information that we already had handy somewhere else.

So you see, I'm not nixing the idea (I'm all for optimizations wherever possible) - I'm just trying to keep the big picture optimizations in mind.

The vectors_equal() routine is a half-hearted effort to catch some special-cases where other math is falling apart, and while it's faster than doing a sqrt() to do the distance check, it's not as accurate and either method can pick up hits on vertices where the ray being cast didn't actually fall within that polygon... nothing wrong with that - the idea is that it's virtually in the same spot as the vertex being tested against, so we can just give that vertex a 1.0 weighting and the other 2 verts of that triangle 0.0 weights, but the weighting code needs to know we're doing it, erm... when that happens.  Does that make sense?
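The special case described above might look like this in outline (hypothetical names and tolerance; the real vectors_equal() in the script may well differ):

```python
# Hypothetical sketch of the coincident-vertex special case: when the
# target vertex sits (within tolerance) on top of one source-triangle
# corner, skip the full weighting math and give that corner all of the
# weight, 0.0 to the other two.

def vectors_equal(a, b, tol=1e-6):
    """Cheap per-component compare; avoids a sqrt() distance check."""
    return all(abs(a[i] - b[i]) <= tol for i in range(3))

def degenerate_weights(point, triangle, tol=1e-6):
    """Return (w0, w1, w2) if point coincides with a corner, else None."""
    for i, corner in enumerate(triangle):
        if vectors_equal(point, corner, tol):
            weights = [0.0, 0.0, 0.0]
            weights[i] = 1.0          # this corner gets the full weight
            return tuple(weights)
    return None                       # no coincidence; do the full math
```

Returning None (rather than guessing) is one way to let the caller know whether the shortcut fired, which addresses the "the weighting code needs to know when that happens" point.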


