Renderosity Forums / Poser Python Scripting





Subject: Moving morphs between different figures


Cage ( ) posted Wed, 20 December 2006 at 2:55 AM · edited Thu, 14 November 2024 at 4:59 AM

file_363081.doc

...with different base meshes.  I'm  trying to see if I can port to V3 any of the Vicky 1 morphs over which I've sweated.  The attached script is a WIP which actually succeeds at transferring morphs between figures - to a surprising degree, considering my skill level.  (*cough* newbie *cough* math dunce *cough*)

But I have a couple of questions, and I'm hoping someone can at least point me toward learning resources.

Right now I'm finding the closest vertices between two meshes.  This is darned slow, as I loop through all of the verts in one mesh AND all of the verts in the other - for each vert in the first.  Is there any method besides this which can be used to find corresponding (or near-corresponding) vertices?  38,000+ times 11,000+ is a lotta vertex comparisons when working with heads... and V1 is comparatively low-resolution, nowadays.

The script works fairly well going from a high-res mesh to a lower-res mesh, but the opposite effort results in jaggies in the final morph.  I need to know how to smooth the end result, or how to place vertices based on where they would intersect polygons, rather than moving them to match other verts.  Does anyone have any thoughts?

Finally, is there a way to find neighboring vertices in the same mesh, without looping through all of them once or more?  The only thing that seems to come close is the Polygon approach used by Ockham in a couple of scripts.  Umm.  Ideas?  Comments?

The attached script runs in P5 and presumably in P6; I can't guess about P7.  Load two figures into a scene, then start the script.  Select the source figure, then the target figure, in the first listbox.  Then select an actor, then a morph, and click 'run'.

Then wait... for about seven years... until the script has found the matched vertices for the corresponding actors.  It will save a data file with this information, so subsequent conversions are much faster.  (The data file will be placed in the poser.AppLocation folder, in case you want to get rid of it afterwards....)  The script will spawn a morph in the target figure's specified actor.  I've been testing with V1 and V3, as I said - the 'V3 to V2' version.  V3 to V1 offers problems with inner mouth meshes, but otherwise works well.  V1 to V3 gets the basic shape, but it's awfully ragged, with too many verts trying to piggyback on shared 'nearest' verts in the low-res source actor.  I'd like to improve this script, because I think it could be quite useful.  So please, if you have the know-how, let me know how.  :)

My apologies to anyone who looks at the code.  I'm a sloppy coder.  Yeeks.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


svdl ( ) posted Wed, 20 December 2006 at 4:10 AM

Finding matching vertices: yes, there is a way.

The base idea is overlaying both figures with a cube that is large enough to encompass both figures.
Divide that cube into subcubes. I think a 4x4x4 or 8x8x8 cube grid will work fine.

Now associate a vertex list with each of those cubes. Run through all the vertices of both models once, and assign each vertex index to its "matching" cube, to be determined by its x, y and z coordinates.

Now if you're going to match a vertex in V1 you'll only have to check the V3 vertices that are associated with the same cube as the V1 vertex. 

By the way, the "cubes" are not actual geometries, they're just space segments.

This technique is a crude form of using an octree.
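For anyone who wants to try it, svdl's cube-bucketing idea can be sketched in a few lines of plain Python. This is an illustrative sketch, not the posted script; the function names and the fall-back-to-brute-force behavior for empty cells are my own choices. In Poser you would pull the (x, y, z) tuples from an actor's Geometry() instead.

```python
def cell_of(v, lo, size, n):
    # Which grid cell contains vertex v?  Clamp so points on the
    # box's max face still land in the last cell.
    return tuple(min(int((v[a] - lo[a]) / size[a]), n - 1) for a in range(3))

def build_grid(verts, lo, size, n):
    # One pass over the mesh: map each cell to the vertex indices inside it.
    grid = {}
    for i, v in enumerate(verts):
        grid.setdefault(cell_of(v, lo, size, n), []).append(i)
    return grid

def dist2(a, b):
    return sum((a[i] - b[i]) ** 2 for i in range(3))

def nearest(v, verts, grid, lo, size, n):
    # Only compare against vertices sharing v's cell; fall back to
    # brute force if that cell happens to be empty.
    cand = grid.get(cell_of(v, lo, size, n)) or range(len(verts))
    return min(cand, key=lambda i: dist2(v, verts[i]))
```

With the grid built once per mesh, each lookup only touches the handful of vertices in one cell instead of all 38,000.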

The pen is mightier than the sword. But if you literally want to have some impact, use a typewriter

My gallery   My freestuff


svdl ( ) posted Wed, 20 December 2006 at 1:04 PM

Another idea: you could try moving the V3 vertices along their normals, to the point where the normal intersects the V1 mesh.
Not easy, I admit. The math to calculate the correct polygon to intersect is out there, but it's not as simple as I'd like it to be. And it'll probably be even slower than the matching vertices approach. But it will result in a far less ragged morph target.



svdl ( ) posted Wed, 20 December 2006 at 1:33 PM

Assuming you know about the cross product between two vectors, here's how it goes:

  1. For a polygon in the V1 mesh, calculate the plane that goes through this polygon. I'm assuming triangular polygons here; if the polys are quads, this method will be a little off, but not by much.
    If v1 is one vertex in the polygon (say, the first one), v2 the second and v3 the third, you can calculate the equation of this plane: ((v2-v1) crossproduct (v2-v3)) * X = ((v2-v1) crossproduct (v2-v3) )*v1
  2. The line through the V3 vertex along its normal can be represented by X=t*n + v, in which t is a scalar variable, n is the normal of the vertex, and v is the vertex itself
  3. Now you can determine the point where this line intersects the plane calculated in step 1, by substituting the X in 1) by the formula for X in 2)
  4. You've got the intersection point, let's call it X again. X is now a vertex, not a plane or a line. But does X lie within the borders of the polygon? Here's how you calculate this (again the crossproduct trick): calculate the crossproduct between the vectors (v2-v1) and (X-v2). The result is a vector perpendicular to the polygon. It can either point "outward" or "inward" - you can determine this by calculating the dot product of this resulting vector with the normal at v2. A positive result means pointing outward, a negative result means pointing inward.
    Remember "positive" or "negative"
    Now do the same between the vectors (v3-v2) and (X-v3) and the vectors (v1-v3) (X-v1).
    If all results are "positive" or all results are "negative", the intersection point lies within the borders of the polygon. You've found the right point.
    And if there's a mixture between "positive" and "negative", the intersection point lies outside the polygon edges, so you'll have to find another polygon that matches.
    Note: if the polygon is a quad, you'll have to calculate (v4-v3)x(X-v4) and (v1-v4)x(X-v1), not (v1-v3)x(X-v1).

This method should work for all closed meshes. Open meshes have a potential problem - the line through a normal of the source mesh might not intersect ANY polygon in the target mesh. In that case, it's probably best to add this vertex to a list of exceptions, and then try calculating its best delta from the deltas of the surrounding vertices.
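The four steps above can be sketched in plain Python for the triangle case. The helper names are mine, and the face normal from step 1 is used for all three sign tests (which gives the same inside/outside verdict as testing against a vertex normal on a reasonably smooth mesh):

```python
def sub(a, b):   return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_hits_triangle(p, d, v1, v2, v3, eps=1e-9):
    """Push the line X = p + t*d (vertex p along its normal d) into the
    plane of triangle (v1, v2, v3); return the intersection point if it
    lies inside the triangle, else None."""
    n = cross(sub(v2, v1), sub(v2, v3))        # step 1: plane normal
    denom = dot(n, d)
    if abs(denom) < eps:                       # line parallel to the plane
        return None
    t = (dot(n, v1) - dot(n, p)) / denom       # step 3: solve n.X = n.v1
    x = tuple(p[i] + t * d[i] for i in range(3))
    # step 4: the three edge cross products must all agree in sign
    signs = [dot(cross(sub(b, a), sub(x, b)), n)
             for a, b in ((v1, v2), (v2, v3), (v3, v1))]
    if all(s >= -eps for s in signs) or all(s <= eps for s in signs):
        return x
    return None
```

A mixed-sign result returns None, which is the "try another polygon" case from step 4.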

I hope I haven't confused you too much....



svdl ( ) posted Wed, 20 December 2006 at 1:35 PM · edited Wed, 20 December 2006 at 1:46 PM

Uhm, step 1 in the above post can be somewhat easier: calculate the average normal of all the vertices of the polygon (result is n), calculate the average position of all the vertices (result is p), and the equation will be nX=np

You can also calculate the dot products from step 4 with this averaged normal.

Works for both quads and tris.
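As a small sketch (the function name is mine), the averaged-normal version of step 1 is just two averages:

```python
def plane_from_poly(verts, normals):
    """Plane n.X = d from the averaged vertex normals (n) and the
    polygon centroid (p); works for tris and quads alike."""
    k = float(len(verts))
    n = tuple(sum(nrm[a] for nrm in normals) / k for a in range(3))
    p = tuple(sum(v[a] for v in verts) / k for a in range(3))
    d = n[0]*p[0] + n[1]*p[1] + n[2]*p[2]   # right-hand side, i.e. n.p
    return n, d
```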



svdl ( ) posted Wed, 20 December 2006 at 1:43 PM

The above method can be quite computationally intensive. You can reduce the amount of polys you have to check per vertex using the same "cube" trick.
Assuming you want to create the morph in V3, then V1 is the "target" shape.

Loop through all polys in the "target" shape, and for each cube that contains at least one of the poly's vertices, add the polygon index to the cube's polygon list.

Now you can check for each Vicki 3 vertex in which "cube" it lies, and you only have to check the Vicki 1 polys that are listed in this cube's polygon list.

This COULD go wrong, especially with vertices that lie close to a "cube" edge. By creating a separate array of "target" cubes - with the same centers as the "source" cubes, but sides twice as long, you can prevent that problem.
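The bookkeeping for the polygon lists is a one-pass loop. In this sketch (names mine), cell_of is whatever grid-cell function you already use, and a polygon is listed in every cell that contains at least one of its vertices:

```python
def polys_per_cell(polys, verts, cell_of):
    """Map each grid cell to the indices of the polygons touching it."""
    cells = {}
    for pi, poly in enumerate(polys):   # poly = tuple of vertex indices
        for vi in set(poly):            # one entry per distinct vertex
            cells.setdefault(cell_of(verts[vi]), set()).add(pi)
    return cells
```

A polygon spanning a cell boundary ends up in both cells' lists, which is exactly what the doubled-size "target" cubes are meant to guarantee for vertices near an edge.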



Cage ( ) posted Wed, 20 December 2006 at 6:57 PM · edited Wed, 20 December 2006 at 7:00 PM

This is a great help, svdl.  Thank you.  :)  And, no - you haven't confused me... too much.  I can make sense of it, so far.  I think.

I've been wondering whether trying to use x-mirroring for centered meshes would really reduce the process at all.  I'd still need to hunt through the geometry sets to find the symmetrical index, but I assume that would be faster and less intensive than running the comparison checks for everything.

I'm really frustrated with PoserPython.  Perhaps Blender Python spoiled me, but I kind of expected more (and more useful) internal functions for geometry handling.  If I didn't need to see quick test results as I go along developing, I think I would rather create this as a standalone for Python 2.4+.  I could probably gather all the information I need and organize it with a single parsing of each .obj file.  It seems to me that this would be faster.  Hmm.
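For what it's worth, the one-pass .obj idea is simple enough to sketch; this illustrative reader (names mine) collects only the 'v' position records, and a real standalone script would gather faces the same way:

```python
def read_obj_verts(lines):
    """Collect (x, y, z) tuples from the 'v' records of a Wavefront .obj."""
    verts = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v":   # geometric vertex line only
            verts.append(tuple(float(c) for c in parts[1:4]))
    return verts
```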

I'll post this as it develops.  I'm just silly enough to think this is some kind of big breakthrough in morph-handling.  :-P  Maybe I should sit down and catch my breath.



Angelouscuitry ( ) posted Thu, 21 December 2006 at 6:52 AM

Wow, you two guys are working on this together, and things look serious.

I'll kiss the feet of anyone who gets this to work!

:tt2:


Cage ( ) posted Thu, 21 December 2006 at 3:16 PM

Thanks, Angelouscuitry... but svdl is smarter than I am.  :)  I've only started it because I'm too dim-witted to realize how tricky it could be.  :)  If there really is so much promise in this, the overall idea would benefit from someone smarter than I am picking it up and running with it....

That said, I've made some progress toward integrating some of svdl's great suggestions.  I hope to have another WIP version to post tonight, if anyone wants to test it.



Angelouscuitry ( ) posted Thu, 21 December 2006 at 3:28 PM

Can't wait!

I know you said you were writing this script to transfer from V1/V2 to V3, and I suspect that once you cross any base, subsequent transfers would be easier.  But would it be possible to convince you to start with V3 to V4, or would the body part differences be a pain?  I would think you'd find a much greater reception for V3 to V4, as a lot of work that's about to be thrown away for V4 has gone into V3!


Cage ( ) posted Fri, 22 December 2006 at 2:34 AM · edited Fri, 22 December 2006 at 2:35 AM

Umm... delays, delays.  I'm fighting with the bounding box/octree implementation.  Something isn't quite right, at the moment, and the current state of things is not an improvement - so a WIP update must wait....  I think my vertex indices are getting crossed somewhere, but it's a bit puzzling.  Some tests have worked out fine, but some are not so good.  Hmm.

Hypothetically the script could work with V4.  I've been testing with V1 and V3, with some success, but I also tested V1 to Posette and the V1 catsuit to the Judy catsuit.  These tests generally worked, but I suspect that the closer the meshes are to being lined up nicely by default, the better the results will be.  And this is still verrrrry sloowwwww.  Processing the V3 head takes about 20 minutes.  So far the octree hasn't really helped that.  Processing V4 is an idea that frightens me.  But once you had a datafile for the transfer, subsequent conversions would be faster.  I don't have V4, so I can't even try to test V3 to V4.  The version I posted above should be able to be tested with those figures, however.

Hopefully I can figure out the bounding box problems tomorrow and post something.  Sorry for the delay.  My poor math-retarded brain is struggling along as fast as possible.  :)



JoePublic ( ) posted Fri, 22 December 2006 at 1:20 PM

file_363360.jpg

IMHO this is the single most useful python script I've ever used.

The original MIKI object has 115067 vertices, which makes her practically unusable for me.
That's why I started creating my own LoRez variations of her, but as soon as the vertex count of the head is changed, all expression morphs are lost.

Thanks to your script I was now able to successfully transfer MIKI's expression morphs over to my reduced version, and I now have a fully functional MIKI that weighs only 28991 vertices!

I also successfully transferred custom made body, face, and expression morphs from standard V3 over to V3RR.

Also started to create reduced resolution versions of all the other Unimesh characters using V3RR and M3RR as the base.

Initial conversion time was about 15min on my low-end machine starting with V3.

The only real "problem" I encountered so far is that with some expressions the teeth get distorted because some of their vertices are moved along with the lips.
But that is easily corrected in a modeller.

Maybe I can avoid this by making a "teeth gone" morph first for the figure that is converted.
Or maybe I keep the teeth a separate object and not part of the head mesh.

But other than that, the script worked perfectly for me in P5, even with my custom made LoRez MIKI.

SO THANK YOU VERY, VERY MUCH !


JoePublic ( ) posted Fri, 22 December 2006 at 1:22 PM

file_363361.jpg

And here is proof that the MIKI pictured above is indeed a custom converted LoRez mesh:


Cage ( ) posted Fri, 22 December 2006 at 2:37 PM

Glad to hear it.  :)  As I said, higher res to lower res works better at this point than the opposite, and the closer the two source meshes are to being matched, the better the results.

The problem with mouth geometries is a frustration.  I'm thinking about adding some controls using groups from material zones and/or a normals check which will only correlate verts if the normals agree with one another within a tolerable range.  Assuming I can figure out how.  :)

Can you post anything showing the result with the teeth/mouth?  For now, you might use Ockham's inclusion/disclusion morph scripts to remove the bad areas from your morphs.  Hopefully I'll get this to the point where it can handle such things... or maybe someone brighter than I will take over the script idea when I peak out (yes, I keep dropping hints...).  :)

 



JoePublic ( ) posted Fri, 22 December 2006 at 2:54 PM

file_363374.jpg

Here is what happened to the teeth after I transferred MIKI's "smile wide" morph.

I have to look for that inclusion/disclusion script. That sounds like a good solution.

Thanks again.


Angelouscuitry ( ) posted Fri, 22 December 2006 at 4:00 PM

Cage - I was just about to ask of going from high to low, or low to high, thanks.  Any idea where to find the Vertex Counts of different figures?  I'm going to go start a new thread, and direct everybody back here.  My first goal with this script is to get a Face Room MT onto V3, M2, or D3.  I may try to get some of DAZ's body morphs over to the Face Room Figures, but I think the former would be much easier for now.

Currently, I have the script running; and I think I'm going from D3 to M2, but the GUI is a little confusing.  I think there should be 6 columns, 3 for each figure, rather than just 3.

So, do I click in the first column once, then the second column once, then the third column once, then the first column a second time, then the second column a second time, and then the third column a second time?

Or click the first column twice, then the second column twice, then the third column twice?

I'm trying the former method now, on my faster PC.  I got a Python dialog straight off, but it doesn't say anything.  I'm taking this as not a complaint/error, and will let this run for another 10-15 minutes!

JoePublic - I think Ockham's MTbyGroup.zip is the script Cage is referring to.


JoePublic ( ) posted Fri, 22 December 2006 at 6:03 PM

Many thanks, Angelouscuitry.

The Ockham script worked great.

:thumbupboth:


Cage ( ) posted Fri, 22 December 2006 at 8:20 PM · edited Fri, 22 December 2006 at 8:22 PM

file_363396.doc

Here's an update.  The overall results are still the same, but mesh comparisons are running much faster.  I was running 20-25 minutes to compare V1 and V3.  Now it ranges between 4 and 7 minutes, depending on the options selected.

I have implemented x-mirroring for centered actors (no symmetry check is run, so this may not work on asymmetrical meshes) and svdl's 'crude octree' method, as well.  My crude octree is probably even more crude than he envisioned, but it seems to work.

The new options split mesh analysis from morph creation and also allow user specification of the number of 'octree' subdivisions and whether to use a merged bounding box for both actors, or separate boxes.  I'm getting the best results with a single box and subdivisions set to 2.

JoePublic, that looks a lot like what I've seen with inner mouth geometries, so far.  I hope one of my ideas can help screen against that outcome.  Crossing my fingers.

Angelouscuitry, you can get the vert counts by writing a short Python script to check the NumVerts() of an actor's Geometry().  A loop through all actors would probably be the way to get the full count.  I'll see if I can add something to display vert counts.

The GUI is confusing?  Sorry.  I have a terrible time with GUI layouts.  But here's the process:

-In the first listbox (far left) select the source figure, then the target figure.
-Second listbox will display actors in the source figure.  Currently this isn't checking to verify that the actor also exists in the target, but I should probably add that.
-Select the common actor to use in both source and target.
-Third listbox will display all morphs in the source figure's selected actor.  
-Select one of the morphs in the third listbox.  This is the morph we will copy to the target figure.
-Select bounding box options.
-Click 'Analyse' to run the mesh analyses OR click 'Run' to copy the selected morph.  If Run is selected when there is no datafile existing, you will be asked whether you want to go ahead and run the analysis.
-Processing now displays the length of time taken by the run once it's done. 

"I'm trying the former method now, on my faster PC.  I got a Python dialog straight off, but it doesn't say anything.  I'm taking this as not a complaint/error, and will let this run for another 10-15 minutes!"

I don't understand what you're asking.  :(  Can you clarify?

Let me know if this new version causes any problems.  Timing results have been oddly inconsistent, and I have seen one bad morph result, for unknown reasons.  Generally, however, it gives the same results as the original, much more quickly.



Cage ( ) posted Sat, 23 December 2006 at 1:10 AM · edited Sat, 23 December 2006 at 1:12 AM

If any of the Python wizards are listening, I'm encountering a puzzling error, with which I need at least a logic check, hopefully advice.

When I apply the conversion morph target, I am taking the original morph deltas plus the difference between the 'target' actor's coordinates and the 'source' actor's coordinates.  This creates a morph which then needs to have the same difference subtracted.  Initially, I created a 'difference' morph and set it to -1.  So I adjusted the code to try to integrate that difference morph.  (See below.)

        a = geom2.Vertex(int(i)) #source
        b = geom1.Vertex(matched) #target
        #adjust the morph delta by the difference between the two source meshes
        diff = (b.X()-a.X(),b.Y()-a.Y(),b.Z()-a.Z())       
        deltaX = (mt[0] + diff[0]) - (b.X()-a.X())
        deltaY = (mt[1] + diff[1]) - (b.Y()-a.Y())
        deltaZ = (mt[2] + diff[2]) - (b.Z()-a.Z())   

And that seems to work... some of the time.  I've favored running tests from V1 to V3.  This code works for that situation.  Going the other way, it causes problems.  The above is what I've had posted in these scripts.  But I had actually intended to subtract (a-b) coords from the (mt+diff) portion.

Umm.  So, my question: what the bloody heck is the math representing a morph dial set to -1?  I can't seem to get the results to turn out consistently right without going back to adding the compensatory dial and setting it to -1.  There has to be a mathematical expression for this....  I assume I'm just mixed up about the proper formula.

See, I'm really not so bright.  :)  Don't get too hyped up about this, folks.  Cage is struggling with simple things.  😊



Angelouscuitry ( ) posted Sat, 23 December 2006 at 5:51 AM · edited Sat, 23 December 2006 at 6:03 AM

:closedeyes:

Not so bright?  :blink:  This is awesome!  :tt1:

I've been using Poser since version 3, and nobody, but nobody (/me keeps fingers crossed...) has even scratched the surface of morph transfer between bases!  At first I couldn't imagine how it hadn't been done, but 7 yrs. later I've become a believer! :ohmy:

You should see things from my perspective, watch:

Err... When I first ran the script I moved the Body xTran of D3 -3.5, and the Body xTran of M2 +3.5, just for a good view.  Should I have left them at 0?

...would the script benefit at all if I were to scale the figures to a similar size?

See, I've completely subverted the attention of this thread...:unsure:...:sad:...:crying:


nruddock ( ) posted Sat, 23 December 2006 at 8:26 AM

Quote - If any of the Python wizards are listening, I'm encountering a puzzling error, with which I need at least a logic check, hopefully advice.

        a = geom2.Vertex(int(i)) #source
        b = geom1.Vertex(matched) #target
        #adjust the morph delta by the difference between the two source meshes
        diff = (b.X()-a.X(),b.Y()-a.Y(),b.Z()-a.Z())       
        deltaX = (mt[0] + diff[0]) - (b.X()-a.X())
        deltaY = (mt[1] + diff[1]) - (b.Y()-a.Y())
        deltaZ = (mt[2] + diff[2]) - (b.Z()-a.Z())   


Unless I'm mistaken, what you're ending up with is deltaX = mt[0] etc.
If you expand your equations for the deltas you get, e.g. deltaX = mt[0] + b.X() - a.X() - b.X() + a.X()

For a morph set at -1.0, the calculated delta values should be -mt[0] etc.
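The cancellation is easy to check with made-up numbers (the values below are illustrative, not from either figure):

```python
mt = (0.10, -0.05, 0.02)                  # original morph delta
a, b = (1.0, 2.0, 3.0), (1.3, 1.8, 3.4)   # source / target base verts

# The posted calculation: add diff, then subtract (b - a) again.
diff  = tuple(b[i] - a[i] for i in range(3))
delta = tuple((mt[i] + diff[i]) - (b[i] - a[i]) for i in range(3))
# delta comes out equal to mt: the correction terms cancel.

# And a dial set to -1.0 simply negates the stored deltas:
neg = tuple(-d for d in mt)
```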

HTH


Cage ( ) posted Sat, 23 December 2006 at 3:07 PM · edited Sat, 23 December 2006 at 3:19 PM

I panicked last night.  Sorry.  I think the equation I want, which will integrate the negative 'difference' dial setting into the calculation of the main delta, would be:

final_delta = (source_vert + original_delta) - target_vert.

The equation seems to require that the difference between the two meshes be taken into account when adding the deltas.  Previously, I haven't considered re-calculating the original delta before trying to modify it by the difference between the two meshes.  But I haven't tested this yet....

Thanks for the response, nruddock.  Sorry I wasn't quite coherent in the message to which you responded.



Cage ( ) posted Sat, 23 December 2006 at 3:23 PM

Angelouscuitry - this script only looks at the two selected actors' base geometries and at the selected morph's deltas.  The world vertex positions aren't taken into account.  So any scaling, translation, morphing, rotating, bending, or deforming of the selected actors will have no effect on the outcome.  This also means that when you're looking at vertex counts to determine higher-res vs. lower-res, you want to compare the counts for the selected actors, not for the figures as a whole, just FYI.  Hopefully the comparative vertex counts will be irrelevant before this is done.



Cage ( ) posted Sat, 23 December 2006 at 5:01 PM · edited Sat, 23 December 2006 at 5:15 PM

Well, the equation I posted above doesn't work. Bummer.

Let me try to clarify the problem, then.

original_shape = source_vertex + source_delta

final_shape = target_vertex + final_delta

I can't just transfer morph deltas directly, because this will be inadequate except when the source and target geometries are almost identical (as in the case of the reduced Miki, above). The goal, here, is not to transfer the delta literally, but to transfer the final shape represented by the source morph target, as a new delta appropriate for the target mesh.

So I’m struggling to find the proper formula for the final_delta, above, which requires that the difference between the source_vertex and the target_vertex be part of the equation. When I simply transfer the original delta from one figure to the next, the shape isn’t quite correct. The V1 head explodes when V3 delta positions are passed without modification.

I have found, however, that the following works well for all testing figures. This requires that a subtraction morph be created in the target actor.

final_delta = original_delta + (source_vertex – target_vertex)

difference = source_vertex – target_vertex

final_shape = final_delta - difference

This, then, produces a correct final_shape when morph final_delta is set to 1 and morph difference is set to –1.

I want to get the same effect with a single morph. I need to integrate this subtraction morph, difference, into the final_delta calculation. I initially tried:

final_delta = (original_delta + (source_vertex – target_vertex)) - (target_vertex – source_vertex)

final_delta = (original_delta + (source_vertex – target_vertex)) - (source_vertex – target_vertex)


But how can this be expressed in a single equation and created as one morph, rather than a morph minus a subtraction morph? This is where Cage gets confused. I've tried all the combinations of variables I can think to try, but each result either explodes, translates the target actor, or requires a subtraction morph to be correct. I'm beginning to suspect that Poser is doing something more than I know when it applies morph targets.

Does anyone have any ideas? Am I making sense? I hope....

And thank you in advance for any help or advice....

It seems to work nicely when moving morphs from V1 to V3. When moving from V3 to V1, it makes the head explode, something I failed to realize before posting the script. So something obviously isn't quite right yet. (That's what I had believed I had posted in the initial script, but apparently I changed it to something which is the same as just passing the original delta, as nruddock points out.) What I need to figure out is: what formula can be used to calculate a final_delta which represents the final_shape achieved by using the subtraction morph. It should be possible to do this all at once. But what is really happening when Poser applies the subtraction morph? The final shape is target_vertex + final_delta – difference.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Cage ( ) posted Sat, 23 December 2006 at 5:23 PM

Okay, I OFFICIALLY HATE all the changes to Rosity's format.  Editing time limits and automatic re-formatting of my posts.  HATE!!!  Grr!

The above post was re-formatted somehow, perhaps because I unknowingly included html flags or something.  It should be this, assuming the same thing doesn't happen again.

"""
 

Well, the equation I posted above doesn't work. Bummer.

Let me try to clarify the problem, then.

original_shape = source_vertex + source_delta

final_shape = target_vertex + final_delta

I can't just transfer morph deltas directly, because this will be inadequate except when the source and target geometries are almost identical (as in the case of the reduced Miki, above). The goal here is not to transfer the delta literally, but to transfer the final shape represented by the source morph target, as a new delta appropriate for the target mesh.

So I’m struggling to find the proper formula for the final_delta, above, which requires that the difference between the source_vertex and the target_vertex be part of the equation. When I simply transfer the original delta from one figure to the next, the shape isn’t quite correct. The V1 head explodes when V3 delta positions are passed without modification.

I have found, however, that the following works well for all testing figures. This requires that a subtraction morph be created in the target actor.

final_delta = original_delta + (source_vertex – target_vertex)

difference = source_vertex – target_vertex

final_shape = final_delta - difference

This, then, produces a correct final_shape when morph final_delta is set to 1 and morph difference is set to –1.
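For what it's worth, the cancellation can be checked numerically (a plain-Python sketch with made-up one-axis numbers, not code from the attached script):

```python
# Illustrative one-axis example: the names mirror the post, the numbers are invented.
source_vertex = 2.0    # rest position of the vertex in the source mesh
target_vertex = 1.5    # rest position of the matched vertex in the target mesh
original_delta = 0.3   # morph delta in the source figure

final_delta = original_delta + (source_vertex - target_vertex)   # dial set to 1
difference = source_vertex - target_vertex                       # dial set to -1

final_shape = target_vertex + final_delta - difference
# The two (source_vertex - target_vertex) terms cancel, leaving
# target_vertex + original_delta.
assert abs(final_shape - (target_vertex + original_delta)) < 1e-12
print(final_shape)  # 1.8
```

The algebra is why the dial pair lands back on target_vertex + original_delta, and why folding the subtraction into final_delta in one step just reproduces the original delta.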

I want to get the same effect with a single morph. I need to integrate this subtraction morph, difference, into the final_delta calculation. I initially tried:

final_delta = (original_delta + (source_vertex – target_vertex)) - (target_vertex – source_vertex)

(That’s what I had believed I had posted in the initial script, but apparently I changed it to

final_delta = (original_delta + (source_vertex – target_vertex)) - (source_vertex – target_vertex)

, which is the same as just passing the original delta, as nruddock points out.) The first formula seems to work nicely when moving morphs from V1 to V3. When moving from V3 to V1, it makes the head explode. So something obviously isn’t quite right yet.

What I need to figure out is: what formula can be used to calculate a final_delta which represents the final_shape achieved by using the subtraction morph. It should be possible to do this all at once. But what is really happening when Poser applies the subtraction morph? The final shape is target_vertex + final_delta – difference.

But how can this be expressed in a single equation and created as one morph rather than a morph minus a subtraction morph? This is where Cage gets confused. I’ve tried all the combinations of variables I can think of, but each result either explodes, translates the target actor, or requires a subtraction morph to be correct. I’m beginning to suspect that Poser is doing something more than I know when it applies morph targets.

Does anyone have any ideas? Am I making sense? I hope....

 And thank you in advance for any help or advice....

"""

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Cage ( ) posted Sat, 23 December 2006 at 11:28 PM · edited Sat, 23 December 2006 at 11:30 PM

file_363470.doc

No one?  Ack.

Anyway.  This update compensates for the potential problem that I babble about above.  An option is offered to create a morph using straight delta conversion, as in the previous posts, or to create a morph which uses a subtraction morph dial.  I'm not pleased with that approach.  I think I may be able to work something out by creating a prop from the actor geometry, adding the morph and subtraction morph, then transferring the result of mixing these two morphs back to the actor and deleting the prop.  All of that seems like it should be unnecessary, but I am, sadly, too math-dumb to puzzle out what's happening.

If you want an illustration of what I'm babbling about up there, fire up this script and load V1 and V3 and try to transfer a morph to V1 using the straight deltas option.  The morph will explode.  Then create the same morph in V1 using the subtraction dial option.  This will create a morph which combines the original deltas with the difference between the source and target meshes, along with a morph with just the differences.  The difference dial, set to -1, gives a corrected morph in V1.  Which should be possible without going through this whole dial process.  Why the morph explodes is beyond me....

 

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


kawecki ( ) posted Sun, 24 December 2006 at 12:07 AM

Refining more what svdl started:
To calculate the normal of a vertex you must do:

  1. Calculate the normal of each face. You do it with the cross product of two of the face's edge vectors (I don't have the exact formula on this computer at the moment).
  2. Add to each vertex the normal of each face that shares it.
  3. Normalize the resulting vector.

    Too complicated? There's an easier way to do it.
    Import the original figure's .obj file into Poser and export it again as an .obj file.
    In the .obj file you have the normals!!!!!
    The normals are stored in the .obj file as a table
    vn x y z
    vn x y z
    ...........
    To find which vertex a normal belongs to, you must look at the face data in the .obj file.
    The face data is stored as
    f v/vt/vn v/vt/vn v/vt/vn .....
    f v/vt/vn v/vt/vn v/vt/vn ......
    ...........
    where v = index into the vertex table, vt = index into the uv table and vn = index into the normal table
    (All indexes start at 1 and not 0 !!!!)
    Parsing the face data you can create a table of normals indexed by the vertex number.

If you want to transfer the morph to only one body part, you must only export the respective body part and not the whole figure.

You must build the normal table for the figure that will receive the morph target; it doesn't matter if it has fewer or more vertices than the morph source figure.
Once you have the normal table you must find, for each vertex, the point where the normal intersects some face of the morphed figure (whole figure or only the body part).
The coordinates of the intersection point will be the morph value for this vertex.
How to find the point where the normal intersects some face is another problem and another headache, but at least finding the normals is very easy......
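The parsing step can be sketched in a few lines of Python (a hypothetical minimal reader, assuming well-formed `f v/vt/vn` or `v//vn` face tokens with normals present; `vertex_normals_from_obj` is an illustrative name, not a Poser call):

```python
# Sketch of the suggestion above: build a vertex-index -> normal table from .obj
# lines.  Indices in .obj start at 1, so we subtract 1 throughout.
def vertex_normals_from_obj(lines):
    normals = []         # the "vn x y z" table, in file order
    vert_to_normal = {}  # 0-based vertex index -> (x, y, z) normal
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "vn":
            normals.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            for token in parts[1:]:
                fields = token.split("/")
                v = int(fields[0]) - 1
                if len(fields) >= 3 and fields[2]:   # normal index is the 3rd field
                    vert_to_normal[v] = normals[int(fields[2]) - 1]
    return vert_to_normal

# Tiny example: one triangle whose three corners share a single up-pointing normal.
obj = [
    "v 0 0 0", "v 1 0 0", "v 0 0 1",
    "vn 0 1 0",
    "f 1//1 2//1 3//1",   # "v//vn" form: no uv index
]
print(vertex_normals_from_obj(obj))   # all three verts map to (0.0, 1.0, 0.0)
```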

Stupidity also evolves!


kawecki ( ) posted Sun, 24 December 2006 at 12:16 AM

Stupidity also evolves!


kawecki ( ) posted Sun, 24 December 2006 at 12:22 AM

Damn editor, always eats my posts!!!!

Before you export the obj, you must resize the morphed figure so it more or less overlaps the destination figure. If you don't do it, you run the risk that once the new figure is morphed, the morph will appear in some unwanted position (a nipple on the abdomen or head)

Stupidity also evolves!


Cage ( ) posted Sun, 24 December 2006 at 12:32 AM

Thank you, kawecki.  I've been having trouble with the reply editor, myself.

I've thought about different approaches which might involve reading a .obj file directly.  I'm a bit puzzled, though.  Poser gives access to vertex normal data, doesn't it?  Would the method you're suggesting simply be faster than looping to query the normals for an actor using Poser's internal methods?

I think it might be easiest to use CreatePropFromGeom using the actor.Geometry(), then export that and re-import it.  Then I don't have to mess around with concerns about zeroing the figure and all its morphs.  I've been thinking about using that approach for a couple of things.

I have a feeling a lot of this is going to be trickier than I expect...  :scared:

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


kawecki ( ) posted Sun, 24 December 2006 at 12:57 AM

I have no idea what Poser's Python allows you to do; maybe it has a lot of useful information and tools.
I have never used Python. What I do is stand-alone applications in C or Assembler.

Stupidity also evolves!


ByteDreams ( ) posted Tue, 26 December 2006 at 9:25 AM · edited Tue, 26 December 2006 at 9:26 AM

Well, whatever you guys are doing, sure looks intense!  I just browsed through this thread to its end, skipping over most of the code parts. 
What a great idea, and I'll be watching to see if you get your script running in reverse, as you seem to be working on.  I don't know how to write python code.  I just used this feature a couple of weeks ago for the first time after purchasing one of philc's bundles!  So I ventured into this forum to see if there was something else someone was cooking up that I could give a try.

Good to see svdl and nruddock in here.  I'm already fond of them both.


Cage ( ) posted Tue, 26 December 2006 at 11:54 AM

There's a long way to go with this.  :(  I'm spinning some horrible spaghetti code.  Anyone who's interested in "fighting the spaghetti" should, perhaps, try to take this basic idea and make of it something that works.  Fight the spaghetti!

Umm.

I should have a minor update later.  The basic functions present in the initial post weren't working as well as I believed, but that seems to be fixed now, thanks to some helpful input from Ockham.  So the script is currently pretty good at copying from a higher-res actor to a lower-res one.  The trick is going to be working in the opposite direction.  It's looking to me like the whole premise may need to be re-thought to approach that effectively....
 

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Cage ( ) posted Wed, 27 December 2006 at 3:40 PM

file_363787.doc

This version integrates Ockham's method for normalizing the deltas as they pass between actors.  It also makes minor changes to the default settings.

-X-mirroring is off by default.  The process can speed things up a bit, but it can mirror the wrong vertex when multiple verts have the same coordinates in the mesh.  It also fails with centered actors which aren't really symmetrical.

-The octree subdivisions now default to zero.  Zero is 'no cuts', using a single bounding box without subdivisions.  The best setting seems to be 1, which is one cut per edge, for 2x2 on each side or 8 total subdivision regions.  Settings above 1 are slower than using 1, in my tests.  The current 0 setting is faster than the original script.
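The bucketing idea behind the subdivision option can be illustrated like this (a generic uniform-grid sketch, not the script's actual octree; the `cell` size is a made-up parameter, and a real version would widen the search when the nearby cells come up empty):

```python
# Hash each source vertex into a coarse grid cell, then search only the 27
# nearby cells for the closest vertex instead of looping over every vert.
# Illustrative code, not taken from the attached script.
from collections import defaultdict

def build_grid(verts, cell):
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(verts):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)
    return grid

def nearest(grid, verts, p, cell):
    cx, cy, cz = int(p[0] // cell), int(p[1] // cell), int(p[2] // cell)
    best, best_d2 = None, None
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for i in grid.get((cx + dx, cy + dy, cz + dz), ()):
                    d2 = sum((a - b) ** 2 for a, b in zip(verts[i], p))
                    if best_d2 is None or d2 < best_d2:
                        best, best_d2 = i, d2
    return best   # may be None if no vertex falls in the nearby cells

verts = [(0.0, 0.0, 0.0), (0.5, 0.1, 0.0), (3.0, 3.0, 3.0)]
grid = build_grid(verts, cell=1.0)
print(nearest(grid, verts, (0.4, 0.0, 0.0), cell=1.0))  # 1
```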

The overall process hasn't changed, however.  I'm having some trouble deciding how to proceed.  So far, suggestions have focused on using vertex to polygon surface comparisons, but it seems that this wouldn't really be compatible with the current vertex-to-vertex methods.  Switching to vert-to-surface would seem to me to require that all comparison calculations be run anew for each morph, since the morph surfaces will be different each time.  The vert-to-vert process can make the main comparisons only once, then save the correlations to a datafile, which speeds up final morph creation.  I'd prefer to find a way to correct the misplaced vertices without having to use surface comparisons.

The problem is that going 'downhill' from a higher-res mesh to a lower, all verts in the high-res mesh are forced to find correlated verts in the lower-res mesh.  That leads to 'piggybacking' of vertices and improper placement.  If I can find the 'true' matched verts and then adjust the neighboring 'false' matches somehow, it would seem to me that vertices can be used for the whole process.  But can this be done?  Can I use the delta of the 'true' match and somehow adjust the 'neighbor' matched verts by their offset relative to the 'true' match?  It seems like something akin to Ockham's normalization process should help accomplish this, but my efforts so far fall flat, as it were.

So, at risk of being an ongoing annoyance, does anyone have any thoughts on all of this?  Can vertex-to-vertex be rescued, or should the process be rejected and the entire procedure of the script be re-thought?

Any ideas, or suggested resources I could study?  Am I thinking about things the wrong way?

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Spanki ( ) posted Wed, 27 December 2006 at 4:36 PM

Just some random thoughts (may have already been covered)...

I'm thinking that one approach would be that a vertex (in the target mesh) is influenced by a weighting, per corresponding surface vertex in the source mesh.  In other words, you cast a vertex normal (average of all polygon normals that use this vertex) out until you hit a source mesh polygon and then compute a weighting value for each vertex of that polygon, relative to the distance from the ray intersection with that plane/polygon/surface.  You could then store data to link up each target mesh vertex with a list of weights for source mesh vertices.

When you're transferring the morph, you'd multiply any deltas by the weighting value before adding them to the weighted deltas from the other source vertices for this target vertex.

I'm not positive that this is a 'correct' method, but I hope that I explained it clearly :).  I'm thinking it might work for low->hi mesh as well as hi->low mesh, but I might be wrong (I haven't given it enough thought yet).
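For the "cast a vertex normal out until you hit a source mesh polygon" step, a standard Möller-Trumbore ray/triangle test could serve (a generic sketch, not code from the WIP script; quads and n-gons would need to be split into triangles first):

```python
# Standard Moller-Trumbore ray/triangle intersection.  Returns the distance t
# along the (unit-length) ray direction, or None on a miss.
def ray_triangle(orig, direc, a, b, c, eps=1e-9):
    def sub(p, q): return tuple(pi - qi for pi, qi in zip(p, q))
    def dot(p, q): return sum(pi * qi for pi, qi in zip(p, q))
    def cross(p, q):
        return (p[1]*q[2] - p[2]*q[1], p[2]*q[0] - p[0]*q[2], p[0]*q[1] - p[1]*q[0])

    e1, e2 = sub(b, a), sub(c, a)
    pvec = cross(direc, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                # ray parallel to the triangle's plane
        return None
    tvec = sub(orig, a)
    u = dot(tvec, pvec) / det
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direc, qvec) / det
    if v < 0.0 or u + v > 1.0:        # outside the triangle's bounds
        return None
    return dot(e2, qvec) / det        # distance along the ray

# A ray cast straight up from below a triangle lying in the y=1 plane:
t = ray_triangle((0.2, 0.0, 0.2), (0.0, 1.0, 0.0),
                 (0.0, 1.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 1.0))
print(t)  # 1.0
```

Because the u/v checks limit the hit to the triangle's bounds, this avoids the "each vertex could collide with many planes" problem mentioned later in the thread.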

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Wed, 27 December 2006 at 4:58 PM

file_363799.jpg

The above was based on observing the above sample (shaded mesh is hi-res sub-D version of white source mesh, blue lines show source mesh with a vertex 'morphed')...

Note that only vertex A in the target mesh would get the full weight of such a morph, whereas B and C would get (in this case) half the weighting, D would get less than half and E,F,G would effectively get none (they are pretty much entirely influenced by the vertices directly above them).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Cage ( ) posted Wed, 27 December 2006 at 7:53 PM · edited Wed, 27 December 2006 at 7:54 PM

I like that idea a lot!  Wow!  It bridges the gap between the surface method and the vertex method.

The concept has me speculating about how one might build upon such weight mapping to convert UV layouts between different objects, too.  Hmm.  I'm not sure how well it would work for converting base shapes of objects between one another....

A lot of potential.  It makes me wish I were smarter.  :)  

So it looks like I need to start on a complete script re-build, implementing a 'raycasting' technique to compare verts to surfaces.

Thank you!  Wowsers!

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Spanki ( ) posted Wed, 27 December 2006 at 8:18 PM

Just don't get too excited yet - I'm not sure it's a valid theory :).  I like to build test-cases to help visualize things... but the above is a fairly specific/controlled example and may or may not be universally applicable.  It was just a thought-trail, but I think it's probably worth a try.

I'm also not convinced about it working on the low->high case yet, but I hadn't paid close attention to what you've already covered, so I'm not sure it would be any worse than other methods either.

As for the implementation, you or someone else better with the math will have to figure that out :).

Have fun!

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Cage ( ) posted Wed, 27 December 2006 at 11:09 PM

Sadly I'm not to be lumped into the class of those who might be better with the math.  :(  Hoo boy.  Unfortunately, low to high is the situation that needs something like this.  High to low is what is currently working, at least with meshes which are fairly similar in shape and position....

These internets, they need a math search engine.  The Googles and whatnots are not helping me find math learning resources.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Spanki ( ) posted Thu, 28 December 2006 at 5:23 AM

Sorry, I probably mixed up terms with you... what I meant was that I'm not sure if this method covers transferring morphs from a hires mesh to a lower res mesh (which sounds like something you already have working).

I was mostly considering the current problem of transferring morphs from a lower res mesh to a higher res mesh with the above theory.  I think you may also already have a lot of the code around to implement the above (like determining the normal->polygon intersection), so the only tricky part would be determining the weights from that info.

The general idea of this theory is that the vertices of the hi-res mesh are linked to and influenced by the (generally larger) polygons in the low-res mesh, instead of specific vertices.  But then converting polygon movements to vertex movements is done by a weighting of each vertex within that polygon.  So if the polygon is a triangle, then the hi-res mesh vertex is influenced by 3 weighted vertices, with the weight values of the 3 vertices adding up to 1.0 (if it was a quad, then it would be 4 weighted vertices, with the weights again adding up to 1.0).

You might also need to (first) determine 'which' polygon gets linked to the vertex... if the normal->polygon intersection calculation is actually substituting a 'plane' for the polygon, then each vertex could collide with many polygons.  If that calculation limits itself to just the bounds of the polygon, then you should be ok.

There's a special-case where the normal vector of a vertex intersects with one of the vertices of a polygon (which is actually the case for vertex A, E and G in my example above), but that will likely wash out in the code so that that vertex's weight would end up being 1.0 and the other weights for that polygon would all be 0.0.

The other special-case is when the normal vector intersects with an edge of a polygon (B, C and F in my example), but the weighting calculations should end up giving non-zero values to just 2 vertices in that case - you just have to be aware that some vertices will get 'hit' on more than one polygon when you're building the weight tables (in the example above, F would get a hit on the polygon with the "Target mesh (shaded)" text in it, as well as the one above it where I put all the labels, but in both cases, the vertices above the E and G vertices would be the only ones that had some non-zero weight (in this case, 0.5 each)).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 5:38 AM

...having said that :), in my mind, I 'know' for example that movements of the vertex marked O/M in my example above should not affect vertex F, but what I'm still fuzzy on would be the algorithm used to make that so by giving it a 0.0 weight, while the vertices above E and G ended up with a 0.5 weight.

Of course I'm just using this as a specific example (and a special-case example, which has some 'known' properties), so what I'm really talking about is the weight determining process in general (that, as I said, I'm fuzzy on).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Cage ( ) posted Thu, 28 December 2006 at 2:34 PM

Hi-res to low-res is working pretty well, at least when the geometries line up tolerably well.  (JoePublic has found some limitations to this technique when meshes don't align well.)  But the model currently in use involves checking each vertex in the target and looking for the nearest vert in the source, so it's built on completely different foundations than what you and svdl and Ockham suggest, using vert to polygon comparisons.

In thinking about this, the method of weighting the target vertices does seem like the tricky part.  I assume it would somehow be a function of distance from each vert divided by length or width of the poly, or distance along one of the planar axes of the poly.  But Poser, unlike Blender, only gives us bare-bones data about polygons, and no data at all about edges.  The polygon distances would have to be constructed from the set for the poly.  I can imagine that requiring the use of trigonometry.... ick... and then I start to get panicky.  It seems like the idea has a lot of promise, but I despair for my ability to implement it.  Cage != math guy.  Hoo boy.  Right now I'm still struggling to implement the ideas svdl outlined for 'raycasting', back on the previous page....

I'm thinking about ways this could work, though.  I can tell it's a strain because of the smoke coming out of my ears....

I'm beginning to wonder if all of this should use kaweki's suggestion for parsing the object data out of a .obj file, instead of using the Poser internal methods.  Would it be faster?  One pass through the .obj file could be used to organize all the data needed for polygons and verts, and the line information is in the .obj for edges....  Hmm.

Thank you for all your thought about this.  I hope I can live up to all of the ideas everyone is contributing.  :scared:

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Spanki ( ) posted Thu, 28 December 2006 at 2:59 PM

Quote - ...In thinking about this, the method of weighting the target vertices does seem like the tricky part.  I assume it would somehow be a function of distance from each vert divided by length or width of the poly, or distance along one of the planar axes of the poly.

Yeah, that sounds right.. I'm hoping one of the math guys will come and fill in the blanks :).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 4:34 PM

file_363924.jpg

Ok, this might be some progress.. or at least something to work with...  for purposes of discussion, there is a polygon (from the lo-res mesh) with vertices labeled A, B and C.  The points labeled 1, 2, 3, 4 are intersection points, where the hi-res mesh vertices intersected this lo-res mesh polygon...

Point 4  is 25% the distance from vertex C to vert B
Point 1 is the same distance from point C as point 4 is
Point 2 is equidistant from A, B and C
Point 3 is half-way between A and B

Ok, given the above, trying to come up with the weighting algorithm... we want to make sure that:

Point 4 is not affected by A (A's weight should be 0.0 for point 4)
Point 4 should be affected more by C than by B (.75 weight for C, .25 weight for B).
Point 1 should be affected by point C just as much as point 4 is, but is also affected by both A and B
Point 2 should be equally affected by movements in A, B and/or C (weights for A, B and C should all be 0.33333 repeating, or 33 and a third percent)
Point 3 should not be affected by C, but is equally affected by both A and B

...it looks like the algorithm that solves all of the above conditions is that the weight for a particular vertex (A, B or C) is determined by 1.0 minus the distance of the intersection point from the vertex, relative to the distance from that vertex to the edge of the remaining vertices of the polygon, along the line determined by the vertex and the intersection.

So if:

distance to edge along the line between vertex and intersection point is EdgeDist
distance from vertex to intersection point is ISectDist
weight for this vertex is calculated as:

weight = 1.0 - (ISectDist divided by EdgeDist)

So for example, let's look at point 4 and calculate the weights for each vertex, with the above information...

Vertex A:

Let's assume the distance from A to the intersection (ISectDist) is 8.5 ...note that the distance from A to the edge between C and B along the line between A and 4 (EdgeDist) is exactly the same... 8.5, so...

WeightA = 1.0 - (8.5 divided by 8.5) or 1.0 - (1.0)  = 0.0  ...so the weight for vertex A for intersection point 4 is 0.0  which is exactly what we want.

Vertex B:

Let's assume an EdgeDist value of 8.0 (the distance from B to the line between A and C, along the line generated between B and 4). As stated above, point 4 is 25% the distance from C towards B, so conversely, that would make ISectDist = 6.0 (75% of 8.0) so...

WeightB = 1.0 - (6.0 divided by 8.0) or 1.0 - (0.75) = 0.25  ...so the weight for vertex B for intersection point 4 is 0.25 which is as expected.

Vertex C:

This is basically the opposite of the vertex B calculations...

WeightC = 1.0 - (2.0 divided by 8.0) or 1.0 - (0.25) = 0.75 ...also expected.

...you can run the numbers on the other situations, but I think they'll hold up fine.
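For triangles, this 1.0 - (ISectDist / EdgeDist) rule appears to match the classic barycentric coordinates of the intersection point, which can also be computed from sub-triangle areas. A quick 2-D sketch (illustrative code, not from the script) reproduces the expectations for points 2 and 3:

```python
# Barycentric weights of a point inside triangle A-B-C via sub-triangle areas.
# 2-D for brevity; the same idea works in 3-D with cross-product magnitudes.
def tri_area(p, q, r):
    return abs((q[0]-p[0]) * (r[1]-p[1]) - (r[0]-p[0]) * (q[1]-p[1])) / 2.0

def weights(p, a, b, c):
    total = tri_area(a, b, c)
    # Each vertex's weight is the area of the sub-triangle opposite it.
    return (tri_area(p, b, c) / total,   # weight for A
            tri_area(p, a, c) / total,   # weight for B
            tri_area(p, a, b) / total)   # weight for C

a, b, c = (0.0, 0.0), (6.0, 0.0), (0.0, 6.0)
print(weights((2.0, 2.0), a, b, c))  # centroid -> equal thirds, like point 2
print(weights((3.0, 0.0), a, b, c))  # midpoint of A-B -> (0.5, 0.5, 0.0), like point 3
```

The three weights always sum to 1.0, which matches the requirement stated earlier that a triangle's vertex weights add up to 1.0.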

Now, this works fine for triangles, but what happens with quads or n-gons? ... I hadn't thought about that yet :) so I'm hoping it's a natural extension of the above.

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 4:46 PM

..just a note - I drew the diagram by hand, eyeballing it, so it's not exact.  If you want to run numbers on the other points, either use my 'stated' assumptions about the relative placements of everything, or draw your own figure to do measurements on :).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 4:57 PM

...Also, the interesting proof of the above is found in comparing how the weights work out on intersection point 1, compared to point 4.

My assumption above is that the distance between C->4 is the same as C->1...  so it goes to reason that the weight for C for point 1 would be the same as the weight for C for point 4 (0.75).

At the same time, the weight for B for point 1 can't be 0.25, because we have to give some weight to A for this point.  But notice that the distance from point 1 to the line between C and A, along the line generated by B->1 is much smaller than the distance from 4 to the line between C and A... so this makes WeightB smaller and the rest goes into WeightA for point 1.

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 5:03 PM · edited Thu, 28 December 2006 at 5:06 PM

BTW, after just a few minutes thought on the quad/n-gon situation.... it seems to me that the line generated between some vertex of any n-gon and the intersection point can only intersect with one other edge of the n-gon (at least for convex polygons), so you just have to find the right edge.

Edit: but you probably have to worry about non-planar polygons once you get above triangles... otherwise, that line may not intersect with any edge :).

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Spanki ( ) posted Thu, 28 December 2006 at 5:36 PM

...after a little more thought, this general method might not work on quads/n-gons.

I have to put more thought into it, but my first thought would be to break the n-gon into triangles and compute a new (potential) intersection point for each.  This would solve the non-planar issue, but I have to think about how all the weighting works out.
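The triangle-splitting step could be as simple as a fan (a sketch assuming convex polygons; `fan_triangulate` is an illustrative name):

```python
# Fan-triangulate a convex n-gon's vertex index list into triangles, so the
# per-triangle intersection and weighting tests can be reused (convex only;
# concave polygons would need a smarter split).
def fan_triangulate(indices):
    return [(indices[0], indices[i], indices[i + 1])
            for i in range(1, len(indices) - 1)]

print(fan_triangulate([0, 1, 2, 3]))   # quad -> [(0, 1, 2), (0, 2, 3)]
```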

Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.


Cage ( ) posted Thu, 28 December 2006 at 9:31 PM

Holy Moley!  You're thinking about this more seriously than I am!  Thank you!

I'll have to read the above and reply.  Right now I'm racing to keep from being booted off by AOL....

I've written a class to organize all of the object data, so it will now be fairly easy to have a better range of reference lists up front.  I think I'm just about ready to start mucking about with the raycasting....

If the above will only work for tris, then the polygons can be virtually split into tris, I suppose... as you say.  Hmm.

Hmm hmm....
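For what it's worth, the kind of up-front caching class Cage describes might look something like this (names are mine; the (start, count) pairs mirror what PoserPython's Polygon.Start()/NumVertices() calls report, queried once instead of inside the comparison loops):

```python
class MeshData:
    """Gather mesh data once, up front, so the slow geometry queries
    don't happen inside the vertex-comparison loops."""
    def __init__(self, verts, sets, polys):
        # verts: [(x, y, z), ...] vertex positions
        # sets:  flat list of vertex indices (like Geometry.Sets())
        # polys: [(start, count), ...] per-polygon slices into sets
        self.verts = verts
        self.poly_indices = [sets[s:s + c] for (s, c) in polys]

    def poly_verts(self, i):
        """Positions of polygon i's corners, in winding order."""
        return [self.verts[j] for j in self.poly_indices[i]]

# A lone quad as a sanity check:
mesh = MeshData([(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
                [0, 1, 2, 3], [(0, 4)])
print(mesh.poly_verts(0))   # the four corner positions in order
```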

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


Cage ( ) posted Fri, 29 December 2006 at 12:03 AM

You said the above illustration is kind of rough, so maybe I'm just confused by that....  But aren't two points at equal distance from a third point on a curve?  A circle is the set of all points equidistant from a central point.  Uh.  Right?  Does this affect the premise that starts from points 1 and 4 being equidistant, or am I just fixating on the illustration?

Well, darn it... I have code to gather all the information, but it slows everything down again!  So back to the drawing board for the umpteenth time....



Spanki ( ) posted Fri, 29 December 2006 at 2:03 AM

Yes, my intention when I wrote that was that points 1 and 4 were both positioned on a curve/circle, an equal distance out from vertex C.  But it turns out that's not really what I was after, so I probably misstated its position (sorry about that)...

The case I really wanted to test was what I was thinking of as a 'hinge effect'.  So it would be the situation where the shortest possible distance from point 1 to the line A-B was the same distance as a (parallel) line through point 4 to the line A-B (so a line through 1-4 would be parallel to line A-B).  In other words, if you imagined the line A-B as a hinge, and tipped the polygon 'up' off a table by lifting vertex C, then points 1 and 4 would move by the exact same amount (point 3 would 'rotate' but not move... point 2 would move less than 1 and 4).

I really should have graphed this out to check the math (I'll probably do that for sanity's sake), but I think it still works.
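The hinge claim is easy to check numerically (my own throwaway check, not code from this thread): rotating a point by angle theta about the axis through A and B moves it by 2 * d * sin(theta/2), where d is its perpendicular distance to that axis, so two points at equal distance move by exactly the same amount.

```python
import math

def dist_to_axis(p, a, b):
    """Perpendicular distance from 3D point p to the infinite line A-B,
    via |AB x AP| / |AB|."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    cx = ab[1]*ap[2] - ab[2]*ap[1]
    cy = ab[2]*ap[0] - ab[0]*ap[2]
    cz = ab[0]*ap[1] - ab[1]*ap[0]
    return math.sqrt(cx*cx + cy*cy + cz*cz) / math.sqrt(sum(v*v for v in ab))

def rotate_about_x(p, theta):
    """Rotate about the x-axis, i.e. a 'hinge' along A=(0,0,0), B=(1,0,0)."""
    c, s = math.cos(theta), math.sin(theta)
    return (p[0], c*p[1] - s*p[2], s*p[1] + c*p[2])

A, B = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
p1 = (0.2, 0.5, 0.0)   # like points 1 and 4: different spots along the
p4 = (0.9, 0.5, 0.0)   # hinge, same perpendicular distance (0.5) from it
theta = 0.3
for p in (p1, p4):
    q = rotate_about_x(p, theta)
    moved = math.dist(p, q)
    # both columns should agree (to floating-point precision):
    print(moved, 2 * dist_to_axis(p, A, B) * math.sin(theta / 2))
```

(math.dist needs Python 2.4-era Poser would lack; this is just a desk check in a modern Python 3.8+ interpreter.)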


