Cage opened this issue on Dec 20, 2006 · 1232 posts
Spanki posted Sun, 07 January 2007 at 11:59 AM
Cage,
I've been thinking about Joe's and your relative levels of success with dissimilar shapes, trying to understand why you've gotten even that far :). I'm a little confused by some of the results shown, though. The V4->V3 head-shape image Joe posted seems to show a 'shape matching' approach (with the piggy-back verts ending up exactly where other verts are), but you mentioned earlier that you were just applying deltas. Yet it looks like you still adjust 'all' verts and not just the ones involved in some morph. Or maybe whatever version of the script was being used was moving vertices to the positions of other vertices (instead of applying a morph delta).
Anyway, back to the issue.
It seems to me that if you apply deltas in a way that moves only some of the vertices in Mesh B (the ones that correlate to the vertices in Mesh A that are involved in the morph in question), then you can get reasonable results, because organic morphs generally move neighboring vertices in a similar direction and by a similar distance.
In other words, if the morph only moved a few vertices in a radical fashion, then unless you have a very strong correlation between Mesh A and Mesh B origin vertices, you'd get a poor result. But if you have an approximate match-up between the lips of Mesh A and Mesh B, and the morph was to "raise the center of the upper lip", then you'd likely get decent results, because the selection of vertices involved would be similar. Of course, depending on the match, you could pick up some vertices from the lower lip that you didn't want, and we've already mentioned the teeth being a problem.
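To make the idea concrete, here's a minimal sketch of that delta-only transfer. It's not the actual script (I haven't seen the code); it just assumes a nearest-vertex correspondence and copies a delta across only when the Mesh A vertex actually moves in the morph, so untouched regions of Mesh B stay put. The function names are mine, not from the script:

```python
import math

def nearest_index(p, verts):
    """Index of the vertex in `verts` closest to point `p`."""
    return min(range(len(verts)), key=lambda i: math.dist(p, verts[i]))

def transfer_deltas(mesh_a, deltas_a, mesh_b):
    """For each Mesh B vertex, find its nearest Mesh A vertex and copy
    that vertex's morph delta -- but only when the delta is non-zero,
    so vertices outside the morphed region are left alone."""
    deltas_b = {}
    for bi, bv in enumerate(mesh_b):
        ai = nearest_index(bv, mesh_a)
        d = deltas_a.get(ai)
        if d is not None and any(abs(c) > 1e-9 for c in d):
            deltas_b[bi] = d
    return deltas_b
```

This is also why the results degrade gracefully with organic morphs: even when `nearest_index` picks the "wrong" neighbor, that neighbor's delta is usually close to the right one.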
Anyway, I think this helps explain why you and Joe are getting results as good as you are... the vertices may not be matched up exactly correctly, but all the vertices in the surrounding area are likely to move in the same or a similar fashion with organic morphs.
So I think I've changed my opinion on the relative degree of acceptable results you're likely to achieve, but this still relies on the human positioning the meshes before running the script, as Joe was doing. This also means that you should line up the ears when transferring ear morphs and the noses when transferring nose morphs. If the ear morph is a "make lobe longer" morph, and the lobes are wildly dissimilar in shape and position, then you still have a problem (you should line up the lobes instead of the overall ear).
What I would recommend then is to let the user select which morphs to transfer. They'd position the meshes so that the noses lined up (for example) and select one or more 'nose' morphs, and hit the 'go' button. The script would then determine which vertices were involved for each morph and only move (or create deltas for) matching vertices (however that's determined) in the other mesh. The user could then reposition the mesh so that the lips lined up and run the script again to create those, etc.
I think if you did it this way, along with some method of excluding groups of vertices/polys (I think excluding might work better than including), then you could get good results in most cases (worse results where the shapes were drastically different, and/or when the morph is very specifically defined to a small local region).
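And the exclusion idea is cheap to bolt on, assuming the script has (or can build) named vertex groups. A hypothetical sketch, with names of my own invention:

```python
def exclude_groups(deltas, vert_groups, excluded):
    """Drop deltas for any vertex belonging to an excluded group
    (e.g. 'teeth'), leaving the rest of the morph intact."""
    banned = set()
    for g in excluded:
        banned.update(vert_groups.get(g, ()))
    return {i: d for i, d in deltas.items() if i not in banned}
```

Excluding is the safer default because the user only has to name the few regions that go wrong (teeth, eyes) rather than enumerate everything that should move.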
[continued in next post]...
Cinema4D Plugins (Home of Riptide, Riptide Pro, Undertow, Morph Mill, KyamaSlide and I/Ogre plugins) Poser products Freelance Modelling, Poser Rigging, UV-mapping work for hire.