odf opened this issue on Dec 16, 2021 · 223 posts
odf posted Thu, 16 December 2021 at 10:40 PM
Hat tip to @primorge for inspiring the thread title.
Subdivision morphs were among my favorite things after my return to Poser earlier this year. Sadly, it seems that at the moment the only two ways to make them are with the Poser Morph Tool or via ZBrush/GoZ. I'd like to be able to make subdivision morphs with any modeling software that can read and write OBJ files, so I'm working on some Python code to make that happen.
Here's the workflow I'm aiming at (a.k.a. my evil plan):
1) In Poser, load a figure, convert to unimesh, then export the whole figure as an OBJ at base resolution.
2) In the modeling software, import that file, subdivide and sculpt, then export as a new OBJ at the highest resolution.
3) Call the command line Python script I shall write with both OBJs to produce a PMD and accompanying PZ2 for a Poser morph injection pose.
4) Back in Poser, bring the figure to the desired subdivision level, apply the injection pose and rejoice.
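For step 3, the command-line interface might look something like this (a sketch only; the script and every option name are hypothetical, since the tool doesn't exist yet):

```python
# Hypothetical CLI skeleton for step 3 of the plan; the script name and
# all option names are invented for illustration.
import argparse

def parse_args(argv=None):
    p = argparse.ArgumentParser(
        description="Turn a base-resolution OBJ and a subdivided, sculpted "
                    "OBJ into a PMD plus PZ2 morph injection pose.")
    p.add_argument("base", help="figure exported from Poser at base resolution")
    p.add_argument("morphed", help="sculpted figure exported at the highest subd level")
    p.add_argument("--name", default="Morph", help="name of the morph dial to create")
    p.add_argument("--output", default="morph", help="basename for the .pmd/.pz2 pair")
    return p.parse_args(argv)
```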
I'm not entirely sure the PMD path covers all possible applications (e.g. can I inject post-transform morphs such as JCMs via PMDs?), but it should be a good start. I have no plans of writing an integrated morph loader in Poser Python, but I'd happily provide all the necessary geometry crunching code (even in Python 2.7 if desired).
So much for now. I'll be posting progress updates as I go. Let me know what you think.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 17 December 2021 at 5:55 AM
You can inject post transform, translation, rotation, scaling (that's all I've tested) via pmd. I generally attach such transforms to an empty body master and it works every time. Doing so via create new master has given me injections with dead dials. To create an empty body master have the figure completely 0 (or use a zeroed dev version, even better) and Figure:Spawn Full Body Morph. Right click the resulting dial and delete morph. Untick only the dial in the Body, which will delete all the resulting empties, except for the Body dial, which were created during the spawn. The Spawn command creates a dial in all actors, including parented props (you'll have to individually delete the empties in Goal/Center of Mass, or simply exclude them from the inject).
odf posted Fri, 17 December 2021 at 6:20 AM
Nice! Thanks for confirming that it can be done with PMDs.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 17 December 2021 at 11:37 PM
Apparently in Poser 12, I can just use 'Figure:Create Full Body Morph' to create that dial in the body and nothing else. Since I don't have Poser 11, I can't tell if that's a new feature.
-- I'm not mad at you, just Westphalian.
odf posted Sat, 18 December 2021 at 12:53 AM
Ha, I just spent an hour hunting down the weirdest "bug". Turns out that when I apply a morph and write out the result as an OBJ, I should either recompute the normals or delete them. Otherwise, when I test the output in Blender, it looks as if the morph was applied to the shoulders and legs but not the abs. *headdeskemoji*
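For what it's worth, stripping the stale normals from an OBJ takes only a few lines of Python (a sketch; it drops `vn` records and the normal index from face entries, assuming lines without trailing newlines):

```python
def strip_normals(lines):
    """Remove vertex normals from the lines of a Wavefront OBJ file."""
    out = []
    for line in lines:
        if line.startswith("vn "):
            continue  # drop the stale normal vectors themselves
        if line.startswith("f "):
            # rewrite "f v/vt/vn ..." as "f v/vt ..." (and "v//vn" as "v")
            parts = line.split()
            verts = ["/".join(p.split("/")[:2]).rstrip("/") for p in parts[1:]]
            line = "f " + " ".join(verts)
        out.append(line)
    return out
```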
Don't mind me. On with your regularly scheduled program.
-- I'm not mad at you, just Westphalian.
primorge posted Sat, 18 December 2021 at 3:33 PM
odf posted at 11:37 PM Fri, 17 December 2021 - #4432043:
"Apparently in Poser 12, I can just use 'Figure:Create Full Body Morph' to create that dial in the body and nothing else."

Oh really? Lol. Never tried that :D... it's in Poser 11. Actually, I can't say that I haven't tried that; there may be some reason that I'm not recalling why I don't use Create. I'll go back over it sometime. Right now it's an "if it ain't broke" scenario with only very minor inconvenience. Thanks for the heads up.
primorge posted Sat, 18 December 2021 at 3:39 PM
odf posted at 12:53 AM Sat, 18 December 2021:
"Ha, I just spent an hour hunting down the weirdest "bug"..."

Most programs calculate normals on load. I don't think I've ever encountered an instance where deleting the vn lines from an OBJ had any effect upon importing to various environments. It's my understanding there are certain instances where Poser recalculates normals on the fly during certain processes, or so I've heard. Can't remember the particulars, but I think it involved solving some scene bug, and it was a workaround, perhaps discussed by Bagginsbill.
Don't mind me. On with your regularly scheduled program.
odf posted Sat, 18 December 2021 at 4:28 PM
primorge posted at 3:39 PM Sat, 18 December 2021:
"Most programs calculate normals on load. I don't think I've ever encountered an instance where deleting the vn lines from OBJ had any effect upon importing to various environments..."
All I know is that when RDNA adapted Antonia, they removed the normal data from the OBJ because supposedly Poser does not use it and always recomputes the normals on load. I've never checked that myself, but I guess it can't hurt to leave the normals out and save about 30% of disk space for storing the file.
But Blender definitely uses what's in the file unless one recomputes the normals explicitly. That can be useful when one does elaborate mesh surgery and wants the normals to stay consistent, so it makes sense for a program like Blender to use them. For Poser, it's probably much less useful.
If you're really keen on seeing the effect, I can send you a demo file. :smile:
-- I'm not mad at you, just Westphalian.
odf posted Sat, 18 December 2021 at 4:54 PM
primorge posted at 3:33 PM Sat, 18 December 2021 - #4432073:
"Oh really? Lol. Never tried that :D... it's in Poser 11. Actually, I can't say that I haven't tried that, there may be some reason that I'm not recalling why I don't use Create."

Okay, I had a closer look. The "create..." command makes a valueParm, whereas "spawn..." makes a targetGeom.
Incidentally, is there a way to get rid of the channel in GoalCenterOfMass? I can do all the body parts and CenterOfMass, but I don't see a way in Poser to access GoalCenterOfMass. Not really important, just curious. Could also just be a kink of Antonia's.
-- I'm not mad at you, just Westphalian.
primorge posted Sat, 18 December 2021 at 5:23 PM
Hardcore in the midst of voxels in 3dcoat right now. I think my computer may melt. Dude, the retopo stuff in there truly is incredible. Anyway, I may be delayed in my 'V4/M4 Products Now' and 'Nova_Project' commitments this weekend as a consequence. I shouldn't put time of delivery statements in my posts.
Anyway.
Select GoalCenterOfMass in the Hierarchy Editor, you have to have Show Props ticked, that will bring it up in parameters. The dials will then be visible for the prop. Right Click the Parm. Delete Morph. I guess that's what you're asking...
primorge posted Sat, 18 December 2021 at 5:32 PM
I just checked, Antonia definitely has a GoalCenterOfMass accessible via Hierarchy Editor.
odf posted Sat, 18 December 2021 at 6:26 PM
I shouldn't put time of delivery statements in my posts.
Good idea! :thumbsup:
primorge posted at 5:23 PM Sat, 18 December 2021:
"Select GoalCenterOfMass in the Hierarchy Editor, you have to have Show Props ticked, that will bring it up in parameters. The dials will then be visible for the prop. Right Click the Parm. Delete Morph."

Ah dangit, I forgot about that Hierarchy Editor again. Thanks a lot, that did it.
-- I'm not mad at you, just Westphalian.
odf posted Sun, 19 December 2021 at 4:08 AM
Looks like I'm getting close. I forgot that the deltas used a different coordinate system from the OBJs. The results are not quite correct yet in Poser, but there's a resemblance. Any day now... :grin:
-- I'm not mad at you, just Westphalian.
primorge posted Sun, 19 December 2021 at 1:23 PM
odf posted at 4:54 PM Sat, 18 December 2021:
"Okay, I had a closer look. The "create..." command makes a valueParm, whereas "spawn..." makes a targetGeom."

Missed this part. So a valueParm is just a master dial, essentially the same thing as the Create New Master command, whereas targetGeom is a morph dial. So, PMD carries morph data and morph-data-associated dials... you might want to check whether using Create Full Body Morph (valueParm) will carry over to injections in various instances. As I've said previously, Create New Master sometimes results in dead-dial injections, in my experience, especially when linking up non-morph-type controllers such as pose dials. Though Nerd has said "just use Create New Master", I've had enough failed PMD injects for these types of things to just habitually use an empty spawned master (targetGeom), which works through inject, for me, every time. Probably in the past I got burnt by Create Full Body Morph for controller dials via inject.
odf posted Sun, 19 December 2021 at 5:43 PM
Oof!
So it turns out the coordinates for subdivision deltas are actually given with respect to the tangent space. The first coordinate is along the normal, the second seems to be the surface tangent that lies in the xz-plane, then the last coordinate is orthogonal to both. I'm assuming that makes it easier for the renderer to munch them up or something.
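A frame following that description could be built like this (a guess at the convention from observation, not the verified Poser implementation; `numpy` is used for the vector math):

```python
import numpy as np

def tangent_frame(normal):
    """Build (n, u, v) as described: n along the surface normal, u the
    tangent lying in the xz-plane, v orthogonal to both.
    The convention is guessed from observation, not verified."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    u = np.array([n[2], 0.0, -n[0]])   # orthogonal to n, zero y-component
    if np.linalg.norm(u) < 1e-8:       # degenerate: normal points along y
        u = np.array([1.0, 0.0, 0.0])
    else:
        u = u / np.linalg.norm(u)
    v = np.cross(n, u)
    return n, u, v
```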
That's not the case for regular morph deltas, is it? They are just figure-space x, y, z, aren't they? It seems that I would have remembered if they were all fancy-like.
Also, is there maybe a parameter for telling Poser how to interpret deltas that I've forgotten about?
As I said, any day now...
-- I'm not mad at you, just Westphalian.
odf posted Sun, 19 December 2021 at 6:54 PM
The tangent space approach looks good, but I need to figure out the exact rules for the second and third coordinates. I'm going to have to do science to it (see illustration, courtesy of Dresden Codak), unless I can find a volunteer oracle, i.e. someone who knows the actual implementation and will tell me what it does.
-- I'm not mad at you, just Westphalian.
primorge posted Sun, 19 December 2021 at 9:00 PM
Forgive my ignorance but would looking at Pixar OpenSubDiv stuff be of any use? Or have you been doing that all along? Or is it not relevant to the problem?
https://graphics.pixar.com/opensubdiv/docs/subdivision_surfaces.html
primorge posted Sun, 19 December 2021 at 9:11 PM
You might also want to peruse this... some of the info might be relevant now or down the road
https://www.renderosity.com/forums/threads/2939685
odf posted Sun, 19 December 2021 at 9:55 PM
Thanks primorge, but I don't have the patience to read through walls of text on the off chance that I might find some pertinent information in there.
I did some experiments with a sphere and the same constant delta for each vertex in order to see what happens. It seems Poser uses different computations for the second and third coordinate depending on which octant the normal is in. Very strange! The best I've been able to do so far is to only use the component along the surface normal and ignore the other two. That produces very nice results, but it does not cover all the subd morphs one might want to make.
I'd say I give up at this point, but I know my brain won't let go of this, so I'll probably be coming back and trying more things.
-- I'm not mad at you, just Westphalian.
odf posted Sun, 19 December 2021 at 11:17 PM
To illustrate, here is the Poser high-res ball at subdivision level 2 with the exact same delta (0.0, 0.01, 0.0) applied to every vertex (at morph strength 2.88). Honestly, I don't think Pixar would be selling any movies if they wrote graphics code like that. Pretty sure it's original Poser.
-- I'm not mad at you, just Westphalian.
odf posted Sun, 19 December 2021 at 11:45 PM
It looks suspiciously as if they did something sensible and coherent for the vertices of the original mesh and then had some interpolation accident happen on the rest. There's a clear pattern there, so one should be able to reverse engineer what they did, but I'd need some strong motivation to attempt that kind of thing, like - I don't know - Alison Brie promising to teach me wrestling moves.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 20 December 2021 at 3:13 PM
Thanks primorge, but I don't have the patience to read through walls of text on the off chance that I might find some pertinent information in there.
Sorry, that sounded like I didn't appreciate you digging up information for me. I do. Both links look like they could come in handy at some point, just - as far as I can tell at a quick glance - not right now. So once more with feeling: thanks!
It's true though that I tend to be quite impatient, have a short attention span and, to make it worse, a bad memory. So unfortunately linking me to a long document may not always be as helpful as it sounds.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 20 December 2021 at 5:33 PM
Poser, I want you to know that I'm not mad at you, just disappointed. :smile:
Seriously, that was a bit of a shock yesterday, but I think I'm over it. Let's see what we've got so far...
We can make subdivision deltas from a morphed OBJ file (plus the original subdivided OBJ) and stuff them into a PMD file that Poser can then load as a functional full-body morph target. The proof is in the six-pack; here's a sloppy Blender sculpt of Antonia at subd level 2:
The limitation is that for the moment, we can only reliably produce deltas that move vertices along their normals (because lateral deltas make the mesh go crunch). That might not be as bad as it sounds if we can bake the lateral movements down into base level deltas. That has the potential of losing a little bit of accuracy and detail, but my feeling is that it will be good enough for a lot of actual morphs. There is the problem that going back between Poser and [insert external program used for morphing] might eventually create artifacts due to the accumulated information loss* as has happened (or possibly is still happening) with GoZ.
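Baking down might start by splitting each hi-res delta into the part along the vertex normal (which subd deltas handle reliably) and the lateral remainder (to be approximated by base-level deltas). A minimal sketch, with invented names:

```python
import numpy as np

def split_delta(delta, normal):
    """Split a world-space delta into its signed length along the unit
    normal and the lateral remainder tangent to the surface."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = np.asarray(delta, dtype=float)
    along = float(np.dot(d, n))   # component subd deltas can represent safely
    lateral = d - along * n       # component to bake into base-level deltas
    return along, lateral
```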
At any rate, I will look into baking down the deltas next, which means more experiments with balls and controlled deltas. In the meantime, I'll also look around for alternative ways to get morph data into Poser, e.g. a special parameter that might tell Poser to interpret the deltas differently, or something useful in the Poser Python API (although I do like to work with files and will probably only switch to working inside Poser as a last resort).
Anyway, if we don't see each other before then,
Merry Xmas from Subdivision Santa
* This reminds me of how when I was a wee Uni student, I used to misuse the library Xerox machines to make copies of copies of magazine photos because I was fascinated by how they changed from step to step. Unfortunately in this case I don't think we can expect results that are anywhere near as attractive.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 20 December 2021 at 9:16 PM
Interesting: the effect of the lateral deltas gets much more consistent when I explicitly specify empty sets of deltas for all the preceding levels. There's still an undulating pattern in there that reminds me of one of the illustrations in the Pixar doc. So I reckon I'll deep-dive into that next.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 20 December 2021 at 10:01 PM
Pffft, now I can't reconstruct the problem from yesterday anymore. I'm always getting that undulating pattern. Fun times! :smile:
-- I'm not mad at you, just Westphalian.
odf posted Tue, 21 December 2021 at 12:48 AM
Poser go home, you're drunk.
-- I'm not mad at you, just Westphalian.
adp001 posted Tue, 21 December 2021 at 6:02 AM
The limitation is that for the moment, we can only reliably produce deltas that move vertices along their normals
We can do the same with a displacement map. No morph needed.
adp001 posted Tue, 21 December 2021 at 6:21 AM
A wild guess: Is the data perhaps a normal map?
primorge posted Tue, 21 December 2021 at 2:39 PM
odf posted at 5:33 PM Mon, 20 December 2021 - #4432163:
"The limitation is that for the moment, we can only reliably produce deltas that move vertices along their normals"

adp001 posted at 6:02 AM Tue, 21 December 2021:
"We can do the same with a displacement map. No morph needed."

I personally love using displacement maps. Unfortunately Superfly doesn't, which alienates a certain constituency.
odf posted Tue, 21 December 2021 at 3:23 PM
adp001 posted at 6:21 AM Tue, 21 December 2021 - #4432191:
"A wild guess: Is the data perhaps a normal map?"

If I understand correctly, a normal map would only encode a normal direction, not a displacement magnitude? I'm not sure what exactly displacement maps do. Can they include lateral displacements, or only go along the normal? At any rate, from what I've seen now, subdivision morphs definitely have a similar flavor.

It also seems that I've been thinking about this too much like a mathematician and not enough like a programmer. The erratic-looking behavior I see when I apply the same delta to all vertices of a mesh makes a lot more sense when I assume that the coordinate directions are computed from the per-face data with as little effort as possible. For example, the morphed cube shown above would have to be perfectly symmetric if the actual vertex normals had been used for displacement. If face normals were used instead, that might explain the irregularity, because then the direction used depends on which face is chosen for each vertex.
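To make the conjecture concrete: a vertex normal averages the normals of all faces sharing the vertex and is therefore symmetric on a symmetric mesh, while displacing along a single adjacent face's normal depends on which face happens to be picked. A sketch:

```python
import numpy as np

def face_normal(p0, p1, p2):
    """Unit normal of one triangle."""
    n = np.cross(np.subtract(p1, p0), np.subtract(p2, p0))
    return n / np.linalg.norm(n)

def vertex_normal(face_normals):
    """Average of the unit normals of all faces sharing the vertex --
    symmetric, unlike picking one arbitrary adjacent face normal."""
    n = np.sum(face_normals, axis=0)
    return n / np.linalg.norm(n)
```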
Anyway, I'll be doing more experiments, and I'll also start a separate thread in this forum to ask about how subdivision deltas work exactly. Maybe we're lucky and someone in the know stumbles upon it.
-- I'm not mad at you, just Westphalian.
odf posted Tue, 21 December 2021 at 3:38 PM
PS: Come to think of it, maybe it would indeed be worthwhile if I tried to find some details on how the local coordinate directions are computed for displacement maps and such. There's a decent chance that the Poser developers borrowed some code or at least some ideas from those.
-- I'm not mad at you, just Westphalian.
adp001 posted Tue, 21 December 2021 at 5:13 PM
Morphs are downward compatible, which means the standard morph data must still be available. To make it a hi-res morph, I would take the pre-morphed polygon mesh and subdivide it further using the normal data. That should work for any kind of subdivision.
Just an idea.
Asking someone would certainly not be a bad idea. Why don't you try to contact the programmers directly? Maybe through Nerd; he seems to be quite cooperative, if it leads anywhere.
adp001 posted Tue, 21 December 2021 at 5:25 PM
If I understand correctly, a normal map would only encode a normal direction, not a displacement magnitude?
That's the idea for normal displacement without changing the actual geometry, yes. But in the end, RGB colors are interpreted as XYZ coordinates.
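For reference, the usual normal-map convention packs a unit vector into RGB, so decoding is just a rescale (this is the standard convention; whether Poser uses exactly this mapping is an assumption):

```python
def decode_normal(rgb):
    """Map [0, 1] RGB components to [-1, 1] XYZ components,
    per the common tangent-space normal map convention."""
    return tuple(2.0 * c - 1.0 for c in rgb)
```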
I have no access to Poser at the moment. Otherwise I would try doing a standard morph, then add subdividing and push just one of the new vertices and look at the output data.
adp001 posted Tue, 21 December 2021 at 5:34 PM
odf posted Tue, 21 December 2021 at 6:27 PM
Morphs are downward compatible. This means that the standard morph data must be available. To make it a hires morph, I would take the pre morphed polygon and subdivide it further with "normal data". That should be good for any kind of subdivision.
Yes, that's my back-up plan. That's more or less what I meant by "baking down to the base resolution," which probably wasn't clear enough. I can of course do arbitrary displacements at base resolution, so that'll give me the basic shape. At higher resolutions I can do displacements along the normals more or less reliably. I say more or less because of course the normal is also just approximated in some way.
The problem with the tangent (and then also the bitangent) is that it can be any direction along the surface, so the programmer has a full 360 degrees to pick from, and which one is used is purely a matter of convention.
-- I'm not mad at you, just Westphalian.
odf posted Tue, 21 December 2021 at 7:08 PM
This is what my current test rig looks like:
I use two identical cube meshes out of Blender at subdivision level 1. One is unmorphed, the second one has three morphs M1, M2 and M3 with constant deltas in the first, second and third coordinate direction, respectively, for all the level 1 vertices. I'll call those directions n (for normal), u, and v, respectively. The base level vertices stay where they are.
It seems that for each level 1 vertex, either u or v follows one of the edge directions adjacent to that vertex. So far, though, I haven't figured out when u and when v is chosen, and which of the edges is used. The next step could be to export the morphed results and run a script on them that relates the displacement to the OBJ data in order to see if there's some discernible pattern (e.g. always move along the first/last edge the vertex appears in).
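That analysis script could score each adjacent edge by how well its direction lines up with the observed displacement (a sketch; positions as numpy-compatible triples, all names invented):

```python
import numpy as np

def best_matching_edge(vertex, displacement, neighbors):
    """Return (neighbor, alignment) for the adjacent edge whose direction
    best matches the observed displacement (alignment 1.0 = parallel)."""
    d = np.asarray(displacement, dtype=float)
    d = d / np.linalg.norm(d)
    best, best_dot = None, -1.0
    for nb in neighbors:
        e = np.subtract(nb, vertex).astype(float)
        e = e / np.linalg.norm(e)
        dot = abs(float(np.dot(d, e)))   # ignore edge orientation
        if dot > best_dot:
            best, best_dot = nb, dot
    return best, best_dot
```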
-- I'm not mad at you, just Westphalian.
odf posted Wed, 22 December 2021 at 1:59 AM
Okay, so the idea with the normal map was actually quite helpful. The u- and v-directions follow the texture u and v, which is quite nicely visualized by putting a tile pattern on lo-res Antonia, making a subd morph with a constant delta of either [0.0 0.01 0.0] (u-direction) or [0.0 0.0 0.01] (v-direction) and playing with the dials.
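That matches the standard tangent-space construction used for normal mapping, where per-triangle u- and v-directions are solved from positions and texture coordinates (a sketch of the textbook formula; whether Poser computes it exactly this way is an assumption):

```python
import numpy as np

def uv_tangents(p0, p1, p2, uv0, uv1, uv2):
    """Object-space directions of increasing texture u and v for one triangle."""
    e1 = np.subtract(p1, p0).astype(float)
    e2 = np.subtract(p2, p0).astype(float)
    du1, dv1 = np.subtract(uv1, uv0).astype(float)
    du2, dv2 = np.subtract(uv2, uv0).astype(float)
    r = 1.0 / (du1 * dv2 - du2 * dv1)    # inverse of the UV-edge determinant
    u_dir = r * (dv2 * e1 - dv1 * e2)    # direction of increasing u
    v_dir = r * (-du2 * e1 + du1 * e2)   # direction of increasing v
    return u_dir, v_dir
```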
-- I'm not mad at you, just Westphalian.
adp001 posted Wed, 22 December 2021 at 8:42 AM
adp001 posted at 6:02 AM Tue, 21 December 2021 - #4432190:
"We can do the same with a displacement map. No morph needed."
Of course Cycles (or Superfly in Poser) can handle displacement maps. If you want to use displacements like Hires morphs, you have to set the same Sub-D level. That (maybe together with normal maps) is all that is needed.
Advantage: With displacement/normal maps one saves resources considerably. No need to carry around huge amounts of morph data in the figures or props, whether you need them or not. Displacement/normal maps are different: they only become active at render time, and only if the maps are activated.
That was the reason why I didn't try very hard to find out how hi-res morphs work in Poser. I don't need them.
It would be nice if someone in the know came forward with the information. Then it could be incorporated where it makes sense for one case or another (there aren't many such cases anyway, and they're getting fewer).
primorge posted Wed, 22 December 2021 at 4:23 PM
adp001 posted at 8:42 AM Wed, 22 December 2021:
"Of course Cycles (or Superfly in Poser) can handle displacement maps. If you want to use displacements like hi-res morphs, you have to set the same Sub-D level..."

I didn't feel the need to point out that I've used displacement maps via subdivision and render-time micropolygonal displacement. Render-derived displacement is very much more practical in terms of resources. In Poser at least, I also find "live" subdivision-based displacement a bit fiddly. You and ODF are hilariously condescending in tone many times, but I don't hold it against y'all... you sometimes come up with, albeit very left-brain, interesting ideas ;)
odf posted Wed, 22 December 2021 at 4:57 PM
I didn't understand most of those words, so... :grin:
The reason I'm so interested in hi-res morphs is that they would allow me to dial in local detail selectively, say some extra wrinkles to help with an expression or some veins on just a hand. Also they'd supposedly be able to fully participate in the Poser ERC circus. I agree that for full-body sculpts it probably makes more sense to bake the hi-res detail into displacement and/or normal maps (although I lack the personal experience, so that's just a guess).
Also I love me a challenge - until it starts looking too hard, then I hate me a challenge - until it starts looking doable again, then I love me a challenge. And so on, ad infinitum.
Hilarious condescension is actually a mandatory subject at German schools. One simply won't get a high school diploma without passing it.
Seriously though, there's one for the New Year's resolutions.
-- I'm not mad at you, just Westphalian.
odf posted Wed, 22 December 2021 at 5:02 PM
primorge posted at 4:23 PM Wed, 22 December 2021:
"I personally love using displacement maps. Unfortunately Superfly doesn't, which alienates a certain constituency."

All that said, I'd be interested to know what you mean when you say Superfly doesn't love displacement maps. Also, I wonder if there's a difference between Poser 11 and 12 in that respect.
-- I'm not mad at you, just Westphalian.
primorge posted Wed, 22 December 2021 at 5:36 PM
I've experimented with trying to get various displacement maps I've made in the past in ZBrush and Photoshop to work in conjunction with subdivision via the Physical Surface root. It works, but it's not practical compared to Firefly render-time displacement. The comparative amount of subdivision I'd need to use, for the resolution maps I have at least, is just too much. Just doing it via Firefly is way more practical, with better results IMO. So Poser displacement is another thing that I feel Firefly does better at the moment. I use 11; I doubt much has changed in 12 in that regard.
primorge posted Wed, 22 December 2021 at 5:53 PM
Sorry. I'm not sold on Superfly... at all. For me it's a downgrade, just as Poser 12 would be a downgrade. Sigh. I don't give a hoot if it's "more real" it's all fake regardless, and there's established fake methods that produce results that are real enough for my art purposes besides. And much more quickly.
I'm a bit worried about Poser actually. But I'm drifting off topic, and I don't have any beef with Poser; it's one of my favorite pastimes.
primorge posted Wed, 22 December 2021 at 6:12 PM
I should probably clarify a couple of things that might be confusing in my wording, ODF. I'll imagine you're not too familiar with the differences between Firefly and Superfly displacement. In Firefly, the displacement happens via the polygons being extremely subdivided via triangulation during render, the detail of the displacement dependent on the map resolution.
Superfly, on the other hand, doesn't do this micropolygonal subdivision; you have to have subdivision active at a level that will accommodate the detail of the maps you're using. So if I were to sculpt and generate detail maps in ZBrush at, say, 6 levels of subdivision, I'd have to set a comparable level of "live" subdivision in Poser to capture the same level of detail for the Superfly render to look right.
primorge posted Wed, 22 December 2021 at 6:23 PM
It's probably most practical in either instance to leave serious height displacement up to render displacement and fine details up to bump or normal. In either case. Firefly still wins. And subdivision morphs are definitely novel but half baked (no pun intended) at this time. As you pointed out, being able to mix and link to dependencies is the really interesting thing about HD morphs... but you still have to build your mesh with specific features in mind to be at all practical.
odf posted Wed, 22 December 2021 at 6:31 PM
Thanks for the explanation! I was vaguely aware of the distinction, but Firefly hasn't been on my radar since coming back to Poser, so I mostly forgot.
-- I'm not mad at you, just Westphalian.
odf posted Wed, 22 December 2021 at 8:14 PM
So now I'm actually wondering what to work on first: the "baking down" of the displacement to the lower subdivision levels (all the way down to the base level) or the fixing of the lateral displacement in the higher levels.
The former seems to deliver more value right away because it would take care of all the low-frequency lateral displacement, which can be handled easily at the base level, and it would produce a morph that can be used at any desired subdivision level. Which I think is kind of crucial, because if we always rendered everything at the highest resolution, we could just as well use higher-res figures to start with.
The latter seems more like the icing on the cake (no pun intended). Depending on how close I can get to the actual u and v directions used inside Poser, I might potentially and with some luck get a bit more accuracy in the details relative to the original sculpt. Note the pile of qualifications there. :grin: But honestly, I have no good intuition as to how important the higher-frequency lateral displacement is in the "real" world. I suspect the answer, as so often, might be "it depends".
Opinions?
-- I'm not mad at you, just Westphalian.
JoePublic posted Sat, 25 December 2021 at 6:42 AM
Sorry to be "that" guy, but wouldn't an animated displacement texture be a more straightforward solution?
https://www.renderosity.com/forums/threads/2901601
So the problem is you want wrinkles to go along with a smile morph. (Or think how the soles of your feet wrinkle when you bend them. So far no Poser figure has yet solved that problem.)
So just ERC link a displacement map to the smile morph, and that's that.
It won't show in preview, but most people don't care about a "nice" preview, anyway.
(I'm one of the few who really, really LOVE a "nice" preview, so I'd more than happily see subdivision work properly in Poser without slowing my machine down to a crawl)
In my experience, subdivision makes the Morphbrush jerky and unresponsive very quickly, and you need quite a bit of it to create really convincing wrinkles.
I'm enjoying this thread, even if I don't understand ALL of it. ;-)
But from a practical point of view, as long as Poser's subdivision doesn't run as effortlessly as it does in ZBrush, wouldn't it be better to use it only in a scenario where it is "less harmful" to the user experience?
adp001 posted Sat, 25 December 2021 at 10:06 AM
I think it should be a combination of basemorph (SubD-0) and displacement/normal map. The morph takes care of the vertex shifts mainly in X/Y and the displacement/normal map takes care of height and depth. The intensity of the displacement map can be coupled to the morph. The result should be similar to a SubD morph.
Nevertheless, the problem remains that a high SubD value must be set for the entire mesh. The displacement is only effective with Superfly if a sufficient number of vertices is provided by SubD.
So the fact that Poser slows down with increasing detail remains even with this method.
Since real micropolygons are not feasible with PBR render engines, the Blender programmers have found an elegant solution to this problem in Cycles: They increase the SubD level depending on the distance to the camera (called adaptive subdivision). The result is simply impressive.
JoePublic posted Sat, 25 December 2021 at 10:45 AM
"Nevertheless, the problem remains that a high SubD value must be set for the entire mesh"
*
Wouldn't that be awesome if a subdivision area could be controlled by a map or a morph?
*
I recently tried a very crude "real world" solution to that problem:
I exported a subdivided torso and stitched the unsubdivided arms and head back on to create a "high res / low res hybrid mesh", as most detail "happens" in the torso area anyway. (Ribcage, shoulder blades, spine)
Unfortunately the UV-coordinates got messed up along the seams, so I abandoned the project.
odf posted Sat, 25 December 2021 at 5:27 PM
Sorry to be "that" guy, but wouldn't an animated displacement texture be a more straightforward solution?
https://www.renderosity.com/forums/threads/2901601
So the problem is you want wrinkles to go along with a smile morph. (Or think how the soles of your feet wrinkle when you bend them. So far no Poser figure has yet solved that problem.)
So just ERC link a displacement map to the smile morph, and that's that.
Thanks for the link! I'd been wondering how to make dials for material room properties. It seems Poser is making that really easy these days. :thumbsup:
It won't show in preview, but most people don't care about a "nice" preview, anyway.
(I'm one of the few who really, really LOVE a "nice" preview, so I'd more than happily see subdivision work properly in Poser without slowing my machine down to a crawl)
I hadn't thought of that, but being able to see the morph in preview and getting an idea of how it catches the light is another plus for subdivision morphs, which I do care about.
In my experience, subdivision makes the Morphbrush jerky and unresponsive very quickly, and you need quite a bit of it to create really convincing wrinkles.
That's exactly it. I think there's a sweet spot for subd morphs at one or two levels of subdivision where Poser can handle the polys nicely in preview and render and one gets just enough resolution out for some nice extra detail. BUT the Poser morph tool gets overwhelmed really quickly and it's just no fun making these morphs in Poser. The only other option I know of is ZBrush via GoZ, and I guess I'm not yet ready to commit to ZBrush, both money- and interface-wise. So I'd like to be able to just make subd morphs for Poser in any old 3d modeler like Blender or Wings3d or Mudbox or what have you. Hence this project.
I realize now that "wrinkles" was a poor choice of words on my part. I was thinking more about larger-scale skin folds that appear with a smile or frown. Those should be a good application for subd morphs. Fine wrinkles indeed not so much.
I'm enjoying this thread, even if I don't understand ALL of it. ;-)
But from a practical point of view, as long as Poser's subdivision doesn't run as effortlessly as it does in ZBrush, wouldn't it be better to use it only in a scenario where it is "less harmful" to the user experience?
Like I said above, I think the user experience using subd morphs up to a certain level is actually quite good, and arguably better than displacement maps if one wants some degree of flexibility. The user experience creating them, not so much, and that's the part I'd like to try and improve.
Here's a little visual aid, or should I say aide? Antonia at subdivision level 2 with several morphs made with the Poser morph tool. Painful to make, but so nice to work with once I had them. The full images are in my gallery.
-- I'm not mad at you, just Westphalian.
odf posted Sat, 25 December 2021 at 5:38 PM
"Nevertheless, the problem remains that a high SubD value must be set for the entire mesh"
*
Wouldn't that be awesome if a subdivision area could be controlled by a map or a morph?
*
I recently tried a very crude "real world" solution to that problem:
I exported a subdivided torso and stitched the unsubdivided arms and head back on to create a "high res / low res hybrid mesh", as most detail "happens" in the torso area anyway. (Ribcage, shoulder blades, spine)
Unfortunately the UV-coordinates got messed up along the seams, so I abandoned the project.
Interesting! I would have guessed that the weight maps would be the hardest part to get to match. Shows how much I know. :grin:
Anyway, a tool for partially subdividing a figure while keeping the UVs matched up properly also sounds like an interesting project. I'm working on some subd code in Python right now. (I've implemented Catmull-Clark style subd in Java, JavaScript, Elm and Scala so far, so Python was long overdue. :smile:) In principle, it should be easy to adjust the interpolation at the boundary of the torso so that it matches up with the non-subdivided parts. But usually the devil is in the details, so I wouldn't promise anything.
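For the record, the core of a Catmull-Clark step is fairly compact in Python. This is a sketch for closed all-quad meshes only — no boundaries, no creases, no UVs, and certainly not what Poser/OpenSubdiv does internally:

```python
import numpy as np
from collections import defaultdict

def catmull_clark(verts, faces):
    """One Catmull-Clark subdivision step for a closed quad mesh."""
    V = np.asarray(verts, dtype=float)

    # Face points: centroid of each face.
    fpoints = np.array([V[f].mean(axis=0) for f in faces])

    # Collect each edge with the faces adjacent to it.
    edge_faces = defaultdict(list)
    for fi, f in enumerate(faces):
        for i in range(len(f)):
            e = tuple(sorted((f[i], f[(i + 1) % len(f)])))
            edge_faces[e].append(fi)

    # Edge points: average of the edge's endpoints and adjacent face points.
    epoints = {}
    for e, fs in edge_faces.items():
        pts = [V[e[0]], V[e[1]]] + [fpoints[fi] for fi in fs]
        epoints[e] = np.mean(pts, axis=0)

    # New positions for the original vertices: (F + 2R + (n-3)P) / n,
    # with F the average of adjacent face points, R the average of
    # midpoints of incident edges, n the valence.
    vfaces, vedges = defaultdict(list), defaultdict(list)
    for fi, f in enumerate(faces):
        for v in f:
            vfaces[v].append(fi)
    for e in edge_faces:
        vedges[e[0]].append(e)
        vedges[e[1]].append(e)

    vpoints = np.empty_like(V)
    for v in range(len(V)):
        n = len(vedges[v])
        F = np.mean([fpoints[fi] for fi in vfaces[v]], axis=0)
        R = np.mean([(V[e[0]] + V[e[1]]) / 2 for e in vedges[v]], axis=0)
        vpoints[v] = (F + 2 * R + (n - 3) * V[v]) / n

    # Assemble the new mesh: each quad splits into four quads.
    new_verts = list(vpoints) + list(fpoints)
    eindex = {}
    for e in edge_faces:
        eindex[e] = len(new_verts)
        new_verts.append(epoints[e])

    new_faces = []
    for fi, f in enumerate(faces):
        fp = len(vpoints) + fi
        for i in range(len(f)):
            a, b, c = f[i - 1], f[i], f[(i + 1) % len(f)]
            new_faces.append([b, eindex[tuple(sorted((b, c)))], fp,
                              eindex[tuple(sorted((a, b)))]])
    return np.array(new_verts), new_faces

# A cube: 8 vertices, 6 quads -> 26 vertices, 24 quads after one step.
cube_v = [[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)]
cube_f = [[0, 1, 3, 2], [4, 6, 7, 5], [0, 4, 5, 1],
          [2, 3, 7, 6], [0, 2, 6, 4], [1, 5, 7, 3]]
nv, nf = catmull_clark(cube_v, cube_f)
```

Matching the interpolation at a torso boundary would mean special-casing the vertex and edge rules there, which is where the details get devilish.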
-- I'm not mad at you, just Westphalian.
odf posted Sat, 25 December 2021 at 11:15 PM
I think it should be a combination of basemorph (SubD-0) and displacement/normal map. The morph takes care of the vertex shifts mainly in X/Y and the displacement/normal map takes care of height and depth. The intensity of the displacement map can be coupled to the morph. The result should be similar to a SubD morph.
From what I've seen, that seems to be a fairly common strategy in other 3d software. In Blender I think I know how to save the base morph into an OBJ and bake the higher levels down into a displacement map. So the missing part would be wiring this up in Poser. Could potentially be automated via Poser Python. :wink:
-- I'm not mad at you, just Westphalian.
JoePublic posted Sun, 26 December 2021 at 6:09 AM
"Interesting! I would have guessed that the weight maps would be the hardest part to get to match. Shows how much I know."
The weightmaps usually just need a bit of resmoothing. If you go from a high-res figure or bodypart to a low-res one, most often no adjustment is necessary at all.
*
Hmm, one crazy idea:
Would it be possible to transfer a morph made on a high-res object over to a subdivided low-res object?
Like:
I make a morph with the morphbrush on standard-resolution V4. (Which is basically a "frozen" subdivision of V4 17K.)
I have done so in the past, and the Morphbrush can easily handle standard V4 or V3.
*
Then one would have to transfer that (high-res) morph over to a subdivided V4 17K. (Which basically is the same mesh, just subdivided once.)
Wouldn't that make the creation part of the morph much easier?
I know it's a "detour", but I think the "on the fly" subdivision, especially if it's more than once, is what's slowing the Morphbrush down, not the absolute polygon count of the figure itself. (Within reasonable limits, of course)
Just thinking out loud, so please bear with me. :-)
odf posted Sun, 26 December 2021 at 4:11 PM
Would it be possible to transfer a morph made on a high-res object over to a subdivided low-res object?
You mean inside Poser? That would be very neat. If you find a way to reliably do that, I can throw my script in the recycle bin.
I know it's a "detour", but I think the "on the fly" subdivision, especially if it's more than once, is what's slowing the Morphbrush down, not the absolute polygon count of the figure itself. (Within reasonable limits, of course)
I always thought it was just the pure polygon count. If the morph brush is faster on plain meshes than subdivided ones, that would be useful to know.
I'd still prefer to use an external tool for bigger morph jobs, but even then, if I could load the morph the normal way onto a higher res mesh and transfer to the lower res one as a subdivision morph, that would potentially be easier than having to go through the script.
-- I'm not mad at you, just Westphalian.
JoePublic posted Sun, 26 December 2021 at 5:36 PM
"I always thought it was just the pure polygon count. If the morph brush is faster on plain meshes than subdivided ones, that would be useful to know."
I did a little test with my V4 "Zombie" morph.
V4 has 76181 polygons.
Morphing her "as is" with the Morphbrush is not a big problem.
But at SubD 1, everything turns very slow.
I exported her then at SubD 1 and imported her back as a welded prop which now had 264096 polygons.
Morphing that prop wasn't as effortless as morphing the unsubdivided V4 figure of course, but still noticeably easier than morphing the subdivided V4. Much less lag while brushing, less jerkiness, better smoothing.
Sorry, I can't help with the implementation. I don't speak "code". Just some practical observations here. :-)
But Poser already CAN transfer morphs between similarly shaped figures that have wildly different mesh topology, and it can export a subdivided mesh as a prop. (You can for example transfer M2's muscle morphs over to M3, and if you use the M3toM2 figure, which features the M2 shape [but not the actual mesh], then the transfer is almost perfect.)
So it seems that there is the capability to "freeze" the subdivision inside Poser.
And then the "high res" morph might somehow be able to be transferred from the subdivided prop to the subdivided figure it was originally derived from.
If one could really manage to transfer a hi-res morph from a prop to a subdivided figure, this means that unnecessary bodyparts could be left out.
In my example, I would only export the torso as a prop to work on the ribcage, freeing even more computer power for operating the morphbrush.
JoePublic posted Sun, 26 December 2021 at 5:41 PM
BTW, don't want to hijack the thread so I'll crawl back into my little hole in the wall.
Just thought there might be a way to circumvent the problem. ;-)
odf posted Sun, 26 December 2021 at 6:03 PM
And then the "high res" morph might somehow be able to be transferred from the subdivided prop to the subdivided figure it was originally derived from.
That's the crucial part. I know that Poser can transfer morphs between meshes/figures. If that also worked from a regular morph onto a subdivision morph, that would be perfect (assuming the quality is reliably good). I haven't heard of anyone doing that successfully, but that doesn't mean much.
JoePublic posted at 5:41 PM Sun, 26 December 2021 - #4432447BTW, don't want to hijack the thread so I'll crawl back into my little hole in the wall.
Just thought there might be a way to circumvent the problem. ;-)
No worries! The delta mangling necessary for subd morphs is messy enough to make circumvention an increasingly attractive option. :smile:
Seriously, there's a reason this is in Poser Technical, not Poser Python. I just want a way to get subdivision morphs into Poser; I don't really care how.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 27 December 2021 at 4:26 PM
You probably both already know that...
If you import a figure's mesh as a grouped prop, create and load your morphs on that, and have an externally subdivided version in scene... you can export the base resolution version with the morph applied, externally subdivide it, and apply the morph to the higher res version prop in scene without winding order error. Not sure what use it would be other than transferring morphs between LOD versions of a figure, but just noting.
primorge posted Mon, 27 December 2021 at 4:38 PM
JoePublic, are you saying that you are exporting Poser-subdivided props and not generating a 0 kb OBJ? I've only ever (accidentally, I think) exported from a subdivided figure, and it produced an empty OBJ file. Never tried a prop; I just assumed Poser's subdivision was strictly "in scene". Interesting.
odf posted Mon, 27 December 2021 at 4:54 PM
primorge: I can only speak for Poser 12, but I've been exporting subdivided props and figures without any problem. Well, one figure, Antonia. If there are figures it doesn't work for, I'd be interested to know.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 27 December 2021 at 4:59 PM
Holy crap it works! I've only ever returned 0 kb OBJ files for morph targets in scene subdivided for La Femme. Let me test something real quick...
primorge posted Mon, 27 December 2021 at 5:10 PM
Ok. Just ran a test
Loaded La Femme. Unimesh, 1 level of Subdivision. Exported neck as OBJ. Resulting OBJ 0 kb.
Loaded Antonia 1.2. Switched to Unimesh Skinning, 1 level Subdivision. Exported neck as OBJ. Resulting OBJ 0 kb.
Are you saying you can export OBJ body parts, from unimesh skinning, in Poser 12, and return a valid OBJ?
primorge posted Mon, 27 December 2021 at 5:11 PM
Going to run one more test with full figure export... one moment.
primorge posted Mon, 27 December 2021 at 5:15 PM
Exported La Femme, Full Body, 1 level Subdivision, include existing groups. 0 kb OBJ
Tried again, unchecked include existing groups. 0 kb OBJ.
In Poser 11, at least, it's not possible.
primorge posted Mon, 27 December 2021 at 5:25 PM
Ok. Bit of a non-control there: I was overwriting the same file with each export, and a cursory hover over the file showed 0 kb. Opening the file up in a text editor shows...
Actual results. Exporting a single body part produces a 0 kb OBJ. Makes sense considering "unimesh".
Exporting full figure grouped, 1 level, working subdivided file.
No groups, same result.
Anyway. Good to know. Full figure subdivided works. Who knows how long I would have just assumed it was only in scene.
primorge posted Mon, 27 December 2021 at 5:27 PM
Sorry for the barrage, in any case I learned something new and potentially useful.
Thanks.
primorge posted Mon, 27 December 2021 at 5:29 PM
Crawling back to my "Nova_Project" cave...
odf posted Mon, 27 December 2021 at 5:35 PM
For reference, I just tried a few more things and am getting the same results in Poser 12: exporting individual body parts produces 0 kB OBJ files. Exporting the body actor, even just the body actor with nothing else, works.
-- I'm not mad at you, just Westphalian.
JoePublic posted Mon, 27 December 2021 at 5:47 PM
I just made a test with David 3 (Unimesh rigging):
Poser 11 Pro can export subdivided figures.
Poser 11 Pro can export subdivided props.
Poser 11 Pro can not export subdivided bodyparts.
(But if you spawn a prop from a bodypart and subdivide it then, it can be exported like any other subdivided prop.)
Hope this helps. :-)
primorge posted Mon, 27 December 2021 at 6:23 PM
JoePublic posted at 5:47 PM Mon, 27 December 2021 - #4432497
I just made a test with David 3 (Unimesh rigging):
Poser 11 Pro can export subdivided figures.
Poser 11 Pro can export subdivided props.
Poser 11 Pro can not export subdivided bodyparts.
(But if you spawn a prop from a bodypart and subdivide it then, it can be exported like any other subdivided prop.)
Hope this helps. :-)
Probably not relevant but interesting new fact to me; if I export La Femme subdivided with include existing groups checked and import it into mudbox it loads a whole mesh obj intact with body part groups as selection sets. I can sculpt on the mesh across body parts (it's not unwelded).
However if I reimport said sculpt full body as load FBM it returns a wrong number vertices error.
Inspection of the exported mesh shows the groups were dumped.
If, after sculpting, I select an individual selection set (group: say hip in this instance) and Export from selection from mudbox and try to load it as a morph target on the hip, it returns a wrong-number-of-vertices error. Inspection of the exported OBJ in scene shows what appears to be the proper number of verts for a subd level 1 hip. By comparison, I spawn a prop from La Femme's hip, subdivide 1, and compare to the mudbox export. Identical vert number.
I'm not sure what this indicates.
odf posted Mon, 27 December 2021 at 8:09 PM
Probably not relevant but interesting new fact to me; if I export La Femme subdivided with include existing groups checked and import it into mudbox it loads a whole mesh obj intact with body part groups as selection sets. I can sculpt on the mesh across body parts (it's not unwelded).
However if I reimport said sculpt full body as load FBM it returns a wrong number vertices error.
Interesting! I managed to load a subdivided and morphed mesh onto an equally subdivided prop via "Load Morph Target..." and got all excited, only to discover that the higher subdivision levels were completely ignored. Only the base level was changed by the morph.
So I stopped trying at that point, although I imagine there are workflows where this kind of thing could still be useful.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 27 December 2021 at 9:13 PM
I am now back to working on the script and think I have a good plan for figuring out the u- and v- directions. It will not be exactly what Poser uses (because I still haven't been gifted the source code :grin:), but that should be okay as long as the unmorphed mesh is reasonably smooth after subdividing, which is what one would expect.
I've also experimented with tricking the Poser Morph Tool into baking down the morph to the lower subdivision levels. I opened the morph for editing, checked "Bake down for subdivision", set the radius and strength to the minimum non-zero values, found a vertex and dragged at it a tiny bit. The morph tool seems to do the baking down for the full mesh after every individual edit, which might explain its painful slowness, but is to our advantage here because after this little stunt the complete morph has been baked down.
I'll experiment with this some more to see if there are easier ways of triggering the baking. But honestly I'm really quite happy that this worked because doing the baking down in the script is looking increasingly impractical. :thumbsup:
-- I'm not mad at you, just Westphalian.
adp001 posted Tue, 28 December 2021 at 5:44 AM
Poser uses Pixar's OpenSubDiv. The source including demo code is available on GitHub: https://github.com/PixarAnimationStudios/OpenSubdiv
A detailed and very thorough description can be found here: https://graphics.pixar.com/opensubdiv/docs/intro.html
(By the way, the documentation is not only interesting for programmers - you can find interesting facts about SubD and what you should pay attention to when using it)
Could it be that the mysterious data has nothing to do with normals or UV, but somehow describes Bezier curves?
adp001 posted Tue, 28 December 2021 at 5:50 AM
@odf: I think there is helpful information here: https://graphics.pixar.com/opensubdiv/docs/far_overview.html#far-patchtable
odf posted Tue, 28 December 2021 at 6:31 AM
adp001 posted at 5:50 AM Tue, 28 December 2021 - #4432513
@odf: I think there is helpful information here: https://graphics.pixar.com/opensubdiv/docs/far_overview.html#far-patchtable
Thanks! Yes, that looks relevant. :thumbsup: Let's hope that all that weirdness I'm seeing is really just OpenSubdiv storing stuff efficiently and not some proprietary Poser magic.
By the way, I'm getting closer with the deltas. But they still get a bit crumbly where the mesh is dense (for example around the nipples), and texture seams are messed up. Hopefully the Pixar docs can tell me what I'm doing wrong there.
-- I'm not mad at you, just Westphalian.
odf posted Tue, 28 December 2021 at 9:27 PM
Some more delta experiments! It's better in motion, but I couldn't be bothered to make an animation. Antonia's back at subd level 1 with a morph that has the delta [0.0, 0.01, 0.0] at every vertex. The left image shows the morph dialed at 0.0, right one at 0.1.
The tile pattern is wider than high. If you look closely, you can see the vertices move along the positive u direction. The interesting stuff happens at the texture seams and also at irregularities in the uv maps. At the seams, we see that one side wins, but it's not obvious how that side is picked. At the vertical seam down the back, it's consistently the right side. At the horizontal seam, it's sometimes the top and sometimes the bottom.
Spots with UV irregularities are also interesting. It seems the texture quads are split into pairs of triangles, and the uv coordinates are then interpolated for each triangle individually. As there are two ways to split each quad, there's another opportunity to "disagree" with Poser and mess things up.
-- I'm not mad at you, just Westphalian.
primorge posted Tue, 28 December 2021 at 9:59 PM
Have you considered getting in touch with Snarlygribbly? IIRC he had a subdivider script for Poser. He might be able to offer some thoughts. Though he doesn't appear often on the Forums anymore he has updated EZSkin for 12 and would probably respond to a sitemail.
primorge posted Tue, 28 December 2021 at 10:20 PM
Apparently he no longer has an account here but here are links to relevant threads that I could find. Must have been a tumultuous time in the Poserverse...
Breadcrumb trail...
https://www.renderosity.com/forums/threads/2855291/print
https://www.renderosity.com/forums/threads/2855662
There seems to be a theme throughout history where maverick idea-men operating alone are ground 'neath the wheels of uncaring machines, or their ideas appropriated by collective entities and their names becoming vaguely remembered footnotes... in less dramatic language if you prefer... a sort of Mandela effect ;)
primorge posted Tue, 28 December 2021 at 10:29 PM
...something about UVs and edges being set to hard/sharp... not sure if it's relevant.
odf posted Tue, 28 December 2021 at 10:56 PM
Thanks for the links! Could be quite useful.
Yeah, interpolating UVs and dealing with hard edges can be tricky. I made the mistake of using hard edges in lo-res Antonia. Probably the worst decision I made when modeling her.
-- I'm not mad at you, just Westphalian.
primorge posted Tue, 28 December 2021 at 11:17 PM
From a simple layman's recollection, it does hinder UV drift when subdividing to a higher LOD.
primorge posted Tue, 28 December 2021 at 11:18 PM
...Almost like pinning verts
primorge posted Tue, 28 December 2021 at 11:19 PM
Just switch off the hardness after the subdivision...
odf posted Tue, 28 December 2021 at 11:20 PM
If his old email still works, I *could* potentially contact him. It's astounding how many Poser folks from back then I've exchanged emails with. :grin:
-- I'm not mad at you, just Westphalian.
odf posted Tue, 28 December 2021 at 11:24 PM
Just switch off the hardness after the subdivision...
The trouble is, if you switch off the hardness and then want to subdivide again, you get drift again. Also, Poser does not always seem to smooth out curves formed by hard edges correctly. I've noticed that with the irises on low-res Antonia. Using two or three edge loops close together seems to be a more robust solution.
-- I'm not mad at you, just Westphalian.
odf posted Tue, 28 December 2021 at 11:50 PM
By the way, those funny texture artifacts in the picture above happened in (Poser's) subdivision. Here's the same tile pattern on the base mesh:
I'd have to double-check the subdivided UVs to see why, but I've noticed recently that Poser sometimes spontaneously "unwelds" random UV vertices. It also uses Catmull-Clark interpolation of UV positions, i.e. shifts around the UVs of the original vertices, which together with the unwelding leads to chaos. So all as usual in Poserland. :smile:
-- I'm not mad at you, just Westphalian.
primorge posted Tue, 28 December 2021 at 11:52 PM
odf posted Tue, 28 December 2021 at 11:58 PM
For reference, lo-res Toni's eyes rendered at subd level 3:
-- I'm not mad at you, just Westphalian.
primorge posted Wed, 29 December 2021 at 12:14 AM
Well...
I'm not going to bash your figure's eye design any more than I already have lol. It was a different age. How could you have predicted?
And let me rephrase an earlier statement "GI artifacts" with a more general edit of "Raytracing Artifacts"
Time for bed.
odf posted Wed, 29 December 2021 at 12:42 AM
Well...
I'm not going to bash your figure's eye design any more than I already have lol. It was a different age. How could you have predicted?
It's pretty bash-worthy. In my defense, I had been thinking that I would eventually just stick Bagginsbill's eyes into Antonia. But he never said the three magic words I was waiting for: "They are done." :grin:
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 12:56 AM
Here's a question for @adp001 or another Poser Python sorcerer: if I pull this off, it would be nice to eventually run the code from within Poser, possibly as part of the OBJ_ImExport script. The problem is that right now there seems to be no way to directly write subdivision deltas via Poser Python. So I imagine the script would have to write a PMD and associated PZ2 "by hand" and then (ideally) load and "execute" them in order to create the morph in the current session.
So are these things that one can do from within Poser, namely a) create a file with arbitrary content and then b) load that file back into Poser as if the user had opened it from the pose library?
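To make the question concrete, here's roughly what I have in mind. Writing the PZ2 is plain text output (the keyword spellings are my best guess from injection poses I've looked at, so treat them as assumptions), and the loading call at the end is the part I'm unsure about:

```python
def write_injection_pose(pz2_path, pmd_filename, morph_name):
    """Write a minimal PMD-referencing injection pose. A sketch --
    keyword spellings taken from existing injection PZ2s, so verify
    against a file Poser itself has written."""
    text = "\n".join([
        "{",
        "version",
        "\t{",
        "\tnumber 12",
        "\t}",
        # injectPMDFileMorphs / createFullBodyMorph are the keywords
        # I've seen in shipped PMD injection poses.
        "injectPMDFileMorphs %s" % pmd_filename,
        "createFullBodyMorph %s" % morph_name,
        "}",
        "",
    ])
    with open(pz2_path, "w") as f:
        f.write(text)

write_injection_pose("myMorph.pz2", "myMorph.pmd", "MyMorph")

# Inside Poser, loading it back would then be something like
# (untested guess at the exact call name):
#   import poser
#   poser.Scene().LoadLibraryPose("myMorph.pz2")
```

The open question is exactly part (b): whether that library-loading call behaves the same as the user double-clicking the pose in the library.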
-- I'm not mad at you, just Westphalian.
primorge posted Thu, 30 December 2021 at 9:58 AM
Apologies for the interruption odf, will repost your python query after this post.
I recall that I had rebuilt Antonia's eyes; based off the original meshes but retopo'd and welded/constructed. Eye cover and cornea all one uncut piece. Iris, pupil, sclera all one uncut piece. Material set-ups relatively the same, same naming, though I did take a small liberty with the iris-sclera boundary. I had initially wanted to remap the eyes in more of a claw Gen4 cut-shell style, but that would make Antonia's old maps useless, so I opted instead to match the cutting and placement of shells to the old UV style: very simple cuts and a project-by-normal unwrap, as pre-existing. Just need to do some shuffling with the shells and the eyes will be done. They will morph and subdivide excellently.
Figure sometime this weekend I'll upload and post a link here. I'll include properly grouped/r-l named props and obj files so you can just load them and parent to existing actors. In the middle of some match centers/morph/"superConformer" stuff for Nova, but I have off the next couple days, so figure the link probably sometime Sunday night depending on how fiddly testing/matching shell position and scaling is with the UVs...
primorge posted Thu, 30 December 2021 at 10:08 AM
Here's a question for @adp001 or another Poser Python sorcerer: if I pull this off, it would be nice to eventually run the code from within Poser, possibly as part of the OBJ_ImExport script. The problem is that right now there seems to be no way to directly write subdivision deltas via Poser Python. So I imagine the script would have to write a PMD and associated PZ2 "by hand" and then (ideally) load and "execute" them in order to create the morph in the current session.
So are these things that one can do from within Poser, namely a) create a file with arbitrary content and then b) load that file back into Poser as if the user had opened it from the pose library?
adp001 posted Thu, 30 December 2021 at 11:12 AM
@primorge: Good work!
@odf: Yes, it can be done. Poser Python can load all types of Poser files (poser.Scene().LoadLibraryPose and its siblings).
adp001 posted Thu, 30 December 2021 at 11:23 AM
@odf: Can you please describe how to create a hires morph manually with your script?
What happens in case of errors? Are there error codes as return values? Does your script take parameters to control anything?
Does your script run under Poser or does a Python interpreter have to be present in the system?
odf posted Thu, 30 December 2021 at 3:33 PM
adp001 posted at 11:23 AM Thu, 30 December 2021 - #4432604
@odf: Can you please describe how to create a hires morph manually with your script?
What happens in case of errors? Are there error codes as return values? Does your script take parameters to control anything?
Does your script run under Poser or does a Python interpreter have to be present in the system?
Okay, general disclaimer: I was jumping way ahead with my question about Poser integration. I like to have a general idea where the journey is going, but I'm still doing my steps one at a time. :smile: It's just good to know that in principle the script could be made "fully automatic."
So currently my script does not run under Poser. It's a simple command line script that at the moment takes two OBJ files, one exported from Poser at the desired subdivision level, the other the output of sculpting the mesh in the external software. It also takes the subdivision level and the name of the morph to produce as options. The output are a pz2 and a pmd file.
To be clear, I can load the morph that this script produces into Poser, but it generally has some more or less bad artifacts, so won't likely be usable.
The next step will be to bake down the morph to base level and (for now) restrict the higher subdivision levels to displacement along the normals, as we talked about before. In order to do that, instead of the hi-res Poser output I will need two full-figure OBJs exported from Poser at subd-0, one welded and one unwelded. Awkward, but I'll deal with fixing awkward (and with Poser integration) once it looks like the script is producing something useful. Same with error handling, thinking about what other parameters could be useful, speed etc.
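For the curious, the command line currently looks roughly like this; the option names are illustrative, not necessarily what the script will end up using:

```python
import argparse

def build_parser():
    """Command line matching the workflow described above.
    Option names are placeholders, not the script's final interface."""
    p = argparse.ArgumentParser(
        description="Turn an externally sculpted OBJ into a Poser "
                    "subdivision morph (PZ2 + PMD).")
    p.add_argument("exported_obj",
                   help="OBJ exported from Poser at the target subd level")
    p.add_argument("sculpted_obj",
                   help="the same mesh after external sculpting")
    p.add_argument("-l", "--level", type=int, default=1,
                   help="subdivision level of the two OBJs")
    p.add_argument("-n", "--name", default="Morph",
                   help="name of the morph to create")
    return p

# Example invocation, parsed from an explicit argument list:
args = build_parser().parse_args(
    ["figure_subd1.obj", "sculpt.obj", "--level", "1", "--name", "Wrinkles"])
```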
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 3:36 PM
Apologies for the interruption odf, will repost your python query after this post.
No worries! Nice work on the eyes.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 5:56 PM
The next step will be to bake down the morph to base level and (for now) restrict the higher subdivision levels to displacement along the normals as we talked about before. In order to do that, instead of the hi-res Poser output I will instead need two full-figure OBJs exported from Poser at subd-0, one welded and one unwelded.
Just a quick note on why I need three OBJs here. Poser defines the base (subd-0) level of a morph with individual sets of deltas per body part actor, but then for each higher level it's just one set of subd-deltas on the unified body actor. It also seems to export a subdivided figure mesh without body part grouping (although I might try some more export options to see if I can't coerce it into including the body part names). So the unwelded subd-0 mesh is what I want for the individual morph targets on the body parts, whereas the welded, subdivided and morphed mesh is good for making the hi-res deltas. The welded subd-0 mesh is my Rosetta Stone that helps me translate between the two (which I need for baking down and interpolating). It has the grouping that tells me where the body parts go, but it's also a unified mesh that I can match up with the hi-res OBJ.
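In code terms, the "Rosetta Stone" lookup is just a position-based match between the unwelded and welded subd-0 meshes. A sketch, assuming coincident vertices agree up to rounding:

```python
def weld_map(unwelded, welded, tol=1e-6):
    """For each vertex of the unwelded (per-body-part) mesh, find the
    index of the coincident vertex in the welded mesh by position."""
    def key(p):
        # Quantize coordinates so tiny float differences still match.
        return tuple(round(c / tol) for c in p)
    index = {key(p): i for i, p in enumerate(welded)}
    return [index[key(p)] for p in unwelded]

# Tiny example: two squares sharing an edge, unwelded vs welded.
welded = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (2, 0, 0), (2, 1, 0)]
unwelded = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
            (1, 0, 0), (2, 0, 0), (2, 1, 0), (1, 1, 0)]
mapping = weld_map(unwelded, welded)
# mapping == [0, 1, 2, 3, 1, 4, 5, 2] -- the shared edge's vertices
# appear twice in the unwelded mesh but map to the same welded index.
```

With that mapping in hand, per-body-part deltas and unified-mesh deltas can be translated back and forth.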
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 6:04 PM
PS: Instead of exporting the welded subd-0 mesh, I could in principle also use a small Poser Python script that derives the mapping between vertex numbers in the unwelded and welded mesh (there's a function in OBJ_ImExport for that which I could borrow) and writes it out into a file.
But unless I get stuck with my implementation, I'd rather not do that, because having to use two scripts, one inside and one outside of Poser, just makes things even more awkward, I feel.
-- I'm not mad at you, just Westphalian.
adp001 posted Thu, 30 December 2021 at 6:50 PM
So far, it all sounds quite complicated to use. I'm not sure if the few people who want to do hires morphs in Poser themselves wouldn't rather spend the 10 dollars a month (or 180 for perpetual) for ZBrush Core.
But what the heck: programmers just love it the hard way :)
I wouldn't include the whole thing in another script that doesn't directly fit the theme (hires morphs). Rather I would try to build an interface that hides the quite complex and likely error-prone process. The UI can export the figure from Poser in two or three copies as desired (including the vertex crosslist if necessary) and then feed your script with the filenames and parameters. Then the UI can take the sculpted result and assign it to the correct figure.
Each step can be accompanied with appropriate text or images in the UI.
The user just has to remember the export directory and put the result from Blender there.
Biggest hurdle for the user could be: Installing the Python interpreter with all the necessary libs....
odf posted Thu, 30 December 2021 at 7:27 PM
So far, it all sounds quite complicated to use. I'm not sure if the few people who want to do hires morphs in Poser themselves wouldn't rather spend the 10 dollars a month (or 180 for perpetual) for ZBrush Core.
I don't think it has to be complicated to use in the end. I just personally find it easier to hammer down the functionality before I worry about UI/UX.
But what the heck: programmers just love it the hard way :)
Yeah, part of the reason I keep at it is because there are interesting little challenges in the programming that I can learn from, and I can probably reuse parts of the code in other projects. Being able to use not-ZBrush would be nice, but if it turns out to be too hard, well, I guess I'll bite the bullet and re-learn ZBrush.
I wouldn't include the whole thing in another script that doesn't directly fit the theme (hires morphs). Rather I would try to build an interface that hides the quite complex and likely error-prone process. The UI can export the figure from Poser in two or three copies as desired (including the vertex crosslist if necessary) and then feed your script with the filenames and parameters. Then the UI can take the sculpted result and assign it to the correct figure.
Oh, I think there's a misunderstanding here. Eventually, I'd like to run everything within Poser, not via an external script. Then ideally the user would just select a figure, pick the desired subdivision level and load the OBJ for the morph. All other necessary information can either be obtained directly from Poser or computed by my algorithms. Since we can't set subd deltas directly in Poser (as far as I'm aware), we'll have to then export a PMD, but as you mentioned above, we should even be able to then load that into Poser, so the morph would be available right away after the script had finished. And I agree, it's probably better to just have a dedicated script just for hi-res morphs, nothing else.
Each step can be accompanied with appropriate text or images in the UI.
The user just has to remember the export directory and put the result from Blender there.
Biggest hurdle for the user could be: Installing the Python interpreter with all the necessary libs....
Like I said, all of these should become unnecessary. The whole thing should eventually just work like "Load full body morph...".
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 7:37 PM
PS: The only reason for splitting the script into an in-Poser UI and an external command line part would be if "someone" were really keen on starting on the UI right away. Of course I could not hold that imaginary person back, but I'd have to point out again that there's still a pretty high probability that the whole project will flop. And at any rate, the end goal would then be to bring the external code "home" into Poser and get rid of any temporary files with data we can easily recreate via the Poser API.
-- I'm not mad at you, just Westphalian.
adp001 posted Thu, 30 December 2021 at 7:51 PM
Maybe I should sleep more after all. Do not know what I read that sounded complicated :)
OK, so no problem at all. I can do a little UI with wxPython on a boring weekend in between. No reason to work on something in advance. Just give me a holler when it's time.
odf posted Thu, 30 December 2021 at 7:59 PM
Maybe I should sleep more after all. Do not know what I read that sounded complicated :)
Probably the way I wrote it. :smile:
OK, so no problem at all. I can do a little UI with wxPython on a boring weekend in between. No reason to work on something in advance. Just give me a holler when it's time.
:thumbsup:
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 9:42 PM
[...] It also seems to export a subdivided figure mesh without body part grouping (although I might try some more export options to see if I can't coerce it into including the body part names).
Just a quick update for reference: it turns out that the "include existing groups in polygon groups" option *does* produce an OBJ with body part names in the groups, even for a subdivided unimesh figure.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 30 December 2021 at 10:59 PM
Another interesting tidbit I just noticed, slightly annoying but no show-stopper: when I export a welded figure mesh from P12 at base resolution, the dead vertices left over from welding are removed. But when I export at subdivision level 1, they are preserved. So the numbering of the base level vertices in both meshes doesn't quite match up.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 31 December 2021 at 3:21 PM
Greetings from the year 2022, a.k.a. Year 3. I've had some shower thoughts on what to call this thing when it's unleashed. My favourites so far: GoNutz and BloodyAwfulMorphLoader. :grin:
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 31 December 2021 at 3:50 PM
Presto Morph Express
primorge posted Fri, 31 December 2021 at 3:53 PM
However, if it doesn't do post transform as well, that would be an unfortunate oversight :(
primorge posted Fri, 31 December 2021 at 3:57 PM
Abracadabra
odf posted Fri, 31 December 2021 at 4:06 PM
Presto Morph Express
That would be an extremely ironic name. Due to be written in pure Python and handling large meshes, it will be painfully slow.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 31 December 2021 at 4:07 PM
However, if it doesn't do post transform as well, that would be an unfortunate oversight :(
I'm pretty sure I want post transform, but I have no idea how that works.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 31 December 2021 at 4:59 PM
odf posted at 4:06 PM Fri, 31 December 2021 - #4432671
primorge posted at 3:50 PM Fri, 31 December 2021 - #4432665
Presto Morph Express
That would be an extremely ironic name. Due to be written in pure Python and handling large meshes, it will be painfully slow.
Slotho Exchango Loader lol.
Sorry, going back to work now.
Post transform is basically default subtraction into a final difference result, only automated. I think ADP's script does this. As do PML and GoZ. It's also built into Poser, but in a confusingly worded or implemented way.
Back in the day I would do it through multiple morph dials. For instance I want to fix something inside a figure's mouth. It would be easier to create the fix morph if the mouth is open. Problem is by convention the open mouth position will obviously be baked into the export, thus becoming a default state. After morphing over this new default state any pre-existing morphs (mouth open in this example) will be added to the final resulting import target causing telescoping. Not pretty.
So you would think, ok, I'll just dial the mouth open to 0 and apply the target. Mouth pops open again after the target is applied. Ok, so I dial the original mouth open to -1 and export the result as a morph target (or spawn in scene if you prefer, I like to do these things via export, less likely for me to botch something). Apply the new morph, and there's your difference as a working dial. Same can be done with rotations, baking as morph and subtracting... voila JCMs. Supposedly, in theory, I didn't really start messing with JCMs until post transform became an automation rather than a tedious dial process. Overlaying difference morphs, all the time.
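The dial procedure primorge describes boils down to a subtraction. Here's a hedged numpy sketch of that idea (an illustration, not any of the scripts mentioned): the sculpt was done over a figure with an existing morph baked in (e.g. "mouth open"), so the fix morph is the sculpt minus both the rest shape and the baked-in morph's contribution.

```python
import numpy as np

# Sketch of the subtraction described above: remove the baked-in morph's
# effect so only the actual fix remains as deltas.
def difference_deltas(sculpted, rest, baked_deltas):
    sculpted = np.asarray(sculpted, dtype=float)
    rest = np.asarray(rest, dtype=float)
    baked = np.asarray(baked_deltas, dtype=float)
    # What the sculpt adds on top of (rest + baked-in morph) is the fix.
    return sculpted - (rest + baked)
```

This is exactly the "dial the original to -1 and spawn" trick, expressed as arithmetic instead of dial spinning.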
primorge posted Fri, 31 December 2021 at 5:29 PM
One final observation on your comment about Python speed... I've noticed that PML takes a bit to toggle through all the actors in a source figure, noting deviations in specific actors. Sometimes very slow if you haven't shut down and restarted Poser. This is with default resolution figures. GoZ is pretty fast with SubD figures, so I imagine GoZ is not Python. Probably seems like a doofus observation from a programmer perspective but I'll risk mentioning it.
odf posted Fri, 31 December 2021 at 5:59 PM
Yeah, pretty sure ADP's script supports post-transform, so I can study his code and/or bother him with questions. I definitely want to be able to do hi-res JCMs.
It should only matter for the base resolution part of the morph, though. As mentioned at length, the higher level details are expressed relative to the surface directions, so I'd expect them to be completely agnostic to the pre/post transform thing.
By the way, I've got a slightly revised devious plan: since there's a fair deal of busy work involved in the base resolution part of the morph (which as it seems I can't skip, at least not at this stage), and that part is indistinguishable from a regular FBM, I think I'll first write a command line script that converts an OBJ into an FBM, then integrate that into Poser with a reasonably friendly GUI, and finally add on the hi-res stuff.
Also by the way, some of the slow bits (such as parsing and composing OBJ files in Python) should go away when the script runs under Poser. Other slow bits will still be slow, but some of them (like the vertex order repair function) can probably be made optional. There's some number crunching remaining for the hi-res stuff that needs to be done and probably can't be delegated to Poser. So that's most likely what will slow the script down. Might be fun to find some trickery to still make it fast, but I'll leave that for later.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 31 December 2021 at 6:08 PM
primorge posted at 5:29 PM Fri, 31 December 2021 - #4432679
One final observation on your comment about Python speed... I've noticed that PML takes a bit to toggle through all the actors in a source figure, noting deviations in specific actors. Sometimes very slow if you haven't shut down and restarted Poser. This is with default resolution figures. GoZ is pretty fast with SubD figures, so I imagine GoZ is not Python. Probably seems like a doofus observation from a programmer perspective but I'll risk mentioning it.
One can write very fast Python scripts if all the heavy number crunching happens in a compiled library (usually C/C++ or such). For all I know, GoZ could be mostly Python with just a little bit of native code.
Inside Python we can also use Numpy which does number crunching in bulk. It's great for things like adding up large lists of numbers, but it actually slows things down when instead it's fed many small lists of numbers. So the trick then is to organize the computation so that it plays to Numpy's strengths.
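To illustrate the point about playing to Numpy's strengths, here's a small sketch (assumed example, not code from any of the scripts discussed): the same delta computation done as one bulk operation versus many tiny per-vertex Numpy calls.

```python
import numpy as np

# Bulk version: one subtraction over the whole (n, 3) array.
# This is the kind of input Numpy is fast at.
def deltas_bulk(morphed, base):
    return np.asarray(morphed, dtype=float) - np.asarray(base, dtype=float)

# Per-vertex version: the same result, but thousands of Numpy calls on
# tiny 3-element arrays. The per-call overhead dominates, and this is
# typically slower than even plain Python lists.
def deltas_per_vertex(morphed, base):
    return np.array([np.subtract(m, b) for m, b in zip(morphed, base)])
```

Both return identical deltas; only the call pattern differs, and on a 600k-vertex mesh that difference is what separates seconds from minutes.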
-- I'm not mad at you, just Westphalian.
odf posted Fri, 31 December 2021 at 6:24 PM
Finally there's Numba, which can turn regular Python code into machine code *while* the script is running (some call this just-in-time or JIT compilation). Like Numpy, it needs the code to be written in special ways to be effective, but those special ways are much easier to achieve than Numpy's. It's the best thing since sliced bread, but it's not included in Poser, so it can't be relied on for this project.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 31 December 2021 at 6:44 PM
Thanks for the explanations odf. And
Happy New Year
odf posted Fri, 31 December 2021 at 7:35 PM
Thanks for the explanations odf. And
Happy New Year
Happy New Year to you!
-- I'm not mad at you, just Westphalian.
odf posted Sat, 01 January 2022 at 1:19 AM
OTME: OBJ To Morph, Eventually
SAME: Save as morph, eventually
-- I'm not mad at you, just Westphalian.
odf posted Sat, 01 January 2022 at 2:06 AM
Step 1 of revised devious plan is done: I seem to have achieved a command line FBM creation script. Nothing exciting of course, except that it can repair vertex order and tolerate missing body parts and such as long as the grouping is intact.
-- I'm not mad at you, just Westphalian.
adp001 posted Sat, 01 January 2022 at 9:19 AM
Are you sure you want groups for full body morphs? I for sure don't. The reason: sculpting with a brush is impossible if the model is not welded.
A full body morph is nothing else than what my script does in the end. There is a "master dial" in the body that triggers all morphs in the individual parts (that's basically the whole description of an "FBM"). If you first save a figure in the default pose with my script, then use that in Blender to sculpt and let the script load the result, you get a typical FBM.
In the parts where nothing was changed, no morph is created at all. This works reliably.
About the speed of Python: my script doesn't do any great mathematical stunts, but it saves and loads OBJ files via Python (without Poser) and does a few things with the morph data, also without Poser's help beyond the Python API. After the first time (the image files are copied to the project directory if desired, so that the model doesn't look so naked in Blender :) ), saving and loading is very fast on my machines. Even on my oldest laptop with an old i5 dual-core, 8 GB of memory and an SSD it is still reasonably fast. In Poser, clicking the script's "save" button, then switching to Blender and loading the OBJ goes quickly, without a coffee break. And vice versa.
In Python, even more sophisticated things can be surprisingly fast, in my opinion: I made a script some months ago to rotate and move morphs in Poser, in "real time". A cube is loaded which can be moved with the mouse, and the morph follows the movements reliably. I used Numpy for this. The script spends most of its time in callbacks triggered when you move a dial or the cube.
My "trick" when loading OBJ files: react only to "v" lines and abort as soon as something else comes. For morphs you only need the vertices, nothing else. Sure, it can happen that someone creates an OBJ file with a modeler that doesn't output the data sequentially, but that's quite rare nowadays, and slowing down the majority of users because of it seems nonsensical to me. I'm happy to send affected users a version that works for them - which is then unfortunately correspondingly slower.
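The trick described above might look something like this minimal sketch (an illustration, not adp001's actual code; it assumes the modeler writes all vertex lines contiguously, which as noted is the common case):

```python
# Minimal sketch of the "v-lines only" OBJ trick: collect vertex lines
# and abort as soon as the contiguous vertex block ends, skipping
# vt/vn/f and everything else a morph doesn't need.
def read_obj_vertices(lines):
    verts = []
    seen_any = False
    for line in lines:
        if line.startswith("v "):
            _, x, y, z = line.split()[:4]
            verts.append((float(x), float(y), float(z)))
            seen_any = True
        elif seen_any and line.strip() and not line.startswith("#"):
            break  # vertex block is over; ignore the rest of the file
    return verts
```

A file whose vertices are interleaved with other records would need the slower, full parse instead, which is the trade-off adp001 describes.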
adp001 posted Sat, 01 January 2022 at 10:46 AM
"Post Transform":
Change = (Changed data set - Original data set)
Or in poser language:
Morph = (Sculpted figure - Original figure)
Or also:
delta_array = modified_vertex_array - original_vertex_array.
Each morph is a "post transformation" - or have I misunderstood something again?
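Following the formula above, and noting (as adp001 says elsewhere) that no morph is created where nothing changed, a delta set only needs the vertices that actually moved. A hedged sketch of that extraction (an illustration; the exact PMD layout is not reproduced here):

```python
import numpy as np

# Compute deltas and keep only vertices that actually moved -- roughly
# the sparse (vertex number, delta) form a morph delta set stores.
def sparse_deltas(modified, original, eps=1e-8):
    deltas = np.asarray(modified, dtype=float) - np.asarray(original, dtype=float)
    moved = np.where(np.linalg.norm(deltas, axis=1) > eps)[0]
    return [(int(i), deltas[i]) for i in moved]
```

The epsilon threshold also absorbs rounding noise from the round trip through an external modeler.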
primorge posted Sat, 01 January 2022 at 11:45 AM
adp001 posted at 10:46 AM Sat, 1 January 2022 - #4432712
"Post Transform":
Change = (Changed data set - Original data set)
Or in poser language:
Morph = (Sculpted figure - Original figure)
Or also:
delta_array = modified_vertex_array - original_vertex_array.
Each morph is a "post transformation" - or have I misunderstood something again?
Shrug. In Poser parlance, apparently, Transform is a translation, rotation, or scale - as in when you save a pose, the option to save Body transformations. I seem to recall someone mentioning that it also pertains to the position of morph calculation as written into the file, as in how JCMs are written; don't hold me to that one, though. I don't see why your interpretation would be wrong; makes sense in an encompassing kinda way.
odf posted Sat, 01 January 2022 at 5:12 PM
Shrug. In Poser parlance, apparently, Transform is a translation, rotation, or scale - as in when you save a pose, the option to save Body transformations. I seem to recall someone mentioning that it also pertains to the position of morph calculation as written into the file, as in how JCMs are written; don't hold me to that one, though. I don't see why your interpretation would be wrong; makes sense in an encompassing kinda way.
Thanks for the manual excerpt! I'm used to a wild mixture of "official" and community parlance from back when, so it's good to know that this is canon. I'll check out what the conversion does.
-- I'm not mad at you, just Westphalian.
odf posted Sat, 01 January 2022 at 5:42 PM
Are you sure you want groups for full body morphs? I for sure don't. The reason: sculpting with a brush is impossible if the model is not welded.
I'm not quite sure what you're getting at. A welded model can still have groups. Also, I wouldn't make a morph loader that required groups in every case. This is just a stepping stone.
A full body morph is nothing else than what my script does in the end. There is a "master dial" in the body that triggers all morphs in the individual parts (that's basically the whole description of an "FBM"). If you first save a figure in the default pose with my script, then use that in Blender to sculpt and let the script load the result, you get a typical FBM.
Yes thanks, that's what I meant when I said my script made an FBM. What did you think I meant? :wink:
In the parts where nothing was changed, no morph is created at all. This works reliably.
And if an FBM were all I wanted, I would simply use your script or Poser's "Load full body morph..."
My "trick" when loading OBJ files: react only on "v" lines and abort as soon as something else comes. Because for morphs you only need the vertices, nothing else.
That's a good idea! I agree that for the "end product" the fancy options should not make things slow by default.
I don't think I want to discuss Python speed any more until I have a working prototype. Make it work, then make it fast. Happy to talk optimization strategies then.
-- I'm not mad at you, just Westphalian.
odf posted Sat, 01 January 2022 at 6:19 PM
PS: To clarify, the purpose of step 1 was to make sure I know how to make an FBM in the form of a PZ2 and PMD that Poser can load correctly. That's the whole point. The rest was just my undisciplined brain getting excited about cool unrelated stuff I can do. I'd promise not to do it again, but I'd have to break that promise. :grin:
-- I'm not mad at you, just Westphalian.
odf posted Sun, 02 January 2022 at 6:27 PM
Another baby step done, as seen below. I can now take a higher-resolution morph (left: OBJ exported from Poser at subd 2, sculpted in Blender and returned into Poser as a prop) and bake it down into a base-resolution FBM (right: applied to Antonia and shown at subd 2). Obviously the high-frequency details are lost, so the remaining step will be to recover those. This means subdividing the subd-0 mesh with the baked-down morph applied and working out the normal-based deltas between that and the full-detail morph. I've got all the ingredients at this point (except for the on-demand normal computation, which should be easy), so it's just a matter of putting it all together.
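The "normal-based deltas" step can be sketched like so (a hedged illustration of the idea, not the script's actual implementation): project the residual between the baked-down, subdivided mesh and the full-detail target onto each vertex's unit normal, giving one scalar displacement per vertex.

```python
import numpy as np

# Express the still-missing high-frequency detail as a scalar
# displacement along each vertex normal of the subdivided mesh.
def displacements_along_normals(subdivided, normals, target):
    subdivided = np.asarray(subdivided, dtype=float)
    normals = np.asarray(normals, dtype=float)
    residual = np.asarray(target, dtype=float) - subdivided
    # Per-vertex dot product of residual with the (unit) normal.
    return np.einsum("ij,ij->i", residual, normals)

# Reconstruction: push each vertex back out along its normal.
def apply_displacements(subdivided, normals, d):
    d = np.asarray(d, dtype=float)
    return np.asarray(subdivided, dtype=float) + d[:, None] * np.asarray(normals, dtype=float)
```

Storing only the along-normal component is what makes the hi-res detail robust under the baked-down base morph: the displacement rides on whatever surface the lower levels produce.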
-- I'm not mad at you, just Westphalian.
odf posted Sun, 02 January 2022 at 10:55 PM
Okay, getting there. I've got a script that should do what I want but so far Poser isn't quite happy. So I'll be spending some time debugging. In the meantime, here's evidence that I can combine a subd-0 FBM morph with hi-res detail in a second morph to reproduce my original Blender sculpt. All that's missing is doing it all in a single morph.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 03 January 2022 at 12:51 AM
Hmm, I'm going to need a test example that's a bit easier to verify visually. :smile:
Anyway, before someone else says it: yes, I realize that it's trivial to turn two morphs into one by linking them together. It's just a matter of automating things as much as possible to make the process easier.
I quite like the two-step process I used here, though, because Poser/OpenSubdiv took care of pretty much all the computations that would be difficult to do reasonably fast in Python, namely the computation of new vertex positions and normals for the subdivided mesh. So maybe the best course of action would be to streamline it as much as possible, ideally within a script that runs in Poser and does everything automatically.
So here are the steps, more or less:
- From the hi-res OBJ file out of Blender etc, grab the vertex positions that correspond to base (subd-0) mesh vertices and use them to make an FBM (we have various ways to do that, including ADP's script).
- With the FBM applied in Poser, go to subdivision level 2 and get the full set of vertex positions and normals from Poser (I'm assuming that can be done directly via the API, but making Poser write an intermediate file is also acceptable).
- Now use that and the original OBJ to generate the subdivision deltas and write out the PMD and PZ2 for the corresponding subdivision morph (assuming we can't make the subd morph directly in Poser, which would of course be much nicer).
- Now still within Poser, load in that new morph and link it together with the first morph.
- ...
- Profit! :smile:
-- I'm not mad at you, just Westphalian.
odf posted Mon, 03 January 2022 at 9:11 PM
Behold my ingenious, masterfully crafted device by which I shall lift the veil off Poser's darkest subdivision secrets: The Pimple of Truth*.
* It's the one on her nose.**
** It could also be a wart, but that's not important right now.
-- I'm not mad at you, just Westphalian.
odf posted Tue, 04 January 2022 at 2:26 AM
And finally, sweet, sweet success (in that this is the actual Antonia with a pimple morph rather than a fixed high-res prop impersonating her):
The script is not fast yet, nor is it running inside Poser, but it's working. Huzzah!
Next up:
- Bake down to all the intermediate subd levels (currently only the base and highest level are done).
- Test on some more interesting morphs.
- Do a first optimization round, focusing on things that likely still need to be done in the Poser version.
- Convert into a "manual" Poser workflow by implementing the required operations that Poser does not yet support natively as small Poser Python scripts.
- Work towards full "one-click" automation.
- More optimization.
-- I'm not mad at you, just Westphalian.
primorge posted Tue, 04 January 2022 at 6:13 AM
Congrats.
Also
The eyebrow bump is misaligned with the eyebrow diffuse texture.
odf posted Tue, 04 January 2022 at 3:04 PM
primorge posted at 6:13 AM Tue, 4 January 2022 - #4432835
Congrats.
Also
The eyebrow bump is misaligned with the eyebrow diffuse texture.
I know, right? Some weird little issues with this particular version of the head texture. Sorry for not fixing that and making your eyes hurt!
PS: I did fix the sclera bump though which I had set far too high in my default materials.
-- I'm not mad at you, just Westphalian.
odf posted Wed, 05 January 2022 at 1:28 AM
Behold, the days of half-baked subdivision morphs are over, as demonstrated here by my lovely assistant (who assures me she is not even a little bit baked, like definitely less than a third baked).
From left to right: subd-0 unmorphed for reference, morph (at full strength) at subd-0, subd-1, subd-2.
This took a generous 94 seconds on my machine for wrangling roughly 600k vertices. So for my next trick, I'll be savagely hacking away at that number.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 06 January 2022 at 1:10 AM
So, I've cut the execution time for the script in half after the first day of optimizing. Now if I can keep that up for a few more days... :smile:
-- I'm not mad at you, just Westphalian.
odf posted Fri, 07 January 2022 at 1:21 AM
Down to 33 seconds, a minute faster than the first complete version that worked. Optimizing is fun.
-- I'm not mad at you, just Westphalian.
adp001 posted Fri, 07 January 2022 at 4:13 AM
Weekend is coming. If you have a version ready to make a UI for: I'm ready to try my best.
odf posted Fri, 07 January 2022 at 5:18 AM
Thanks, but there's still a lot of work to be done. First, more optimizations, then integration into Poser. Maybe the weekend after, if things go smoothly.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 07 January 2022 at 5:21 PM
So far I've vectorized most of the number crunching stuff with the notable exception of computing the normals for the subdivided mesh. I'll get to that, but right now the biggest chunks of "extra" execution time are the parsing of the OBJ file and the translation of welded vertex numbers to per-body-part vertex numbers for the base-resolution FBM. The latter is of course something I'll get from Poser for free, but I'd like the standalone script to be fast, too, because command line scripts can be great for things like batch processing. For the same reason, I'd like to be able to read some of the face information from the morphed OBJ, as well, without too much of a performance hit. I don't actually need that face information directly, but I use it to identify the "ghost" vertices left over from welding.
Both of these things are "first thing that works" implementations as it stands, so I'd definitely expect there to be room for improvement.
Sorry if all that sounds confusing. Just ignore it if you're not interested in these details, and ask me to elaborate if you are.
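For those who are interested in the details: the use of the face information described above can be sketched in a few lines (an illustration of the idea, not the repository code) - a "ghost" vertex left over from welding is simply one that no face references.

```python
# Identify "ghost" vertices (left over from welding) as those that are
# not referenced by any face of the mesh.
def ghost_vertices(vertex_count, faces):
    used = [False] * vertex_count
    for face in faces:
        for v in face:
            used[v] = True
    return [i for i in range(vertex_count) if not used[i]]
```

Only the face index lists are needed for this, which is why the OBJ parser can still skip texture coordinates, normals, and materials entirely.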
-- I'm not mad at you, just Westphalian.
odf posted Fri, 07 January 2022 at 5:28 PM
PS: I should maybe note that the "can be gotten from Poser for free but need a bit of work stand-alone" things could also be pre-computed and stored in files if necessary. So if they do turn out to be difficult to speed up, I'm prepared to just leave them be.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 07 January 2022 at 6:43 PM
When all is said and done how are you releasing this?
odf posted Fri, 07 January 2022 at 7:18 PM
primorge posted at 6:43 PM Fri, 7 January 2022 - #4432970
*shrugs*When all is said and done how are you releasing this?
Well, first of all github, obviously. The code already lives there, and since I reckon Microsoft purchased the company in order to milk it rather than slaughter it, I don't think it will go away anytime soon. Github lets one make releases, so there'll be a link to an "official" zip file that I could tell folks about if Rendo allowed that. :wink:
I don't know about other outlets. I could put it in FreeStuff if that's not too much hassle. Could also put a copy on the Antonia site, even though obviously it's not Antonia-specific. Any other suggestions?
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 07 January 2022 at 7:39 PM
Well smoke signals perhaps. Or an elaborate handshake. Or you could just burn it to disk and toss it into random suburban backyards.
Just kidding.
Sounds good. Looking forward to experimenting with it. Watching with interest.
odf posted Fri, 07 January 2022 at 11:15 PM
primorge posted at 7:39 PM Fri, 7 January 2022 - #4432973
Sounds good. Looking forward to experimenting with it. Watching with interest.
Not suggesting that you should, but if you're not afraid of command line Python and can't wait for the Poser version, you can grab the code from github at (almost*) any time and just try it out. I haven't included instructions yet, but they're simple and I could add them quickly. :wink:
* I've got a habit of only checking in code that I've tried out and that appears to work. Of course I'm not infallible, hence the "almost."
-- I'm not mad at you, just Westphalian.
odf posted Sat, 08 January 2022 at 1:46 AM
And we're down to 26 seconds at the end of this optimization day. I had some trouble speeding up the OBJ parsing code at first, but then I took ADP's advice seriously and wrote a custom function that ignores all the parts I don't need. It's also quite short, as a by-product of less Python code in a tight loop generally translating to faster execution times.
-- I'm not mad at you, just Westphalian.
odf posted Sun, 09 January 2022 at 12:54 AM
Down to a much more reasonable 15 seconds now. Got the normal computation much faster by heavily Numpy-fying it and finally understood properly how welding and ghost vertices work in Poser, which made the unimesh to actor translation for the base level morph pretty trivial.
The OBJ parsing now takes five seconds and almost all the rest is to do with subdivision. I think I'll give it one more day for optimizing the latter and then start on the Poser integration.
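A vertex-normal computation in the Numpy-fied spirit described above might look like this (a sketch under my own assumptions, not necessarily what the script does): compute all face normals at once with a bulk cross product, then scatter them onto the corner vertices with `np.add.at`.

```python
import numpy as np

# Vectorized area-weighted vertex normals for a triangulated mesh.
def vertex_normals(vertices, triangles):
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(triangles, dtype=int)
    # Face normals for all triangles at once (length encodes area).
    fn = np.cross(v[f[:, 1]] - v[f[:, 0]], v[f[:, 2]] - v[f[:, 0]])
    out = np.zeros_like(v)
    # Accumulate each face normal onto its three corners in bulk;
    # np.add.at handles repeated indices correctly.
    for k in range(3):
        np.add.at(out, f[:, k], fn)
    lengths = np.linalg.norm(out, axis=1)
    lengths[lengths == 0] = 1.0  # avoid dividing by zero on lone vertices
    return out / lengths[:, None]
```

The only Python-level loop runs three times (once per triangle corner), so the cost is dominated by a handful of large array operations regardless of mesh size.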
-- I'm not mad at you, just Westphalian.
odf posted Mon, 10 January 2022 at 12:46 AM
Now 13 seconds, seven times as fast as my original version. Not too shabby if I say so myself! I'm almost ready to move on to the next phase, but there's one more thing I'd like to try (as if there weren't always one more thing to try).
-- I'm not mad at you, just Westphalian.
odf posted Tue, 11 January 2022 at 12:52 AM
Daily update: no new optimizations today. I couldn't get that "one more thing" to work and called it quits. Instead, I tested my code with Python 2.7 (I seem to recall someone requested 2.7 compatibility :wink:) and fixed a few small problems which that unearthed. I also started my exploration of the relevant portions of the Poser API.
Interesting tidbit: if I import my test OBJ (morphed Antonia at subd level 2, with ca. 600k vertices) from the Poser Python shell, it takes 5 seconds, which incidentally is exactly the time my command line script takes to load that OBJ (although my script does not read the UV coordinates, so it's not a fair comparison). When I import from the File menu, though, it takes Poser 12 seconds. Any idea where the extra 7 seconds come from?
-- I'm not mad at you, just Westphalian.
odf posted Wed, 12 January 2022 at 12:51 AM
Daily update: on my way to a first version that runs within Poser and grabs the base geometry from the current figure. Looking good so far, but I'm sure I'll run into the Poser gremlins eventually.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 13 January 2022 at 12:55 AM
Daily update: I appear to have successfully implanted the script into Poser. Grabbing the correct mesh data from the API (as in, matching what Poser would produce in unwelded/welded OBJ exports) proved more tricky than expected, but I got that working literally fifteen minutes ago.
Tests and cleanups to be scheduled for tomorrow. Also I'll see if I can get profiling to work from inside Poser so that I have a better idea of which parts take how much time. The general impression though is that the script is not significantly slower than the command line version.
ETA: Probably time to move to the Poser Python forum soon, since from here on I suspect it'll be mostly about GUIs and APIs.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 12:44 AM
Okay, I think I've got a script that's streamlined enough to unleash onto a courageous few beta testers - a.k.a. whoever still follows this thread.
Get the code as a ZIP file from here or for git users, clone from https://github.com/odf/pydeltamesh.git.
Vague instructions:
A) Exporting the unmorphed mesh:
In Poser, load the figure, set to unimesh and the desired subdivision level, then export as OBJ with "As Morph Target" checked (sorry, no post-transform support at this point).
B) Making the morph:
In your favorite tool, import the OBJ from A), sculpt it to your heart's desire and export as OBJ again. Important: make sure to keep the vertex order and also (for now) the disconnected vertices intact.
C) Loading the morph:
Back in Poser, load the figure with the same settings as when exporting and make sure it's selected, then start the script via "Run Python Script" in the file menu. The path to use for the script is pydeltamesh/poser/loadSubdMorph.py within the unzipped folder.
The script asks for a name to give to the morph (if none given, the OBJ file name is used) and then pops up a file selection box where you can pick the OBJ created in B). After that, my Poser just seems to sit there for a few seconds before even showing a progress spinner, but eventually you should see a text box pop up with some diagnostics, and a dial for the new morph should appear.
-- End of vague instructions --
That's it for now. Undoubtedly there will be many bugs, so please be gentle with me.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 14 January 2022 at 7:18 AM
The unwelded bit is a no go.
Will this work with a prop version of the figure, subdivided, in scene? That is, in order to skip the unwelding bit. Transferring the working fbm from the prop to the figure, or even cleaning up any local empties, is trivial.
primorge posted Fri, 14 January 2022 at 7:22 AM
Pre emptive. Python3?
primorge posted Fri, 14 January 2022 at 7:36 AM
Actually, I have more questions. It'll have to wait til I'm done work this evening.
odf posted Fri, 14 January 2022 at 9:13 AM
primorge posted at 7:18 AM Fri, 14 January 2022 - #4433229
> The unwelded bit is a no go.
> Will this work with a prop version of the figure, subdivided, in scene? That is, in order to skip the unwelding bit. Transferring the working fbm from the prop to the figure, or even cleaning up any local empties, is trivial.
Which unwelded bit are you referring to? There is no part of what I described that requires or would work with an unwelded mesh.
If you are interpreting “disconnected vertices” as unwelded, my apologies. I mean that Poser exports a mesh with some vertices that are not part of any faces, and you need to leave those in. Blender does that with “keep vertex order”; if it is a problem in other software, I can fix that.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 9:22 AM
> Pre emptive. Python3?
I've tested with Python 2 as far as I could, but I can't test with Poser 11 because I don't own that. My hope is that if anything breaks in P11, there will be a useful stack trace.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 14 January 2022 at 9:44 AM
I see, isolated vertices. I'll test in 11 via mudbox when I get home. Thanks for the clarification.
adp001 posted Fri, 14 January 2022 at 11:54 AM
Loaded LaFemme Pro "out of the box" (no morphs). Saved it as an OBJ file with subD 2. (Slowwww.....)
Loaded the object into Blender. Made some changes and exported to another file.
Started your script: loadSubdMorph. Got an error message, because Python 2.7 doesn't know "process_time". Changed that to "time" and restarted. After a long while I got this, with an error at the end:
Loading mesh from \\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\Data\untitled_blender.obj...
Loaded mesh with 387296 vertices and 383752 faces.
Subdividing for baking...
Subdivided 3 times.
Finding deltas for level 0
Found a total of 11565 deltas for 83 actors.
Subdividing morph for level 1...
Subdividing for baking...
Subdivided 2 times.
Computing vertex normals...
\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\makeSubdivisionMorph.py:294: RuntimeWarning: invalid value encountered in divide
return ns / ds
Finding deltas for level 1...
Found 2884 deltas.
Subdividing morph for level 2...
Subdividing for baking...
Subdivided 1 times.
Computing vertex normals...
Finding deltas for level 2...
Found 10077 deltas.
Subdividing morph for level 3...
Computing vertex normals...
Finding deltas for level 3...
Traceback (most recent call last):
File "\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\poser\loadSubdMorph.py", line 66, in <module>
loadSubdMorph(name, path)
File "\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\poser\loadSubdMorph.py", line 27, in loadSubdMorph
targets = makeTargets(name, unwelded, welded, morph, used, verbose=True)
File "\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\makeSubdivisionMorph.py", line 94, in makeTargets
name, weldedMorphed, morph, complexes, verbose
File "\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\makeSubdivisionMorph.py", line 254, in makeSubdTarget
deltas, displacements = findSubdDeltas(verts, morphedVerts, normals)
File "\\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\makeSubdivisionMorph.py", line 269, in findSubdDeltas
diffs = morphedVertices[: len(baseVertices)] - baseVertices
ValueError: operands could not be broadcast together with shapes (387296,3) (1539199,3)
adp001 posted Fri, 14 January 2022 at 12:07 PM
Tried another time to make sure I had anything correct. This time with subD 1: similar output, same error at the end.
odf posted Fri, 14 January 2022 at 1:07 PM
> I see, isolated vertices. I'll test in 11 via mudbox when I get home. Thanks for the clarification.
Isolated! Much better word!
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 1:10 PM
> Tried another time to make sure I had anything correct. This time with subD 1: similar output, same error at the end.
Thanks! Weird error! I'll see if I can reproduce it somehow with what I have.
-- I'm not mad at you, just Westphalian.
adp001 posted Fri, 14 January 2022 at 1:46 PM
ValueError: operands could not be broadcast together with shapes (387296,3) (1539199,3)
Smells like a numpy error :)
I found this hint:
This error occurs when you attempt to perform matrix multiplication using a multiplication sign (*) in Python instead of the numpy.dot() function.
adp001 posted Fri, 14 January 2022 at 1:49 PM
https://stackoverflow.com/questions/24560298/python-numpy-valueerror-operands-could-not-be-broadcast-together-with-shapes
odf posted Fri, 14 January 2022 at 2:54 PM
adp001 posted at 1:46 PM Fri, 14 January 2022 - #4433255
> ValueError: operands could not be broadcast together with shapes (387296,3) (1539199,3)
> Smells like a numpy error :)
> I found this hint:
> This error occurs when you attempt to perform matrix multiplication using a multiplication sign (*) in Python instead of the numpy.dot() function.
It's certainly numpy that's complaining, but the problem is with the inputs, namely that the morphedVertices array does not have enough rows (i.e. vertices). So the actual error would have happened earlier, possibly in a different numpy operation, but it could also be something else.
My hope is that I made a last minute change that I forgot to check for Python 2.7 safety, because that would be easy to reproduce and debug. If it's a P11 versus P12 thing, things will be trickier.
PS: By "weird error" I didn't mean that I was unfamiliar with this kind of error message. I'm very familiar, it's basically numpy's way of saying hello. I meant it's weird specifically that this kind of error happened on that line.
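For anyone following along, the shape mismatch is easy to reproduce in plain numpy, and it comes from elementwise subtraction rather than matrix multiplication. The array names below are stand-ins, not the script's variables:

```python
import numpy as np

base = np.zeros((6, 3))     # stand-in for baseVertices
morphed = np.zeros((4, 3))  # a morphed array with too few rows

# Slicing cannot make the shorter array longer: morphed[:6] is still
# (4, 3), so the elementwise subtraction fails with exactly the quoted
# "operands could not be broadcast together" ValueError.
try:
    diffs = morphed[: len(base)] - base
    broadcast_failed = False
except ValueError as e:
    broadcast_failed = True
    print(e)
```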
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 3:07 PM
Oops, I just noticed: you used LaFemme at subD 2 but my script subdivided the subD 0 mesh it grabbed from Poser three times. That hints at either a mismatch between the two meshes or some edge case in the subdivision algorithm that I haven't covered. If it's something about LaFemme's mesh (and assuming the Pro version, which I don't have, doesn't use a modified mesh), I should see the same problem appear in P12.
I'll have a go at this after breakfast.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 4:28 PM
So, I've tried it with LaFemme. Works in Poser 12, works in Python 2.7 on the command line. But I'm getting different numbers for the mesh:
Loaded mesh with 386835 vertices and 383504 faces.
Whereas you had:
Loaded mesh with 387296 vertices and 383752 faces.
So I'm still not sure if that's a P11 versus P12 thing, or LaFemme versus LaFemme Pro. It might be interesting to try a different figure. And it's probably useful if I add some more diagnostics and make the script bail out right away if the face counts don't match.
-- I'm not mad at you, just Westphalian.
adp001 posted Fri, 14 January 2022 at 4:30 PM
With a loaded LaFemme:
fig=sc.CurrentFigure()
[ac.Name() for ac in fig.Actors() if not ac.IsBodyPart()]
[u'CenterOfMass', u'GoalCenterOfMass', u'rThumb0Con', u'rIndex0Con', u'rMid0Con', u'rPinky0Con', u'rRing0Con', u'lThumb0Con', u'lIndex0Con', u'lMid0Con', u'lRing0Con', u'lPinky0Con', u'UpperBrowCon', u'RightOuterBrowCon', u'LeftOuterBrowCon', u'MidBrowCon', u'RightInnerBrowCon', u'LeftInnerBrowCon', u'RightUpperEyeLidCon', u'LeftUpperEyeLidCon', u'RightLowerEyeLidCon', u'RightUpperCheekCon', u'LeftUpperCheekCon', u'RightLowerCheekCon', u'LeftLowerCheekCon', u'NoseCon', u'RightNostrilCon', u'LeftNostrilCon', u'UpperLipCon', u'RightUpperLipCon', u'RightLipCon', u'LeftUpperLipCon', u'LeftLipCon', u'RightEarCon', u'RightEarLobeCon', u'LeftEarLobeCon', u'LeftEarCon', u'LeftLowerEyeLidCon', u'RightBreastCon', u'LeftBreastCon', u'RightGluteCon', u'LeftGluteCon', u'LowerLipCon 1', u'RightLowerLipCon 1', u'LeftLowerLipCon 1', u'JawCon', u'RightLabiaCon', u'LeftLabiaCon', u'MonsPubisCon']
All these actors are not body parts, but they will be exported and will interfere a lot when sculpting in Blender.
adp001 posted Fri, 14 January 2022 at 4:32 PM
odf posted at 4:28 PM Fri, 14 January 2022 - #4433264
> So, I've tried it with LaFemme. Works in Poser 12, works in Python 2.7 on the command line. But I'm getting different numbers for the mesh:
> Loaded mesh with 386835 vertices and 383504 faces.
> Whereas you had:
> Loaded mesh with 387296 vertices and 383752 faces.
> So I'm still not sure if that's a P11 versus P12 thing, or LaFemme versus LaFemme Pro. It might be interesting to try a different figure. And it's probably useful if I add some more diagnostics and make the script bail out right away if the face counts don't match.
LaFemme Pro shouldn't make a difference.
I tried with Bella too. Same errors.
adp001 posted Fri, 14 January 2022 at 4:39 PM
I definitely set SubD 2 (also for Bella), and your script says something about "Subdividing morph for level 3".
odf posted Fri, 14 January 2022 at 4:58 PM
> With a loaded LaFemme:
> fig=sc.CurrentFigure()
> [ac.Name() for ac in fig.Actors() if not ac.IsBodyPart()]
> [u'CenterOfMass', u'GoalCenterOfMass', u'rThumb0Con', u'rIndex0Con', u'rMid0Con', u'rPinky0Con', u'rRing0Con', u'lThumb0Con', u'lIndex0Con', u'lMid0Con', u'lRing0Con', u'lPinky0Con', u'UpperBrowCon', u'RightOuterBrowCon', u'LeftOuterBrowCon', u'MidBrowCon', u'RightInnerBrowCon', u'LeftInnerBrowCon', u'RightUpperEyeLidCon', u'LeftUpperEyeLidCon', u'RightLowerEyeLidCon', u'RightUpperCheekCon', u'LeftUpperCheekCon', u'RightLowerCheekCon', u'LeftLowerCheekCon', u'NoseCon', u'RightNostrilCon', u'LeftNostrilCon', u'UpperLipCon', u'RightUpperLipCon', u'RightLipCon', u'LeftUpperLipCon', u'LeftLipCon', u'RightEarCon', u'RightEarLobeCon', u'LeftEarLobeCon', u'LeftEarCon', u'LeftLowerEyeLidCon', u'RightBreastCon', u'LeftBreastCon', u'RightGluteCon', u'LeftGluteCon', u'LowerLipCon 1', u'RightLowerLipCon 1', u'LeftLowerLipCon 1', u'JawCon', u'RightLabiaCon', u'LeftLabiaCon', u'MonsPubisCon']
> All these actors are not body parts, but they will be exported and will interfere a lot when sculpting in Blender.
Do any of these have geometry? I tried to export some from P12 and got empty files. Maybe P11 handles them differently.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 5:01 PM
> I definitely set SubD 2 (also for Bella), and your script says something about "Subdividing morph for level 3".
At the moment, the script keeps subdividing the base level mesh as long as it has fewer faces than the mesh that was loaded in from file. So if the latter has extra polys that shouldn't be there, an additional subdivision level will result. That's obviously not a very robust strategy, so I'll change it and throw an exception instead.
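Roughly, that level-detection strategy amounts to something like the following sketch (made-up names; the real script works on meshes, not bare face counts, and the four-fold growth only holds for an all-quad mesh under Catmull-Clark subdivision):

```python
def subdivision_level(base_faces, morph_faces):
    # Each Catmull-Clark step multiplies the face count of an all-quad
    # mesh by four, so the level is recoverable from the counts alone.
    # Raising on a mismatch is the "throw an exception" behaviour
    # mentioned above, instead of silently adding a level.
    level, faces = 0, base_faces
    while faces < morph_faces:
        faces *= 4
        level += 1
    if faces != morph_faces:
        raise ValueError("face count matches no subdivision level")
    return level
```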
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 5:10 PM
If exporting a clean mesh is a problem, I would certainly be open to using a custom exporter. I haven't had that problem in P12 yet, so I'm a bit confused at what is going on with P11 there.
-- I'm not mad at you, just Westphalian.
adp001 posted Fri, 14 January 2022 at 5:17 PM
> Do any of these have geometry? I tried to export some from P12 and got empty files. Maybe P11 handles them differently.
fig=sc.CurrentFigure()
ac=fig.Actor("CenterOfMass")
ac.Geometry().NumVertices()
324
ac.Geometry().NumPolygons()
360
odf posted Fri, 14 January 2022 at 5:18 PM
Hmm, it looks as if my P12 registration number lets me download P11, so I'm doing that now and will see if I can also install and run it. That should be much easier for testing.
-- I'm not mad at you, just Westphalian.
adp001 posted Fri, 14 January 2022 at 5:37 PM
Hold on - seems I had a problem. Testing...
adp001 posted Fri, 14 January 2022 at 5:45 PM
I'm soooo sorry! All good! Works!
Loading mesh from \\VBOXSVR\Poser\Poser\Python\Rendo\pydeltamesh\Data\untitled_blender.obj...
Loaded mesh with 386835 vertices and 383504 faces.
Subdividing for baking...
Subdivided 2 times.
Finding deltas for level 0
Found a total of 24478 deltas for 83 actors.
Subdividing morph for level 1...
Subdividing for baking...
Subdivided 1 times.
Computing vertex normals...
Finding deltas for level 1...
Found 3193 deltas.
Subdividing morph for level 2...
Computing vertex normals...
Finding deltas for level 2...
Found 9408 deltas.
Morph loaded in 23.4530000687 seconds.
Problem was here. I changed Blender from 2.9 to 3. But had not yet adjusted my preferences.
adp001 posted Fri, 14 January 2022 at 5:51 PM
I'm off to bed. Looks like I caught a cold.
Congrats, by the way! Nice work.
odf posted Fri, 14 January 2022 at 5:53 PM
> Problem was here. I changed Blender from 2.9 to 3. But had not yet adjusted my preferences.
Phew! Such a relief! :thumbsup:
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 6:14 PM
I've now also pushed the change from time.process_time() to time.time() that was necessary to make Python 2.7 happy.
Silly mistake of mine to use process_time(), seeing as I looked it up in the docs, which clearly state that it's a Python 3 feature.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 14 January 2022 at 9:39 PM
Threw an error from 11 to Mudbox. Figure is Nova with 2 levels of subdivision. Might be a Mudbox thing, though; honestly, I don't think I've ever morphed in Mudbox without using the PML exporter. I've been using it so long that my memory is vague on that. I'll have to test that particular case with a regular FBM without the PML exporter. In any case I'll test via Blender 2.92 and report back. Here's the test morph imported into Poser from Mudbox as a prop. Here's the Python error message.
I do have one question, however: you didn't mention whether 'Include Existing Groups in Polygon Groups' was a requirement during the initial export. In this instance all I checked was 'As Morph Target', as you stated in your instructions...
primorge posted Fri, 14 January 2022 at 10:13 PM
Tested an FBM from Mudbox via both unwelded scene export and geometries import. Without PML, the FBMs do not come into Poser in working order. Looks like the groups are borked. Single actor morph targets work fine, however; no wrong-vertex-count error or winding order explosions.
So for Mudbox it's very likely not your script's fault. Looks like PML will stay my method for morphing with MB.
For my usage, bear in mind I have ZBrush; this will be useful for Blender SubD morphs only. Going to test Blender now...
odf posted Fri, 14 January 2022 at 10:23 PM
> Threw an error from 11 to Mudbox. Figure is Nova with 2 levels of subdivision. Might be a Mudbox thing, though; honestly, I don't think I've ever morphed in Mudbox without using the PML exporter. I've been using it so long that my memory is vague on that. I'll have to test that particular case with a regular FBM without the PML exporter. In any case I'll test via Blender 2.92 and report back. Here's the test morph imported into Poser from Mudbox as a prop. Here's the Python error message.
That looks like Mudbox might be removing isolated vertices. If I'm right, the second (larger) number in the error message should be the vertex count of the mesh exported from Poser, and the first (smaller) one should be what comes out of Mudbox. Also, both meshes should have the same number of polygons.
The requirement that the isolated vertices must be preserved is a relic from my original command line script. Within Poser I can easily add code that fixes the numbering for that specific case. Unless you're very eager to get the Mudbox test working, I might defer that until after I've finished the post-transform option (i.e. probably tomorrow).
> I do have one question, however: you didn't mention whether 'Include Existing Groups in Polygon Groups' was a requirement during the initial export. In this instance all I checked was 'As Morph Target', as you stated in your instructions...
No, that's not a requirement, unless you want access to the group info in your modeler. My script doesn't use the groups.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 10:31 PM
> Tested an FBM from Mudbox via both unwelded scene export and geometries import. Without PML, the FBMs do not come into Poser in working order. Looks like the groups are borked. Single actor morph targets work fine, however; no wrong-vertex-count error or winding order explosions.
> So for Mudbox it's very likely not your script's fault. Looks like PML will stay my method for morphing with MB.
> For my usage, bear in mind I have ZBrush; this will be useful for Blender SubD morphs only. Going to test Blender now...
We shouldn't give up on Mudbox that quickly. Unless PML does something inconceivably clever, if it can deal with your MB outputs, my script should be able to, too, possibly after some small adjustments.
-- I'm not mad at you, just Westphalian.
odf posted Fri, 14 January 2022 at 10:37 PM
I love how she still looks unmistakably like Nova in that morph.
-- I'm not mad at you, just Westphalian.
primorge posted Fri, 14 January 2022 at 10:58 PM
Tried same with Blender. Received a different error this time. Error screencap, morph prop screencap...
Yes, she's still recognizable. It's the nose and the lips, I didn't morph them at all for these tests.
Anyway. Not having much luck... not blaming your script either. Could be so many things. I did follow the instructions however. I'll try again later. Sorry I couldn't help.
odf posted Fri, 14 January 2022 at 11:34 PM
> Anyway. Not having much luck... not blaming your script either. Could be so many things. I did follow the instructions however. I'll try again later. Sorry I couldn't help.
Not at all! That's very helpful. Sorry about the frustrating experience so far, but every error message makes the script a little better.
-- I'm not mad at you, just Westphalian.
odf posted Sat, 15 January 2022 at 12:57 AM
End-of-day update: post/pre transform morphs both seem to be working now, and I have two improvements to OBJ loading on my list for tomorrow.
primorge: I think your Blender exports should work if you export with normals and/or UVs, but that would be silly as a restriction, seeing as the script does not use either. Easy to fix, though!
-- I'm not mad at you, just Westphalian.
odf posted Sat, 15 January 2022 at 5:53 PM
So, a new version is up at the same location with the following three changes:
1) There is now an additional dialog that lets one choose between creating a pre- or a post-transform morph.
2) The OBJ file for the morph can now contain "f" lines without slashes, so e.g. Blender exports with neither normals nor UVs should now work. That fixes an oversight of mine that tripped up primorge's Blender test.
3) The input mesh is no longer required to have isolated vertices preserved from the original Poser export. Of course the order of the remaining (non-isolated) vertices still needs to be kept intact. This might fix the problem with the Mudbox export, but without looking at the actual OBJ files that is of course just a guess.
Hopefully this version will prove more successful.
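As an aside on change 2): the vertex index in an OBJ "f" line is always the field before the first slash, so a tolerant parser is short. This is just an illustrative helper, not the script's actual code:

```python
def face_vertex_indices(line):
    # OBJ "f" lines may be "f 1 2 3", "f 1/5 2/6 3/7", "f 1//9 ..." or
    # "f 1/5/9 ..."; the vertex index is the field before the first slash.
    # OBJ indices are 1-based, so convert to 0-based here. (Negative,
    # relative indices are also legal OBJ but ignored for brevity.)
    return [int(field.split("/")[0]) - 1 for field in line.split()[1:]]
```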
-- I'm not mad at you, just Westphalian.
odf posted Sat, 15 January 2022 at 8:17 PM
Also, I might take a break from this project for a bit, apart from bug fixes and urgent feature requests. I'm pretty happy with how it turned out so far, and it's probably best to let real usage experience drive further developments.
Maybe I'll spend a few days with just sculpting and/or clothes making practice...
-- I'm not mad at you, just Westphalian.
primorge posted Sat, 15 January 2022 at 10:01 PM
odf posted at 5:53 PM Sat, 15 January 2022 - #4433335
> So, a new version is up at the same location with the following three changes:
> 1) There is now an additional dialog that lets one choose between creating a pre- or a post-transform morph.
> 2) The OBJ file for the morph can now contain "f" lines without slashes, so e.g. Blender exports with neither normals nor UVs should now work. That fixes an oversight of mine that tripped up primorge's Blender test.
> 3) The input mesh is no longer required to have isolated vertices preserved from the original Poser export. Of course the order of the remaining (non-isolated) vertices still needs to be kept intact. This might fix the problem with the Mudbox export, but without looking at the actual OBJ files that is of course just a guess.
> Hopefully this version will prove more successful.
Sounds great. I've gone down a rabbit hole with painting eye maps, so I haven't had time to test further, but I have off tomorrow, so I'll take a break from texturing and give the new version a spin; the additional tweaks sound great... crossing fingers. If the Mudbox export throws an error, I'll save the example OBJ for you to look at via PM link.
Thanks for all your hard work on this.
odf posted Sat, 15 January 2022 at 11:12 PM
> If the Mudbox export throws an error, I'll save the example OBJ for you to look at via PM link.
That would be great. The original Poser export too, please.
> Thanks for all your hard work on this.
Gladly! It's what I do for fun in these parts. :smile:
-- I'm not mad at you, just Westphalian.
odf posted Sun, 16 January 2022 at 5:47 PM
Micro-update: I just checked in a change that makes the logging messages show up right away instead of holding them back until the script has finished. I might add even more messages, just so that it's more apparent that the script is indeed still working.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 5:47 PM
Tried again, latest ver., with MB... got an invalid syntax error.
odf posted Mon, 17 January 2022 at 7:15 PM
primorge posted at 5:47 PM Mon, 17 January 2022 - #4433421
> Tried again, latest ver., with MB... got an invalid syntax error.
Argh! I forgot that print is not a function in Python 2. Sorry! If you feel adventurous, you can just change that line to "log = None" for now (don't change the line indentation, though).
The fixVertexOrder script will have the same problem. I'll have an updated download for both up shortly.
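For anyone fixing this locally, the standard remedy for the Python 2 print pitfall is the __future__ import, which makes print a function in both interpreters. This is a sketch of the general technique, not the script's actual code:

```python
from __future__ import print_function

# With the import above, print(...) parses and behaves identically in
# Python 2.7 and Python 3, so a logging helper like this hypothetical
# one works under both Poser 11 and Poser 12.
def log(message):
    print(message)
```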
ETA: The fixed version is now up.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 7:52 PM
Cool. Thanks odf... needn't have rushed; I'm in the middle of setting up a pose for later use :D... I'll save and test ASAP.
Thanks again for the timely support... starting to feel like a nag lol.
odf posted Mon, 17 January 2022 at 8:17 PM
> Thanks again for the timely support... starting to feel like a nag lol.
It's not like my efforts at learning MD are particularly time-critical. My response times will increase significantly when I'm back at my day job.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 8:28 PM
Bad news. Got this error from both MB and Blender exports...
primorge posted Mon, 17 January 2022 at 8:31 PM
primorge posted Mon, 17 January 2022 at 8:35 PM
I'm curious if anyone else besides me, ADP, and yourself has tested this... With over a thousand views surely someone else can report something, no?
odf posted Mon, 17 January 2022 at 8:48 PM
> Bad news. Got this error from both MB and Blender exports...
Well, at least this was one I could not have caught without actually running P11. Try again.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 9:54 PM
Holy Sh*t it effing works! Successful SubD morph generated from MudBox. Dude, that is awesome!
Happy dance.
Congrats ODF. Seriously :)
primorge posted Mon, 17 January 2022 at 10:07 PM
...there is a warning message in there but the morph loaded fine. I'm not seeing anything wrong whatsoever.
primorge posted Mon, 17 January 2022 at 10:11 PM
...and
a successful injection resulting from injection export.
odf posted Mon, 17 January 2022 at 10:19 PM
primorge posted at 10:07 PM Mon, 17 January 2022 - #4433442
> ...there is a warning message in there but the morph loaded fine. I'm not seeing anything wrong whatsoever.
Yeah, I'm getting the same warning. Haven't been able to figure it out. There seem to be no zero dividers, infinities or not-a-numbers in sight, but still numpy is not happy. Well, maybe one day I will know...
Also, good thing I spent so much time optimizing. Imagine that number at the end multiplied by seven. Yikes! Only 15 seconds for Antonia at subD 2 on my machine, by the way.
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 10:21 PM
primorge posted Mon, 17 January 2022 at 10:26 PM
I had Mudbox and Photoshop open when I ran the script... plus I'm on a laptop, a 1500 dollar laptop, but nonetheless, maybe that's why mine was so much slower. To me, less than a minute is no time at all.
odf posted Mon, 17 January 2022 at 10:35 PM
> Yeah, I'm getting the same warning. Haven't been able to figure it out. There seem to be no zero dividers, infinities or not-a-numbers in sight, but still numpy is not happy. Well, maybe one day I will know...
Actually, scratch that. It was just plain and simple division by zero. It's related to the isolated vertices, which get assigned a zero length normal. Anyway, my "fix" for the warning message wasn't one. I had an incorrect mental model of how numpy works. So the next version will get rid of the warning, but it really doesn't matter for the result.
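For the record, the warning is easy to reproduce and to avoid in plain numpy; a minimal sketch with made-up values (not the script's fix):

```python
import numpy as np

ns = np.array([[0.0, 0.0, 2.0], [0.0, 0.0, 0.0]])  # second "normal" is zero
ds = np.linalg.norm(ns, axis=1)[:, None]           # lengths: 2.0 and 0.0

# Plain division hits 0/0 on the zero-length row, emits the
# "invalid value encountered" RuntimeWarning, and produces NaNs:
with np.errstate(invalid="ignore"):
    unsafe = ns / ds

# Dividing only where the length is nonzero leaves a zero vector instead:
safe = np.divide(ns, ds, out=np.zeros_like(ns), where=ds > 0)
```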
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 10:38 PM
odf posted Mon, 17 January 2022 at 10:53 PM
> Now the question I would ask next is why the hell, in all these years since SubD morphs have been a feature in Poser, has no one come up with this solution until now? Bit peculiar I'd say...
I mean, it took me several weeks of full-time work, and there probably aren't that many people who'd do a project like this for funsies in their spare time. Obviously, it would have been much easier and quicker to implement as a built-in Poser feature, but even then Bondware would have had to pay someone to do it. Maybe with GoZ and the Morph Tool available, they didn't think it was a big priority.
-- I'm not mad at you, just Westphalian.
odf posted Mon, 17 January 2022 at 10:59 PM
> Welcome to the Poser Dev Team lol
You know, I wouldn't mind that if they paid well and gave me a ten year contract so I'd be settled until retirement. LOL
-- I'm not mad at you, just Westphalian.
primorge posted Mon, 17 January 2022 at 11:11 PM
Was tempted to say something about monetary considerations and DS's locked down implementation of SubD morphs but probably that would be pushing the envelope of propriety a bit. Regardless, it's really something that you've opened up HD Morphs for everyone in such a way... I would have a slightly unnerved feeling creeping in the background if I were in your position, especially since the silence is roaring, but I'm teetering on the edge of tin foil hat donning ;)
Congratulations again, and sincere thanks from this Poser user.
odf posted Tue, 18 January 2022 at 12:24 AM
Ha, I'm expecting to find three P4 males in badly draped black suits randomly getting out of a blocky limousine in front of my building any day now. :grin:
-- I'm not mad at you, just Westphalian.
adp001 posted Tue, 18 January 2022 at 8:51 AM
@primorge: You would have to strain the protection of IP quite a bit if you wanted to accuse odf's script of anything.
The script is not based on reverse engineering. It was enough to know that Poser's algorithm is based on OpenSubdiv (like most software that uses subD), and that both its source code and detailed descriptions of it are freely available on the net.
The fact that nobody has done such a script before has three main reasons: first, hardly anybody in the Poser universe has a clue how subdivision works; second, given the other existing tools, it did not seem worth the effort; third, reconstructing the HD morphs takes more than just knowing how to program in Python - namely exactly what odf's field of expertise is, and for that one needs an innate talent besides solid math knowledge.
For me, there's also the fact that I don't use HD morphs in Poser. Where details are required, bump/normal and displacement maps are completely sufficient for me. This is probably also because I need to be able to animate my figures.
primorge posted Tue, 18 January 2022 at 1:27 PM
> @primorge: You would have to strain the protection of IP quite a bit if you wanted to accuse odf's script of anything.
I was thinking more along the lines of less talented shady characters appropriating his script for personal financial gain... given odf's track record he seems to circumvent this by making everything open source with credit. So... moot.
> For me, there's also the fact that I don't use HD morphs in Poser. Where details are required, bump/normal and displacement maps are completely sufficient for me. This is probably also because I need to be able to animate my figures.
I agree. Let me clarify, though, from a personal workflow perspective: if I'm going to be doing a final showpiece render, I'll often morph and pose the figures as close as possible to my final vision and weld everything into static props. Clothes, hair, figures. I'll then do a higher resolution beauty-pass sculpt of those props externally for the final output. I'm good at asymmetrical sculpting from my practice in traditional clay sculpture. It's then just a matter of rendering. Sounds like a lot of work, I know, but it's a method I've gotten used to. Since I prefer to render with Firefly, because of its flexibility and, yes, habit and ingrained workflow, displacement maps are also very useful. There's no denying that displacement is also resource-economical.
The fact that we now have LOD morphs, which, thanks to ODF, can be created in virtually any sculpting software, can be previewed in real time and also mixed in real time via dials to arrive at results that aren't specifically planned (e.g. improv), is just additional flexibility in such a workflow. It's gravy. Besides the mixing and real-time aspects of HD morphs, there's also the cool option to release sets of such morphs via PMD INJ poses very easily. I'll be honest, I'm not a big fan of ZBrush's GUI (which is ironic considering it's one of the first 3D softwares I started with, back in version 2), but I really like some of its plug-ins, and much prefer to sculpt in Mudbox or Blender because of the more "conventional" interface. Never thought I'd say that about Blender ;)...
Anyway, just wanted to give a glimpse into my perspective on this in light of your comments, for clarification.
odf posted Tue, 18 January 2022 at 7:51 PM
The fact that we now have HD morphs which, thanks to odf, can be created in virtually any sculpting software, previewed in real time, and also mixed in real time via dials to arrive at results that aren't specifically planned (e.g. improv) is just additional flexibility in such a workflow. It's gravy.
Exactly how I see it.
I thought maybe I'd summarize the method I ended up with for making the morph, just in case someone's interested.
1) Starting with preparations: in addition to loading the hi-res OBJ file with the morph, I grab two "base" meshes from Poser, both at subD 0, an unwelded one with all the relevant actors and a welded one with isolated vertices intact, so that the numbering between the two meshes matches. Then I do some renumbering on the hi-res mesh to ensure it has the isolated vertices at the same indices as the welded base mesh.
2) Next I bake down the morph to subD 0. Let's say the morphed hi-res mesh is at subD level N, then I subdivide the welded base mesh N times and compare the positions of the subD 0 vertices in the two subD N meshes I now have. These differences provide my level 0 deltas.
3) The next step is just a regular FBM construction using the deltas from step 2. I get the actor information from the unwelded base mesh. That's not optimal, but since I started with a command-line script that gets its actor information from an actual mesh loaded from an OBJ file, it was the most straightforward approach.
(... and my lunch just arrived, so I'll leave step 4 - the actual subd part of the process - as a cliffhanger.)
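The baking-down comparison in step 2 can be sketched in a few lines of numpy. This is just my illustration, not odf's actual script: it assumes, as Catmull-Clark implementations commonly arrange it, that subdivision appends new vertices after the originals, so the first `n_base` vertices of a subdivided mesh correspond to the base-level vertices. The function name `bake_level0_deltas` is hypothetical.

```python
import numpy as np

def bake_level0_deltas(base_subd_verts, morphed_subd_verts, n_base, tol=1e-8):
    """Compare the positions of the base-level vertices in the subdivided
    base mesh and the subdivided morphed mesh; return sparse deltas as a
    {vertex_index: (dx, dy, dz)} dict, skipping near-zero entries."""
    diff = (np.asarray(morphed_subd_verts, float)[:n_base]
            - np.asarray(base_subd_verts, float)[:n_base])
    deltas = {}
    for i, d in enumerate(diff):
        if np.linalg.norm(d) > tol:
            deltas[i] = tuple(d)
    return deltas

# Toy example: three base vertices plus one subdivision vertex;
# only base vertex 1 was moved by the sculpt.
base = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [9, 9, 9]]
morphed = [[0, 0, 0], [1, 0, 0.5], [0, 1, 0], [9, 9, 9]]
d0 = bake_level0_deltas(base, morphed, n_base=3)  # {1: (0.0, 0.0, 0.5)}
```

Storing only the nonzero entries matches how morph deltas are stored in Poser files, where most vertices are typically unaffected.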
-- I'm not mad at you, just Westphalian.
odf posted Tue, 18 January 2022 at 8:45 PM
Okay, I have achieved nourishment. Onward.
4) In order to get higher level deltas, I first apply my subD 0 deltas to the welded base mesh and subdivide it once. That gives me a partially morphed subD 1 mesh. I repeat the baking down process from step 2 with that new mesh as my base mesh to obtain "raw" level 1 deltas. From these I construct proper, but sadly only approximate level 1 deltas by taking only the displacement along the surface normal into account. Applying the "corrected" deltas to my partially morphed subD 1 mesh gives me a fully morphed subD 1 mesh, and I repeat the process to construct level 2 deltas and so forth until I've reached level N.
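The normal-projection trick in step 4 ("taking only the displacement along the surface normal into account") can be sketched like this. Again a hedged illustration of mine, with area-weighted vertex normals computed from a triangle list; a real pipeline would triangulate the quads first, and the function names are not from the actual script.

```python
import numpy as np

def vertex_normals(verts, tris):
    """Area-weighted vertex normals: accumulate each triangle's cross
    product (length proportional to its area) at its corners, then normalize."""
    verts = np.asarray(verts, float)
    normals = np.zeros_like(verts)
    for a, b, c in tris:
        n = np.cross(verts[b] - verts[a], verts[c] - verts[a])
        normals[a] += n
        normals[b] += n
        normals[c] += n
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.where(lengths > 0, lengths, 1.0)

def project_deltas(raw_deltas, normals):
    """Keep only the component of each raw delta along the vertex normal."""
    raw = np.asarray(raw_deltas, float)
    dots = np.einsum('ij,ij->i', raw, normals)
    return dots[:, None] * normals

# Flat unit square in the XY plane: every vertex normal is (0, 0, 1),
# so projection discards the tangential (x, y) part of each delta.
verts = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]]
tris = [(0, 1, 2), (0, 2, 3)]
ns = vertex_normals(verts, tris)
corrected = project_deltas([[0.1, 0.2, 0.3]] * 4, ns)  # rows of (0, 0, 0.3)
```

This also makes visible why the result is only approximate: any sculpted detail that slides a vertex across the surface, rather than pushing it in or out, is lost in the projection.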
-- I'm not mad at you, just Westphalian.
odf posted Tue, 18 January 2022 at 9:03 PM
Some notes:
- Maybe it's good to be aware that the subD 0 deltas live in the body part actors, whereas all higher level deltas live in the BODY actor. According to some highly unscientific experiments I did, higher level deltas in body parts as well as subD 0 deltas in the BODY seem to be ignored. This makes sense to me since subdivision as such requires a unimesh figure, but who knows if there isn't some secret trick to stuff a, say, subD 1 morph in the head actor.
- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations.
- While writing down the steps, I remembered that there is a different, possibly better "baking down" method that I had meant to try, but completely forgot about. I don't want to get into too much detail about it, but the general idea is that one can reverse the subdivision process to produce a lower-res mesh from a higher-res one. I have an old Scala implementation of that which I'd used to "downsample" Antonia morphs, but haven't gotten around to porting it to Python yet.
-- I'm not mad at you, just Westphalian.
primorge posted Wed, 19 January 2022 at 8:44 PM
"- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations."
I'll do some close-scrutiny comparisons and tell you if I notice any significant differences between the sculpt and the morph output. I'll PM you if I see anything unusual. Forum interactions give me generally negative vibes.
odf posted Wed, 19 January 2022 at 9:48 PM
"- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations."
I'll do some close-scrutiny comparisons and tell you if I notice any significant differences between the sculpt and the morph output. I'll PM you if I see anything unusual. Forum interactions give me generally negative vibes.
That would be very helpful. I can't promise to fix whichever issues you may find, but knowing what can go wrong will certainly help with the motivation to look deeper into this. My feeling is that things like finer skin folds will be among the features most likely to be a little off. Or any fine details that don't displace along the surface normal, really.
-- I'm not mad at you, just Westphalian.
primorge posted Wed, 19 January 2022 at 11:22 PM
"That would be very helpful. I can't promise to fix whichever issues you may find, but knowing what can go wrong will certainly help with the motivation to look deeper into this. My feeling is that things like finer skin folds will be among the features most likely to be a little off. Or any fine details that don't displace along the surface normal, really."
I see. You did have me concerned for a second. I don't foresee ever using Poser subdivision beyond 3 levels, so I probably won't encounter such minuscule differences. I think it's best, or most practical, live in Poser for "gross" anatomical details such as bony protrusions and musculature details, large veins perhaps, etc. Everything else seems like the practical realm of various height maps. Glad you clarified that. Your MD efforts look good btw...
odf posted Thu, 03 February 2022 at 12:59 AM
As an aside, even though it's not advertised as such, the script can also load regular FBMs from .obj files. I just had a case where Poser's "Load FBM from file" produced an exploding mesh. Probably caused by isolated vertices missing or some such, but then my script loaded the morph perfectly.
-- I'm not mad at you, just Westphalian.