I think it should be a combination of a base morph (SubD-0) and a displacement/normal map. The morph takes care of the vertex shifts, mainly in X/Y, and the displacement/normal map takes care of height and depth. The intensity of the displacement map can be coupled to the morph. The result should be similar to a SubD morph.
From what I've seen, that seems to be a fairly common strategy in other 3D software. In Blender I think I know how to save the base morph into an OBJ and bake the higher levels down into a displacement map. So the missing part would be wiring this up in Poser. It could potentially be automated via Poser Python.
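Just to make the "coupled to the morph" idea concrete, here's a rough PoserPython sketch. The dial name, the maximum displacement value, and the assumption that the "Displacement" input lives on the first node of each shader tree are all placeholders, not a tested recipe:

```python
# Rough sketch: drive each material's displacement strength from a morph dial.
# "MyDetailMorph", MAX_DISPLACEMENT and the node-0-is-PoserSurface assumption
# are placeholders -- adjust for the actual figure and shaders.
import poser

scene = poser.Scene()
figure = scene.CurrentFigure()
body = figure.ParentActor()                  # the BODY actor carries the full-body dials
dial = body.Parameter("MyDetailMorph")       # hypothetical morph dial name

MAX_DISPLACEMENT = 0.005                     # displacement (in Poser units) at dial value 1.0

for mat in figure.Materials():
    tree = mat.ShaderTree()
    root = tree.Node(0)                      # assumed to be the PoserSurface root node
    for inp in root.Inputs():
        if inp.InternalName() == "Displacement":
            inp.SetFloat(MAX_DISPLACEMENT * dial.Value())
    tree.UpdatePreview()

scene.DrawAll()
```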
-- I'm not mad at you, just Westphalian.
"Interesting! I would have guessed that the weight maps would be the hardest part to get to match. Shows how much I know."
The weight maps usually just need a bit of resmoothing. If you go from a high-res figure or body part to a low-res one, most often no adjustment is necessary at all.
*
Hmm, one crazy idea:
Would it be possible to transfer a morph made on a high-res object over to a subdivided low-res object?
Like:
I make a morph with the morphbrush on standard resolution V4. (Which is basically a "frozen" subdivision of V4 17K.)
I have done so in the past, and the Morphbrush can easily handle standard V4 or V3.
*
Then one would have to transfer that (high-res) morph over to a subdivided V4 17K. (Which basically is the same mesh, just subdivided once.)
Wouldn't that make the creation part of the morph much easier?
I know it's a "detour", but I think the "on the fly" subdivision, especially if it's more than once, is what's slowing the Morphbrush down, not the absolute polygon count of the figure itself. (Within reasonable limits, of course)
Just thinking out loud, so please bear with me. :-)
Would it be possible to transfer a morph made on a high-res object over to a subdivided low-res object?
You mean inside Poser? That would be very neat. If you find a way to reliably do that, I can throw my script in the recycle bin.
I know it's a "detour", but I think the "on the fly" subdivision, especially if it's more than once, is what's slowing the Morphbrush down, not the absolute polygon count of the figure itself. (Within reasonable limits, of course)
I always thought it was just the pure polygon count. If the morph brush is faster on plain meshes than subdivided ones, that would be useful to know.
I'd still prefer to use an external tool for bigger morph jobs, but even then, if I could load the morph the normal way onto a higher res mesh and transfer to the lower res one as a subdivision morph, that would potentially be easier than having to go through the script.
-- I'm not mad at you, just Westphalian.
"I always thought it was just the pure polygon count. If the morph brush is faster on plain meshes than subdivided ones, that would be useful to know."
I did a little test with my V4 "Zombie" morph.
V4 has 76181 polygons.
Morphing her "as is" with the Morphbrush is not a big problem.
But at SubD 1, everything turns very slow.
I then exported her at SubD 1 and imported her back as a welded prop, which now had 264096 polygons.
Morphing that prop wasn't as effortless as morphing the unsubdivided V4 figure, of course, but it was still noticeably easier than morphing the subdivided V4. Much less lag while brushing, less jerkiness, better smoothing.
Sorry, I can't help with the implementation. I don't speak "code". Just some practical observations here. :-)
But Poser already CAN transfer morphs between similarly shaped figures that have wildly different mesh topology, and it can export a subdivided mesh as a prop. (You can, for example, transfer M2's muscle morphs over to M3, and if you use the M3toM2 figure, which features the M2 shape [but not the actual mesh], the transfer is almost perfect.)
So it seems there is the capability to "freeze" the subdivision inside Poser.
And then the "high res" morph might somehow be able to be transferred from the subdivided prop to the subdivided figure it was originally derived from.
If one could really manage to transfer a hi-res morph from a prop to a subdivided figure, it means that unnecessary body parts could be left out.
In my example, I would only export the torso as a prop to work on the ribcage, freeing even more computer power for operating the morphbrush.
And then the "high res" morph might somehow be able to be transferred from the subdivided prop to the subdivided figure it was originally derived from.
That's the crucial part. I know that Poser can transfer morphs between meshes/figures. If that also worked from a regular morph onto a subdivision morph, that would be perfect (assuming the quality is reliably good). I haven't heard of anyone doing that successfully, but that doesn't mean much.
JoePublic posted at 5:41 PM Sun, 26 December 2021 - #4432447
BTW, don't want to hijack the thread so I'll crawl back into my little hole in the wall.
Just thought there might be a way to circumvent the problem. ;-)
No worries! The delta mangling necessary for subd morphs is messy enough to make circumvention an increasingly attractive option.
Seriously, there's a reason this is in Poser Technical, not Poser Python. I just want a way to get subdivision morphs into Poser, I don't really care how.
-- I'm not mad at you, just Westphalian.
You probably both already know that...
If you import a figure's mesh as a grouped prop, create and load your morphs on that, and have an externally subdivided version in scene... you can export the base resolution version with the morph applied, externally subdivide it, and apply the morph to the higher res version prop in scene without winding order error. Not sure what use it would be other than transferring morphs between LOD versions of a figure, but just noting.
JoePublic, are you saying that you are exporting Poser-subdivided props and not generating a 0 kb OBJ? I've only ever, I think accidentally, exported from a subdivided figure, and it produced an empty OBJ file. Never tried a prop; I just assumed Poser's subdivision was strictly "in scene". Interesting.
Ok. Just ran a test
Loaded La Femme. Unimesh, 1 level of Subdivision. Exported neck as OBJ. Resulting OBJ 0 kb.
Loaded Antonia 1.2. Switched to Unimesh Skinning, 1 level Subdivision. Exported neck as OBJ. Resulting OBJ 0 kb.
Are you saying you can export OBJ body parts, from unimesh skinning, in Poser 12, and return a valid OBJ?
Ok, bit of a non-control there: I was overwriting the same file with each export, and only a cursory hover over the file showed 0 kb. Opening the files up in a text editor shows...
Actual results: exporting a single body part produces a 0 kb OBJ. Makes sense considering "unimesh".
Exporting full figure grouped, 1 level, working subdivided file.
No groups, same result.
Anyway. Good to know. Full figure subdivided works. Who knows how long I would have just assumed it was only in scene.
I just made a test with David 3 (Unimesh rigging):
Poser 11 Pro can export subdivided figures.
Poser 11 Pro can export subdivided props.
Poser 11 Pro cannot export subdivided body parts.
(But if you spawn a prop from a body part and subdivide it then, it can be exported like any other subdivided prop.)
Hope this helps. :-)
JoePublic posted at 5:47 PM Mon, 27 December 2021 - #4432497
👍
Probably not relevant, but an interesting new fact to me: if I export La Femme subdivided with "include existing groups" checked and import it into Mudbox, it loads the whole-mesh OBJ intact with the body part groups as selection sets. I can sculpt on the mesh across body parts (it's not unwelded).
However, if I reimport said sculpt full-body via Load Full Body Morph, it returns a wrong number of vertices error.
Inspection of the exported mesh shows the groups were dumped.
If, after sculpting, I select an individual selection set (group: say hip in this instance), use Export from selection in Mudbox, and try to load the result as a morph target on the hip, it returns a wrong number of vertices error. Inspecting the exported OBJ in scene, it appears to have the proper number of verts for a subd level 1 hip. By comparison, I spawn a prop from La Femme's hip, subdivide it once, and compare to the Mudbox export. Identical vert number.
I'm not sure what this indicates.
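For what it's worth, vertex counts can also be double-checked outside of Poser and Mudbox with a few lines of plain Python before blaming either program (file names here are just placeholders):

```python
# Count vertex ("v") and group ("g") records in OBJ files to compare exports.
def obj_stats(path):
    verts, groups = 0, []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):
                verts += 1
            elif line.startswith("g "):
                groups.append(line.split(None, 1)[1].strip())
    return verts, groups

# Placeholder file names: a Mudbox export vs. a prop spawned and subdivided in Poser.
for name in ["hip_from_mudbox.obj", "hip_prop_subd1.obj"]:
    verts, groups = obj_stats(name)
    print(name, "->", verts, "vertices,", len(groups), "group records")
```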
Probably not relevant, but an interesting new fact to me: if I export La Femme subdivided with "include existing groups" checked and import it into Mudbox, it loads the whole-mesh OBJ intact with the body part groups as selection sets. I can sculpt on the mesh across body parts (it's not unwelded).
However, if I reimport said sculpt full-body via Load Full Body Morph, it returns a wrong number of vertices error.
Interesting! I managed to load a subdivided and morphed mesh onto an equally subdivided prop via "Load Morph Target..." and got all excited, only to discover that the higher subdivision levels were completely ignored. Only the base level was changed by the morph.
So I stopped trying at that point, although I imagine there are workflows where this kind of thing could still be useful.
-- I'm not mad at you, just Westphalian.
I am now back to working on the script and think I have a good plan for figuring out the u- and v-directions. It will not be exactly what Poser uses (because I still haven't been gifted the source code), but that should be okay as long as the unmorphed mesh is reasonably smooth after subdividing, which is what one would expect.
I've also experimented with tricking the Poser Morph Tool into baking down the morph to the lower subdivision levels. I opened the morph for editing, checked "Bake down for subdivision", set the radius and strength to the minimum non-zero values, found a vertex and dragged it a tiny bit. The morph tool seems to do the baking down for the full mesh after every individual edit, which might explain its painful slowness, but that is to our advantage here, because after this little stunt the complete morph has been baked down.
I'll experiment with this some more to see if there are easier ways of triggering the baking. But honestly I'm really quite happy that this worked because doing the baking down in the script is looking increasingly impractical.
-- I'm not mad at you, just Westphalian.
Poser uses Pixar's OpenSubdiv. The source, including demo code, is available on GitHub: https://github.com/PixarAnimationStudios/OpenSubdiv
A detailed overview and very thorough documentation can be found here: https://graphics.pixar.com/opensubdiv/docs/intro.html
(By the way, the documentation is not only interesting for programmers - it has useful facts about SubD and what you should pay attention to when using it.)
Could it be that the mysterious data has nothing to do with normals or UV, but somehow describes Bezier curves?
@odf: I think there is helpful information here: https://graphics.pixar.com/opensubdiv/docs/far_overview.html#far-patchtable
adp001 posted at 5:50 AM Tue, 28 December 2021 - #4432513
"@odf: I think there is helpful information here: https://graphics.pixar.com/opensubdiv/docs/far_overview.html#far-patchtable"
Thanks! Yes, that looks relevant. Let's hope that all that weirdness I'm seeing is really just OpenSubdiv storing stuff efficiently and not some proprietary Poser magic.
By the way, I’m getting closer with the deltas. But they still get a bit crumbly where the mesh is dense (for example around the nipples) and texture seams are messed up. Hopefully the Pixar docs can tell me what I’m doing wrong there.
-- I'm not mad at you, just Westphalian.
Some more delta experiments! It's better in motion, but I couldn't be bothered to make an animation. Antonia's back at subd level 1 with a morph that has the delta [0.0, 0.01, 0.0] at every vertex. The left image shows the morph dialed at 0.0, right one at 0.1.
The tile pattern is wider than high. If you look closely, you can see the vertices move along the positive u direction. The interesting stuff happens at the texture seams and also at irregularities in the uv maps. At the seams, we see that one side wins, but it's not obvious how that side is picked. At the vertical seam down the back, it's consistently the right side. At the horizontal seam, it's sometimes the top and sometimes the bottom.
Spots with UV irregularities are also interesting. It seems the texture quads are split into pairs of triangles and the uv coordinates are then interpolated for each triangle individually. As there are two ways to split each quad, there's another opportunity to "disagree" with Poser and mess things up.
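For contrast: a base-level morph with that same uniform delta is trivial to set up via PoserPython, and there the deltas are just plain offsets in the actor's coordinate system rather than anything uv-dependent. A minimal sketch (actor and morph names are placeholders; as far as I can tell there is no equivalent call for writing the subdivision-level deltas, see further down the thread):

```python
# Sketch: create a uniform base-level (subd-0) test morph on one body part.
# Subdivision-level deltas apparently cannot be written this way; this only
# covers the base mesh. Actor and morph names are placeholders.
import poser

scene = poser.Scene()
actor = scene.CurrentFigure().Actor("Chest")   # placeholder body part

MORPH = "UniformTestDelta"
actor.SpawnTarget(MORPH)                       # create an empty morph target

geom = actor.Geometry()
for i in range(geom.NumVertices()):
    actor.SetMorphTargetDelta(MORPH, i, 0.0, 0.01, 0.0)

actor.Parameter(MORPH).SetValue(0.1)           # dial it in
scene.DrawAll()
```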
-- I'm not mad at you, just Westphalian.
Apparently he no longer has an account here but here are links to relevant threads that I could find. Must have been a tumultuous time in the Poserverse...
Breadcrumb trail...
https://www.renderosity.com/forums/threads/2855291/print
https://www.renderosity.com/forums/threads/2855662
There seems to be a theme throughout history where maverick idea-men operating alone are ground 'neath the wheels of uncaring machines, or their ideas appropriated by collective entities and their names becoming vaguely remembered footnotes... in less dramatic language if you prefer... a sort of Mandela effect ;)
"Just switch off the hardness after the subdivision..."
The trouble is, if you switch off the hardness and then want to subdivide again, you get drift again. Also, Poser does not always seem to smooth out curves formed by hard edges correctly. I've noticed that with the irises on low-res Antonia. Using two or three edge loops close together seems to be a more robust solution.
-- I'm not mad at you, just Westphalian.
By the way, those funny texture artifacts in the picture above happened in (Poser's) subdivision. Here's the same tile pattern on the base mesh:
I'd have to double-check the subdivided UVs to see why, but I've noticed recently that Poser sometimes spontaneously "unwelds" random UV vertices. It also uses Catmull-Clark interpolation of UV positions, i.e. shifts around the UVs of the original vertices, which together with the unwelding leads to chaos. So all as usual in Poserland.
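In case anyone wants to check their own exports for that kind of spurious unwelding, here's a quick-and-dirty sketch. It only flags vertices that reference several distinct vt records with (nearly) identical coordinates, so real texture seams are left alone. The tolerance and file name are guesses/placeholders:

```python
# Flag vertices that reference more than one UV index even though the UV
# coordinates are (almost) identical -- candidates for spurious unwelding.
from collections import defaultdict

def unwelded_uvs(path, tol=1e-6):
    uvs, vert_to_uv = [], defaultdict(set)
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "vt":
                uvs.append((float(parts[1]), float(parts[2])))
            elif parts[0] == "f":
                for corner in parts[1:]:
                    fields = corner.split("/")
                    if len(fields) > 1 and fields[1]:
                        vert_to_uv[int(fields[0])].add(int(fields[1]))
    suspects = []
    for v, uv_ids in vert_to_uv.items():
        if len(uv_ids) < 2:
            continue
        coords = [uvs[i - 1] for i in uv_ids]
        du = max(abs(a[0] - b[0]) for a in coords for b in coords)
        dv = max(abs(a[1] - b[1]) for a in coords for b in coords)
        if du < tol and dv < tol:          # distinct vt records, same position
            suspects.append(v)
    return suspects

print(unwelded_uvs("figure_subd1.obj"))    # placeholder file name
```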
-- I'm not mad at you, just Westphalian.
Well...
I'm not going to bash your figure's eye design any more than I already have lol. It was a different age. How could you have predicted?
It's pretty bash-worthy. In my defense, I had been thinking that I would eventually just stick Bagginsbill's eyes into Antonia. But he never said the three magic words I was waiting for: "They are done."
-- I'm not mad at you, just Westphalian.
Here's a question for @adp001 or another Poser Python sorcerer: if I pull this off, it would be nice to eventually run the code from within Poser, possibly as part of the OBJ_ImExport script. The problem is that right now there seems to be no way to directly write subdivision deltas via Poser Python. So I imagine the script would have to write a PMD and associated PZ2 "by hand" and then (ideally) load and "execute" them in order to create the morph in the current session.
So are these things that one can do from within Poser, namely a) create a file with arbitrary content and then b) load that file back into Poser as if the user had opened it from the pose library?
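Something like this two-step is what I have in mind, assuming LoadLibraryPose accepts an arbitrary path and pulls in a referenced PMD, which is exactly what I don't know yet (file content and path below are do-nothing placeholders):

```python
# Step (a): write a pose file "by hand"; step (b): ask Poser to load it as if
# it came from the library. Whether this works for PMD-referencing poses is
# the open question. The pz2 content here is a do-nothing placeholder.
import os, tempfile
import poser

pz2_text = """{
version
        {
        number 12
        }
}"""

path = os.path.join(tempfile.gettempdir(), "subd_morph_test.pz2")
with open(path, "w") as f:
    f.write(pz2_text)

poser.Scene().LoadLibraryPose(path)
```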
-- I'm not mad at you, just Westphalian.
Apologies for the interruption odf, will repost your python query after this post.
I recall that I had rebuilt Antonia's eyes: based off the original meshes but retopo'd and welded/constructed. Eye cover and cornea are all one uncut piece; iris, pupil and sclera are all one uncut piece. The material setups are relatively the same, with the same naming, though I did take a small liberty with the iris/sclera boundary. I had initially wanted to remap the eyes in more of a claw Gen4 cut shell style, but that would make Antonia's old maps useless, so I opted instead to match the cutting and placement of shells to the old UV style: very simple cuts and a project-by-normal unwrap as pre-existing. I just need to do some shuffling with the shells and the eyes will be done. They will morph and subdivide excellently. I figure sometime this weekend I'll upload and post a link here. I'll include properly grouped/r-l named props and OBJ files so you can just load them and parent them to the existing actors. I'm in the middle of some match centers/morph/"superConformer" stuff for Nova, but I have the next couple of days off, so the link will probably come sometime Sunday night, depending on how fiddly testing/matching shell position and scaling is with the UVs...
Here's a question for @adp001 or another Poser Python sorcerer: if I pull this off, it would be nice to eventually run the code from within Poser, possibly as part of the OBJ_ImExport script. The problem is that right now there seems to be no way to directly write subdivision deltas via Poser Python. So I imagine the script would have to write a PMD and associated PZ2 "by hand" and then (ideally) load and "execute" them in order to create the morph in the current session.
So are these things that one can do from within Poser, namely a) create a file with arbitrary content and then b) load that file back into Poser as if the user had opened it from the pose library?
adp001 posted at 11:23 AM Thu, 30 December 2021 - #4432604
"@odf: Can you please describe how to create a hires morph manually with your script? What happens in case of errors? Are there error codes as return values? Does your script accept parameters to control anything? Does your script run under Poser, or does a Python interpreter have to be present on the system?"
Okay, general disclaimer: I was jumping way ahead with my question about Poser integration. I like to have a general idea where the journey is going, but I'm still doing my steps one at a time. It's just good to know that in principle the script could be made "fully automatic."
So currently my script does not run under Poser. It's a simple command line script that at the moment takes two OBJ files, one exported from Poser at the desired subdivision level, the other the output of sculpting that mesh in the external software. It also takes the subdivision level and the name of the morph to produce as options. The output is a pz2 and a pmd file.
To be clear, I can load the morph that this script produces into Poser, but it generally has some more or less bad artifacts, so won't likely be usable.
The next step will be to bake down the morph to base level and (for now) restrict the higher subdivision levels to displacement along the normals as we talked about before. In order to do that, instead of the hi-res Poser output I will instead need two full-figure OBJs exported from Poser at subd-0, one welded and one unwelded. Awkward, but I'll deal with fixing awkward (and with Poser integration) once it looks like the script is producing something useful. Same with error handling, thinking about what other parameters could be useful, speed etc.
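Just to make the shape of the interface concrete, it's roughly the following (the option names and the pmd/pz2 writers here are illustrative placeholders, not the actual script):

```python
# Illustrative skeleton of the command line interface described above.
# Option names and the output writers are placeholders, not the real script.
import argparse

def write_pmd(path, morph_name):
    open(path, "wb").close()   # placeholder: the real script writes binary PMD deltas

def write_pz2(path, morph_name):
    open(path, "w").close()    # placeholder: the real script writes a pose referencing the PMD

def main():
    parser = argparse.ArgumentParser(
        description="Turn a sculpted, subdivided OBJ into a Poser subdivision morph.")
    parser.add_argument("subdivided_obj", help="figure exported from Poser at the target subd level")
    parser.add_argument("sculpted_obj", help="the same mesh after sculpting in external software")
    parser.add_argument("--level", type=int, default=1, help="subdivision level of the inputs")
    parser.add_argument("--name", default="SculptedMorph", help="name of the morph to create")
    args = parser.parse_args()

    # ... delta computation would go here ...

    write_pmd(args.name + ".pmd", args.name)
    write_pz2(args.name + ".pz2", args.name)

if __name__ == "__main__":
    main()
```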
-- I'm not mad at you, just Westphalian.
The next step will be to bake down the morph to base level and (for now) restrict the higher subdivision levels to displacement along the normals as we talked about before. In order to do that, instead of the hi-res Poser output I will instead need two full-figure OBJs exported from Poser at subd-0, one welded and one unwelded.
Just a quick note on why I need three OBJs here. Poser defines the base (subd-0) level of a morph with individual sets of deltas per body part actor, but then for each higher level it's just one set of subd-deltas on the unified body actor. It also seems to export a subdivided figure mesh without body part grouping (although I might try some more export options to see if I can't coerce it into including the body part names). So the unwelded subd-0 mesh is what I want for the individual morph targets on the body parts, whereas the welded, subdivided and morphed mesh is good for making the hi-res deltas. The welded subd-0 mesh is my Rosetta Stone that helps me translate between the two (which I need for baking down and interpolating). It has the grouping that tells me where the body parts go, but it's also a unified mesh that I can match up with the hi-res OBJ.
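To illustrate the role of the welded subd-0 mesh, here's a sketch of the kind of matching it enables (not the actual script; it just pairs up vertices of the two subd-0 exports by rounded coordinates):

```python
# Sketch: map each vertex of the unwelded subd-0 export (with body part groups)
# to its counterpart in the welded subd-0 export by matching rounded coordinates.
# Assumes both OBJs come from the same unmorphed figure.
def read_obj_vertices(path):
    verts, groups, current = [], [], None
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "g":
                current = parts[1]
            elif parts[0] == "v":
                verts.append(tuple(float(x) for x in parts[1:4]))
                groups.append(current)
    return verts, groups

def unwelded_to_welded(unwelded_path, welded_path, digits=6):
    welded_verts, _ = read_obj_vertices(welded_path)
    lookup = {tuple(round(c, digits) for c in v): i
              for i, v in enumerate(welded_verts)}
    unwelded_verts, unwelded_groups = read_obj_vertices(unwelded_path)
    # one (body part, welded vertex index) entry per unwelded vertex
    return [(group, lookup.get(tuple(round(c, digits) for c in v)))
            for v, group in zip(unwelded_verts, unwelded_groups)]
```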
-- I'm not mad at you, just Westphalian.
PS: Instead of exporting the welded subd-0 mesh, I could in principle also use a small Poser Python script that derives the mapping between vertex numbers in the unwelded and welded mesh (there's a function in OBJ_ImExport for that that I could borrow) and writes that out into a file.
But unless I get stuck with my implementation, I'd rather not do that, because having to use two scripts, one inside and one outside of Poser, just makes things even more awkward, I feel.
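For the record, I think that PoserPython route would look roughly like this, though I'm quoting UnimeshInfo() from memory, so treat the exact return shape as an assumption:

```python
# Rough idea only: dump the per-actor vertex index mapping into the unimesh.
# UnimeshInfo() is quoted from memory -- supposedly it returns the welded
# geometry, the list of body part actors, and per-actor index lists into it.
import json
import poser

figure = poser.Scene().CurrentFigure()
unimesh_geom, actors, index_lists = figure.UnimeshInfo()   # return shape assumed

mapping = {actor.InternalName(): list(indices)
           for actor, indices in zip(actors, index_lists)}

with open("vertex_crosslist.json", "w") as f:               # placeholder output path
    json.dump(mapping, f)
```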
-- I'm not mad at you, just Westphalian.
So far, it all sounds quite complicated to use. I'm not sure if the few people who want to do hires morphs in Poser themselves wouldn't rather spend the 10 dollars a month (or 180 for perpetual) for ZBrush Core.
But what the heck: programmers just love it the hard way :)
I wouldn't include the whole thing in another script that doesn't directly fit the theme (hires morphs). Rather I would try to build an interface that hides the quite complex and likely error-prone process. The UI can export the figure from Poser in two or three copies as desired (including the vertex crosslist if necessary) and then feed your script with the filenames and parameters. Then the UI can take the sculpted result and assign it to the correct figure.
Each step can be accompanied with appropriate text or images in the UI.
The user just has to remember the export directory and put the result from Blender there.
The biggest hurdle for the user could be installing the Python interpreter with all the necessary libs...
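A bare-bones sketch of that wrapper idea (wx is available in Poser's Python as far as I know; the external script name, its arguments and the file locations are all placeholders):

```python
# Bare-bones wrapper sketch: ask for the sculpted OBJ, run the external script,
# then load the resulting pose file into the scene. Everything about the
# external script (name, arguments, output location) is a placeholder.
import os
import subprocess
import poser
import wx

dlg = wx.FileDialog(None, "Pick the sculpted OBJ", wildcard="*.obj")
if dlg.ShowModal() == wx.ID_OK:
    sculpted = dlg.GetPath()
    exported = os.path.join(os.path.dirname(sculpted), "figure_subd1.obj")  # assumed export
    morph = "SculptedMorph"

    subprocess.check_call(
        ["python", "make_subd_morph.py", exported, sculpted,
         "--level", "1", "--name", morph])                                  # placeholder CLI

    poser.Scene().LoadLibraryPose(os.path.join(os.path.dirname(sculpted), morph + ".pz2"))
dlg.Destroy()
```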
Interesting! I would have guessed that the weight maps would be the hardest part to get to match. Shows how much I know.
Anyway, a tool for partially subdividing a figure while keeping the UVs matched up properly also sounds like an interesting project. I'm working on some subd code in Python right now. (I've implemented Catmull-Clark style subd in Java, JavaScript, Elm and Scala so far, so Python was long overdue.) In principle, it should be easy to adjust the interpolation at the boundary of the torso so that it matches up with the non-subdivided parts. But usually the devil is in the details, so I wouldn't promise anything.
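For the curious, the core vertex rules are compact enough to show here. This is the generic textbook scheme for a closed, quad-only mesh (no boundaries, creases or UV handling), not Poser's or OpenSubdiv's actual implementation:

```python
# Textbook Catmull-Clark for a closed, quad-only mesh (no boundaries or creases).
# verts: list of (x, y, z) tuples; faces: list of 4-tuples of vertex indices.
from collections import defaultdict

def _avg(points):
    n = float(len(points))
    return tuple(sum(c) / n for c in zip(*points))

def catmull_clark(verts, faces):
    # face points: the centroid of each face
    face_pts = [_avg([verts[i] for i in f]) for f in faces]

    # adjacency: faces per edge, faces and edges per vertex
    edge_faces = defaultdict(list)
    vert_faces = defaultdict(list)
    vert_edges = defaultdict(set)
    for fi, f in enumerate(faces):
        for k in range(4):
            a, b = f[k], f[(k + 1) % 4]
            e = frozenset((a, b))
            edge_faces[e].append(fi)
            vert_edges[a].add(e)
            vert_edges[b].add(e)
        for v in f:
            vert_faces[v].append(fi)

    # edge points: average of the two endpoints and the two adjacent face points
    edge_pts = {e: _avg([verts[v] for v in e] + [face_pts[fi] for fi in fis])
                for e, fis in edge_faces.items()}

    # updated original vertices: (F + 2R + (n - 3) P) / n
    new_verts = []
    for v, p in enumerate(verts):
        n = len(vert_faces[v])
        F = _avg([face_pts[fi] for fi in vert_faces[v]])
        R = _avg([_avg([verts[a] for a in e]) for e in vert_edges[v]])
        new_verts.append(tuple((F[c] + 2 * R[c] + (n - 3) * p[c]) / n for c in range(3)))

    # assemble the refined mesh: one quad per original face corner
    out_verts = list(new_verts) + face_pts
    face_idx = {fi: len(new_verts) + fi for fi in range(len(faces))}
    edge_idx = {}
    for e, pt in edge_pts.items():
        edge_idx[e] = len(out_verts)
        out_verts.append(pt)

    out_faces = []
    for fi, f in enumerate(faces):
        for k in range(4):
            prev, cur, nxt = f[(k - 1) % 4], f[k], f[(k + 1) % 4]
            out_faces.append((cur,
                              edge_idx[frozenset((cur, nxt))],
                              face_idx[fi],
                              edge_idx[frozenset((cur, prev))]))
    return out_verts, out_faces
```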
-- I'm not mad at you, just Westphalian.