Forum Moderators: Staff
Poser Technical F.A.Q (Last Updated: 2024 Nov 13 12:50 am)
Welcome to the Poser Technical Forum.
Where computer nerds can pull out their slide rules and not get laughed at. Pocket protectors are not required. ;-)
This is the place you come to ask questions and share new ideas about using the internal file structure of Poser to push the program past its normal limits.
New users are encouraged to read the FAQ sections here and on the Poser forum before asking questions.
PS: The only reason for splitting the script into an in-Poser UI and an external command line part would be if "someone" were really keen on starting on the UI right away. Of course I could not hold that imaginary person back, but I'd have to point out again that there's still a pretty high probability that the whole project will flop. And at any rate, the end goal would then be to bring the external code "home" into Poser and get rid of any temporary files with data we can easily recreate via the Poser API.
-- I'm not mad at you, just Westphalian.
Maybe I should sleep more after all. I don't know what I read there that sounded so complicated :)
Probably the way I wrote it.
OK, so no problem at all. A little UI with wxPython is something I can do on a boring weekend in between. No reason to prepare anything in advance. Just give me a holler when it's time.
-- I'm not mad at you, just Westphalian.
[...] It also seems to export a subdivided figure mesh without body part grouping (although I might try some more export options to see if I can't coerce it into including the body part names).
Just a quick update for reference: it turns out that the "include existing groups in polygon groups" option *does* produce an OBJ with body part names in the groups, even for a subdivided unimesh figure.
-- I'm not mad at you, just Westphalian.
Another interesting tidbit I just noticed, slightly annoying but no show-stopper: when I export a welded figure mesh from P12 at base resolution, the dead vertices left over from welding are removed. But when I export at subdivision level 1, they are preserved. So the numbering of the base level vertices in both meshes doesn't quite match up.
-- I'm not mad at you, just Westphalian.
odf posted at 4:06 PM Fri, 31 December 2021 - #4432671
primorge posted at 3:50 PM Fri, 31 December 2021 - #4432665
Slotho Exchango Loader lol.
That would be an extremely ironic name. Since it is to be written in pure Python and handles large meshes, it will be painfully slow. Presto Morph Express
Sorry, going back to work now.
Post transform is like subtracting out the default to leave only the difference in the final result, just automated. I think ADP's script does this, as do PML and GoZ. It's also built into Poser, but in a confusingly worded or implemented way.
Back in the day I would do it through multiple morph dials. For instance, say I want to fix something inside a figure's mouth. It's easier to create the fix morph if the mouth is open. The problem is that the open-mouth position will obviously be baked into the export, thus becoming the default state. After morphing over this new default state, any pre-existing morphs (mouth open in this example) get added on top of the final imported target, causing telescoping. Not pretty.
So you would think: OK, I'll just dial the mouth open to 0 and apply the target. But the mouth pops open again after the target is applied. So instead I dial the original mouth-open to -1 and export the result as a morph target (or spawn it in the scene if you prefer; I like to do these things via export, since I'm less likely to botch something). Apply the new morph, and there's your difference as a working dial. The same can be done with rotations, baking as a morph and subtracting... voila, JCMs. In theory, anyway; I didn't really start messing with JCMs until post transform became an automation rather than a tedious dial process. Overlaying difference morphs, all the time.
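For what it's worth, the whole dial dance above boils down to a single subtraction per vertex. A toy numpy sketch (all names and values are made up for illustration, not from any actual script):

```python
import numpy as np

def difference_morph(sculpted, exported_state):
    """Per-vertex deltas for a 'post transform' style fix morph.

    sculpted:       (N, 3) vertex positions after sculpting
    exported_state: (N, 3) vertex positions the figure had at export
                    time (e.g. with the mouth-open morph dialed in)

    Subtracting the exported state removes everything that was baked
    into the export, leaving only the fix itself.
    """
    return sculpted - exported_state

# Toy figure with 3 vertices and a pre-existing mouth-open morph.
base = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
mouth_open = np.array([[0.0, 0.1, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
exported = base + mouth_open       # what the OBJ export contains
sculpted = exported.copy()
sculpted[1] += [0.0, 0.0, 0.05]    # the actual fix, made on vertex 1

deltas = difference_morph(sculpted, exported)
# Only vertex 1 carries a delta; the mouth-open offset cancels out.
```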
One final observation on your comment about Python speed... I've noticed that PML takes a bit to toggle through all the actors in a source figure, noting deviations in specific actors. Sometimes very slow if you haven't shut down and restarted Poser. This is with default resolution figures. GoZ is pretty fast with SubD figures, so I imagine GoZ is not Python. Probably seems like a doofus observation from a programmer perspective but I'll risk mentioning it.
Yeah, pretty sure ADP's script supports post-transform, so I can study his code and/or bother him with questions. I definitely want to be able to do hi-res JCMs.
It should only matter for the base resolution part of the morph, though. As mentioned at length, the higher level details are expressed relative to the surface directions, so I'd expect them to be completely agnostic to the pre/post transform thing.
By the way, I've got a slightly revised devious plan: since there's a fair deal of busy work involved in the base resolution part of the morph (which as it seems I can't skip, at least not at this stage), and that part is indistinguishable from a regular FBM, I think I'll first write a command line script that converts an OBJ into an FBM, then integrate that into Poser with a reasonably friendly GUI, and finally add on the hi-res stuff.
Also by the way, some of the slow bits (such as parsing and composing OBJ files in Python) should go away when the script runs under Poser. Other slow bits will still be slow, but some of them (like the vertex order repair function) can probably be made optional. There's some number crunching remaining for the hi-res stuff that needs to be done and probably can't be delegated to Poser. So that's most likely what will slow the script down. Might be fun to find some trickery to still make it fast, but I'll leave that for later.
-- I'm not mad at you, just Westphalian.
primorge posted at 5:29 PM Fri, 31 December 2021 - #4432679
One final observation on your comment about Python speed... I've noticed that PML takes a bit to toggle through all the actors in a source figure, noting deviations in specific actors. Sometimes very slow if you haven't shut down and restarted Poser. This is with default resolution figures. GoZ is pretty fast with SubD figures, so I imagine GoZ is not Python. Probably seems like a doofus observation from a programmer perspective but I'll risk mentioning it.
One can write very fast Python scripts if all the heavy number crunching happens in a compiled library (usually C/C++ or such). For all I know, GoZ could be mostly Python with just a little bit of native code.
Inside Python we can also use Numpy, which does number crunching in bulk. It's great for things like adding up large lists of numbers, but it actually slows things down when it's fed many small lists instead. The trick, then, is to organize the computation so that it plays to Numpy's strengths.
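A tiny illustration of what "playing to Numpy's strengths" means: the same sum computed as thousands of small calls versus one bulk call (array sizes are arbitrary, just for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((10000, 3))     # e.g. ten thousand vertex positions

# Slow pattern: thousands of tiny Numpy calls, one per 3-vector.
slow = sum(float(np.sum(row)) for row in data)

# Fast pattern: one bulk call over the whole array.
fast = float(data.sum())
# Same result, but the bulk version is far faster, because the
# per-call overhead is paid once instead of ten thousand times.
```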
-- I'm not mad at you, just Westphalian.
Finally, there's Numba, which can turn regular Python code into machine code *while* the script is running (some call this just-in-time or JIT compilation). Like Numpy, it needs the code to be written in special ways to be effective, but those special ways are much easier to achieve than Numpy's. It's the best thing since sliced bread, but it's not included in Poser, so it can't be relied on for this project.
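For the curious, a minimal sketch of the Numba pattern, with a fallback decorator so the same code still runs (just slowly) where Numba isn't available, such as inside Poser. The function and data here are invented purely for illustration:

```python
import numpy as np

try:
    from numba import njit        # JIT-compiles the function on first call
except ImportError:               # e.g. inside Poser: run as plain Python
    def njit(func):
        return func

@njit
def max_edge_length_sq(verts, edges):
    """Longest squared edge length; a plain loop Numba handles well."""
    best = 0.0
    for k in range(edges.shape[0]):
        i, j = edges[k, 0], edges[k, 1]
        dx = verts[i, 0] - verts[j, 0]
        dy = verts[i, 1] - verts[j, 1]
        dz = verts[i, 2] - verts[j, 2]
        d = dx * dx + dy * dy + dz * dz
        if d > best:
            best = d
    return best

verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
edges = np.array([[0, 1], [0, 2], [1, 2]])
print(max_edge_length_sq(verts, edges))  # 5.0
```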
-- I'm not mad at you, just Westphalian.
Are you sure you want groups for full body morphs? I for sure don't.
The reason: sculpting with a brush is impossible if the model is not welded.
A full body morph is nothing other than what my script produces in the end. There is a "master dial" in the body that triggers all morphs in the individual parts (that's basically the whole description of an "FBM"). If you first save a figure in the default pose with my script, then sculpt on that in Blender and let the script load the result, you get a typical FBM.
In the parts where nothing was changed, no morph is created at all. This works reliably.
About the speed of Python: my script doesn't do any great mathematical stunts, but it saves and loads OBJ files via Python (without Poser) and does a few things with the morph data, also without Poser's help beyond the Python API. After the first time (the image files are copied to the project directory if desired, so that the model doesn't look so naked in Blender :) ), saving and loading is very fast on my machines. On my oldest laptop with an old i5 dual-core, 8 GB of memory and an SSD it is still reasonably fast. In Poser, clicking the script's "save" button, then switching to Blender and loading the OBJ goes quickly, without a coffee break. And vice versa.
In Python, even more sophisticated things can be surprisingly fast, in my opinion: some months ago I made a script to rotate and move morphs in Poser, in "real time". A cube is loaded which can be moved with the mouse, and the morph follows the movements reliably. I used Numpy for this. The script spends most of its time in callbacks triggered when you move a dial or the cube.
My "trick" when loading OBJ files: react only to "v" lines and abort as soon as something else comes, because for morphs you only need the vertices, nothing else. Sure, it can happen that someone creates an OBJ file with a modeler that doesn't output the data sequentially, but that is quite rare (nowadays). Slowing down the majority of users because of that seems nonsensical to me. I'm happy to send affected users a version that works for them - which is then unfortunately correspondingly slower.
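A sketch of that trick (not the actual script, which may differ in details):

```python
def read_obj_vertices(lines):
    """Collect 'v x y z' lines; stop at the first non-vertex line
    after the vertex block.  Deliberately assumes the modeler wrote
    the data sequentially, which nearly all of them do.
    """
    verts = []
    for line in lines:
        if line.startswith('v '):
            x, y, z = line.split()[1:4]
            verts.append((float(x), float(y), float(z)))
        elif verts:               # vertex block over: vt, vn, f, ...
            break                 # -> ignore the rest of the file
    return verts

obj = """# exported from some modeler
o figure
v 0.0 0.0 0.0
v 1.0 0.5 0.0
v 0.0 1.0 1.0
vt 0.0 0.0
f 1/1 2/2 3/3
"""
print(read_obj_vertices(obj.splitlines()))
```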
adp001 posted at 10:46 AM Sat, 1 January 2022 - #4432712
Shrug. In Poser parlance apparently Transform is a translation, rotation, or scale. As in when you save a pose the option to save Body transformations. I seem to recall someone mentioning that it also pertains to the position of morph calculation as written into the file, as in how JCMs are written, don't hold me on that one though. I don't see why your interpretation would be wrong, makes sense in an encompassing kinda way.
"Post Transform":
    Change = (Changed data set - Original data set)
Or in Poser language:
    Morph = (Sculpted figure - Original figure)
Or also:
    delta_array == modified_vertex_array - original_vertex_array
Each morph is a "post transformation" - or have I misunderstood something again?
Thanks for the manual excerpt! I'm used to a wild mixture of "official" and community parlance from back when, so it's good to know that this is canon. I'll check out what the conversion does.
-- I'm not mad at you, just Westphalian.
Are you sure you want groups for full body morphs? I for sure don't.
The reason: sculpting with a brush is impossible if the model is not welded.
I'm not quite sure what you're getting at. A welded model can still have groups. Also, I wouldn't make a morph loader that required groups in every case. This is just a stepping stone.
A full body morph is nothing other than what my script produces in the end. There is a "master dial" in the body that triggers all morphs in the individual parts (that's basically the whole description of an "FBM"). If you first save a figure in the default pose with my script, then sculpt on that in Blender and let the script load the result, you get a typical FBM.
Yes thanks, that's what I meant when I said my script made an FBM. What did you think I meant?
In the parts where nothing was changed, no morph is created at all. This works reliably.
And if an FBM was all I wanted, I would simply use your script or Poser's "Load full body morph..."
My "trick" when loading OBJ files: react only on "v" lines and abort as soon as something else comes. Because for morphs you only need the vertices, nothing else.
That's a good idea! I agree that for the "end product" the fancy options should not make things slow by default.
I don't think I want to discuss Python speed any more until I have a working prototype. Make it work, then make it fast. Happy to talk optimization strategies then.
-- I'm not mad at you, just Westphalian.
PS: To clarify, the purpose of step 1 was to make sure I know how to make an FBM in the form of a PZ2 and PMD that Poser can load correctly. That's the whole point. The rest was just my undisciplined brain getting excited about cool unrelated stuff I can do. I'd promise not to do it again, but I'd have to break that promise.
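For reference, here's a sketch of what the ASCII side of such an FBM can look like when written from Python. This is an illustration from memory and the function name is mine; a real PZ2 needs the surrounding version and figure blocks (and possibly more channel parameters), and a PMD-based variant stores the deltas in the binary file instead:

```python
def pz2_actor_block(actor, morph, deltas, vertex_count):
    """One actor's targetGeom channel as PZ2-style ASCII text.

    deltas: {vertex_index: (dx, dy, dz)} -- sparse on purpose,
    unchanged vertices are simply left out.
    """
    lines = [
        'actor %s' % actor,
        '\t{',
        '\tchannels',
        '\t\t{',
        '\t\ttargetGeom %s' % morph,
        '\t\t\t{',
        '\t\t\tname %s' % morph,
        '\t\t\tinitValue 0',
        '\t\t\tmin -100000',
        '\t\t\tmax 100000',
        '\t\t\tindexes %d' % len(deltas),
        '\t\t\tnumbDeltas %d' % vertex_count,
        '\t\t\tdeltas',
        '\t\t\t\t{',
    ]
    for i in sorted(deltas):
        dx, dy, dz = deltas[i]
        lines.append('\t\t\t\td %d %.6f %.6f %.6f' % (i, dx, dy, dz))
    lines += ['\t\t\t\t}', '\t\t\t}', '\t\t}', '\t}']
    return '\n'.join(lines)

block = pz2_actor_block('hip', 'MyFix', {5: (0.0, 0.001, 0.0)}, 1000)
print(block)
```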
-- I'm not mad at you, just Westphalian.
Content Advisory! This message contains nudity
Another baby step done, as seen below. I can now take a higher resolution morph (left, OBJ exported from Poser at subd 2, sculpted in Blender and returned to Poser as a prop) and bake it down into a base-resolution FBM (right, applied to Antonia and shown at subd 2). Obviously the high-frequency details are lost, so the remaining step will be to recover those. This means subdividing the subd-0 mesh with the baked-down morph applied and working out the normal-based deltas between that and the full-detail morph. I've got all the ingredients at this point (except for the on-demand normal computation, which should be easy), so it's just a matter of putting it all together.
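The bake-down step itself is a slice-and-subtract, assuming (as Poser's exports seem to bear out) that the first N vertices of the subdivided mesh correspond one-to-one to the N base vertices. A toy sketch with made-up data:

```python
import numpy as np

def bake_down(hires_morphed, base_original, base_count):
    """Base-resolution morph deltas from a sculpted subd mesh.

    hires_morphed: (M, 3) vertices of the sculpted, subdivided mesh
    base_original: (N, 3) vertices of the unmorphed base mesh
    base_count:    N; assumes the first N subd vertices are the base
                   vertices, in the same order.
    """
    return hires_morphed[:base_count] - base_original

base = np.zeros((4, 3))            # toy base mesh, 4 vertices
hires = np.zeros((10, 3))          # toy subd mesh: 4 base + 6 new verts
hires[2] = [0.0, 0.02, 0.0]        # a sculpted bump on base vertex 2
deltas = bake_down(hires, base, 4)
```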
-- I'm not mad at you, just Westphalian.
Content Advisory! This message contains nudity
Okay, getting there. I've got a script that should do what I want but so far Poser isn't quite happy. So I'll be spending some time debugging. In the meantime, here's evidence that I can combine a subd-0 FBM morph with hi-res detail in a second morph to reproduce my original Blender sculpt. All that's missing is doing it all in a single morph.
-- I'm not mad at you, just Westphalian.
Hmm, I'm going to need a test example that's a bit easier to verify visually.
Anyway, before someone else says it: yes, I realize that it's trivial to turn two morphs into one by linking them together. It's just a matter of automating things as much as possible to make the process easier.
I quite like the two-step process I used here, though, because Poser/OpenSubdiv took care of pretty much all the computations that would be difficult to do reasonably fast in Python, namely the computation of new vertex positions and normals for the subdivided mesh. So maybe the best course of action would be to streamline it as much as possible, ideally within a script that runs in Poser and does everything automatically.
So here are the steps, more or less:
- From the hi-res OBJ file out of Blender etc, grab the vertex positions that correspond to base (subd-0) mesh vertices and use them to make an FBM (we have various ways to do that, including ADP's script).
- With the FBM applied in Poser, go to subdivision level 2 and get the full set of vertex positions and normals from Poser (I'm assuming that can be done directly via the API, but making Poser write an intermediate file is also acceptable).
- Now use that and the original OBJ to generate the subdivision deltas and write out the PMD and PZ2 for the corresponding subdivision morph (assuming we can't make the subd morph directly in Poser, which would of course be much nicer).
- Now still within Poser, load in that new morph and link it together with the first morph.
- ...
- Profit!
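The delta-generation step in the middle is the mathematically interesting one. Simplified to just the normal direction (a full version would use a complete surface frame, and all names here are mine), it's roughly:

```python
import numpy as np

def subd_offsets_along_normals(sculpted, subdivided, normals):
    """Scalar per-vertex offsets of the sculpt along the vertex normals.

    sculpted:   (M, 3) target positions from the hi-res OBJ
    subdivided: (M, 3) positions Poser yields for the baked-down FBM
                at the same subd level
    normals:    (M, 3) unit vertex normals of that subdivided mesh
    """
    residual = sculpted - subdivided
    return np.einsum('ij,ij->i', residual, normals)  # row-wise dot product

# Toy data: two vertices, both with +Y normals.
normals = np.array([[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
subdivided = np.zeros((2, 3))
sculpted = np.array([[0.0, 0.05, 0.0], [0.0, -0.01, 0.0]])
offsets = subd_offsets_along_normals(sculpted, subdivided, normals)
```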
-- I'm not mad at you, just Westphalian.
And finally, sweet, sweet success (in that this is the actual Antonia with a pimple morph rather than a fixed high-res prop impersonating her):
The script is not fast yet, nor is it running inside Poser, but it's werking. Huzzah!
Next up:
- Bake down to all the intermediate subd levels (currently only the base and highest level are done).
- Test on some more interesting morphs.
- Do a first optimization round, focusing on things that likely still need to be done in the Poser version.
- Convert into a "manual" Poser workflow by implementing the required operations that Poser does not yet support natively as small Poser Python scripts.
- Work towards full "one-click" automation.
- More optimization.
-- I'm not mad at you, just Westphalian.
primorge posted at 6:13 AM Tue, 4 January 2022 - #4432835
Congrats.
I know, right? Some weird little issues with this particular version of the head texture. Sorry for not fixing that and making your eyes hurt!
Also
The eyebrow bump is misaligned with the eyebrow diffuse texture.
PS: I did fix the sclera bump though which I had set far too high in my default materials.
-- I'm not mad at you, just Westphalian.
Content Advisory! This message contains nudity
Behold, the days of half-baked subdivision morphs are over, as demonstrated here by my lovely assistant (who assures me she is not even a little bit baked, like definitely less than a third baked).
From left to right: subd-0 unmorphed for reference, morph (at full strength) at subd-0, subd-1, subd-2.
This took a generous 94 seconds on my machine for wrangling roughly 600k vertices. So for my next trick, I'll be savagely hacking away at that number.
-- I'm not mad at you, just Westphalian.
So far I've vectorized most of the number crunching stuff with the notable exception of computing the normals for the subdivided mesh. I'll get to that, but right now the biggest chunks of "extra" execution time are the parsing of the OBJ file and the translation of welded vertex numbers to per-body-part vertex numbers for the base-resolution FBM. The latter is of course something I'll get from Poser for free, but I'd like the standalone script to be fast, too, because command line scripts can be great for things like batch processing. For the same reason, I'd like to be able to read some of the face information from the morphed OBJ, as well, without too much of a performance hit. I don't actually need that face information directly, but I use it to identify the "ghost" vertices left over from welding.
Both of these things are "first thing that works" implementations as it stands, so I'd definitely expect there to be room for improvement.
Sorry if all that sounds confusing. Just ignore it if you're not interested in these details, and ask me to elaborate if you are.
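In case anyone is curious, the ghost-vertex hunt is simple set arithmetic once the faces are known: any vertex that no face references is dead. A sketch (0-based indices; OBJ files are 1-based, so subtract one when reading):

```python
import numpy as np

def ghost_vertices(vertex_count, faces):
    """Indices of 'dead' vertices that no face references."""
    used = np.zeros(vertex_count, dtype=bool)
    for face in faces:
        used[list(face)] = True    # mark every referenced vertex
    return np.flatnonzero(~used)   # the unmarked ones are ghosts

# 6 vertices, but the two quads only reference five of them:
faces = [(0, 1, 2, 3), (1, 4, 2, 3)]
print(ghost_vertices(6, faces))  # [5]
```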
-- I'm not mad at you, just Westphalian.
PS: I should maybe note that the "can be gotten from Poser for free but need a bit of work stand-alone" things could also be pre-computed and stored in files if necessary. So if they do turn out to be difficult to speed up, I'm prepared to just leave them be.
-- I'm not mad at you, just Westphalian.
primorge posted at 6:43 PM Fri, 7 January 2022 - #4432970
*shrugs*When all is said and done how are you releasing this?
Well, first of all GitHub, obviously. The code already lives there, and since I reckon Microsoft purchased the company in order to milk it rather than slaughter it, I don't think it will go away anytime soon. GitHub lets one make releases, so there'll be a link to an "official" zip file that I could tell folks about if Rendo allowed that.
I don't know about other outlets. I could put it in FreeStuff if that's not too much hassle. Could also put a copy on the Antonia site, even though obviously it's not Antonia-specific. Any other suggestions?
-- I'm not mad at you, just Westphalian.
primorge posted at 7:39 PM Fri, 7 January 2022 - #4432973
Not suggesting that you should, but if you're not afraid of command line Python and can't wait for the Poser version, you can grab the code from github at (almost*) any time and just try it out. I haven't included instructions yet, but they're simple and I could add them quickly.
Sounds good. Looking forward to experimenting with it. Watching with interest.
* I've got a habit of only checking in code that I've tried out and that appears to work. Of course I'm not infallible, hence the "almost."
-- I'm not mad at you, just Westphalian.
And we're down to 26 seconds at the end of this optimization day. I had some trouble speeding up the OBJ parsing code at first, but then I took ADP's advice seriously and wrote a custom function that ignores all the parts I don't need. It's also quite short, as a by-product of less Python code in a tight loop generally translating to faster execution times.
-- I'm not mad at you, just Westphalian.
Down to a much more reasonable 15 seconds now. Got the normal computation much faster by heavily Numpy-fying it and finally understood properly how welding and ghost vertices work in Poser, which made the unimesh to actor translation for the base level morph pretty trivial.
The OBJ parsing now takes five seconds and almost all the rest is to do with subdivision. I think I'll give it one more day for optimizing the latter and then start on the Poser integration.
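For the record, "Numpy-fying" the normal computation mostly means accumulating face normals with scatter-adds instead of a Python loop over individual faces. A triangle-mesh sketch of the pattern (the actual script also has to handle quads and may differ):

```python
import numpy as np

def vertex_normals(verts, tris):
    """Area-weighted vertex normals for a triangle mesh, without a
    per-face Python loop."""
    v0, v1, v2 = verts[tris[:, 0]], verts[tris[:, 1]], verts[tris[:, 2]]
    face_n = np.cross(v1 - v0, v2 - v0)   # length ~ twice the face area
    normals = np.zeros_like(verts)
    for k in range(3):                    # three scatter-adds in total
        np.add.at(normals, tris[:, k], face_n)
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.maximum(lengths, 1e-20)

# Two triangles spanning a unit square in the XY plane: all normals +Z.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
tris = np.array([[0, 1, 2], [1, 3, 2]])
normals = vertex_normals(verts, tris)
```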
-- I'm not mad at you, just Westphalian.
Daily update: no new optimizations today. I couldn't get that "one more thing" to work and called it quits. Instead, I tested my code with Python 2.7 (I seem to recall someone requested 2.7 compatibility) and fixed a few small problems that this unearthed. I also started my exploration of the relevant portions of the Poser API.
Interesting tidbit: if I import my test OBJ (morphed Antonia at subd level 2 with ca. 600k vertices) from the Poser Python shell, it takes 5 seconds, which incidentally is exactly the time my command line script takes to load that OBJ (although my script does not read the UV coordinates, so it's not a fair comparison). When I import from the File menu, though, it takes Poser 12 seconds. Any idea where the extra 7 seconds come from?
-- I'm not mad at you, just Westphalian.
I don't think it has to be complicated to use in the end. I just personally find it easier to hammer down the functionality before I worry about UI/UX.
Yeah, part of the reason I keep at it is because there are interesting little challenges in the programming that I can learn from, and I can probably reuse parts of the code in other projects. Being able to use not-ZBrush would be nice, but if it turns out to be too hard, well, I guess I'll bite the bullet and re-learn ZBrush.
Oh, I think there's a misunderstanding here. Eventually, I'd like to run everything within Poser, not via an external script. Then ideally the user would just select a figure, pick the desired subdivision level and load the OBJ for the morph. All other necessary information can either be obtained directly from Poser or computed by my algorithms. Since we can't set subd deltas directly in Poser (as far as I'm aware), we'll have to then export a PMD, but as you mentioned above, we should even be able to then load that into Poser, so the morph would be available right away after the script had finished. And I agree, it's probably better to just have a dedicated script just for hi-res morphs, nothing else.
Like I said, all of these should become unnecessary. The whole thing should eventually just work like "Load full body morph...".
-- I'm not mad at you, just Westphalian.