primorge posted at 10:07 PM Mon, 17 January 2022 - #4433442
...there is a warning message in there but the morph loaded fine. I'm not seeing anything wrong whatsoever. Also, good thing I spent so much time optimizing. Imagine that number at the end multiplied by seven. Yikes!
Yeah, I'm getting the same warning. Haven't been able to figure it out. There seem to be no zero dividers, infinities or not-a-numbers in sight, but still numpy is not happy. Well, maybe one day I will know... Only 15 seconds for Antonia at subD 2 on my machine, by the way.
-- I'm not mad at you, just Westphalian.
Yeah, I'm getting the same warning. Haven't been able to figure it out. There seem to be no zero dividers, infinities or not-a-numbers in sight, but still numpy is not happy. Well, maybe one day I will know...
Actually, scratch that. It was just plain and simple division by zero. It's related to the isolated vertices, which get assigned a zero-length normal. Anyway, my "fix" for the warning message wasn't really a fix; I had an incorrect mental model of how numpy works. So the next version will get rid of the warning, but it really doesn't matter for the result.
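In case anyone wants to see the failure mode concretely: below is a minimal sketch (my own toy code, not the script's) of how an isolated vertex's zero-length normal trips numpy's divide warning, and how to normalize without it. The function name and the sample data are made up for illustration.

import numpy as np

def normalize_rows(vectors):
    # Normalize each row, leaving zero-length rows (isolated vertices) as zeros.
    lengths = np.linalg.norm(vectors, axis=1, keepdims=True)
    # Divide only where the length is non-zero; all other rows stay at zero,
    # so no divide-by-zero RuntimeWarning is raised.
    return np.divide(vectors, lengths, out=np.zeros_like(vectors), where=lengths > 0)

normals = np.array([[0.0, 0.0, 2.0],    # ordinary vertex normal
                    [0.0, 0.0, 0.0]])   # isolated vertex: zero-length normal
print(normalize_rows(normals))          # second row stays [0, 0, 0], no warning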
-- I'm not mad at you, just Westphalian.
Now the question I would ask next is why the hell, in all these years since SubD morphs have been a feature in Poser, has no one come up with this solution until now? Bit peculiar I'd say...
I mean, it took me several weeks of full-time work, and there probably aren't that many people who'd do a project like this for funsies in their spare time. Obviously, it would have been much easier and quicker to implement as a built-in Poser feature, but then Bondware would have had to pay someone to do it. Maybe with GoZ and the Morph Tool available, they didn't think it was a big priority.
-- I'm not mad at you, just Westphalian.
Was tempted to say something about monetary considerations and DS's locked down implementation of SubD morphs but probably that would be pushing the envelope of propriety a bit. Regardless, it's really something that you've opened up HD Morphs for everyone in such a way... I would have a slightly unnerved feeling creeping in the background if I were in your position, especially since the silence is roaring, but I'm teetering on the edge of tin foil hat donning ;)
Congratulations again, and sincere thanks from this Poser user.
@primorge: You would have to stretch IP protection quite a bit if you wanted to accuse odf's script of anything.
The script is not based on reverse engineering. It was enough to know that Poser's algorithm is based on OpenSubdiv (like most software that uses subD), and that both its source code and detailed descriptions of it are freely available on the net.
The fact that nobody has written such a script before has three main reasons: first, hardly anyone in the Poser universe has a clear idea of how subdivision works; second, with other tools already available, it didn't seem worth the effort; third, reconstructing HD morphs takes more than just knowing how to program in Python - namely exactly what odf's field of expertise is, which requires an innate talent on top of solid math knowledge.
For me, there's also the fact that I don't use HD morphs in Poser. Where details are required, bump/normal and displacement maps are completely sufficient for me. This is probably also because I need to be able to animate my figures.
@primorge: You would have to stretch IP protection quite a bit if you wanted to accuse odf's script of anything.
I was thinking more along the lines of less talented shady characters appropriating his script for personal financial gain... given odf's track record he seems to circumvent this by making everything open source with credit. So... moot.
For me, there's also the fact that I don't use HD morphs in Poser. Where details are required, bump/normal and displacement maps are completely sufficient for me. This is probably also because I need to be able to animate my figures.
I agree. Let me clarify, though... from a personal workflow perspective: if I'm going to be doing a final showpiece render, I'll often morph and pose the figures as close as possible to my final vision and weld everything into static props. Clothes, hair, figures. I'll then do a higher-resolution beauty-pass sculpt of those props externally for the final output. I'm good at asymmetrical sculpting from my practice in traditional clay sculpture. After that it's just a matter of rendering. Sounds like a lot of work, I know, but it's a method I've gotten used to. Since I prefer to render with Firefly, because of its flexibility and, yes, habit and ingrained workflow, displacement maps are also very useful. There's no denying that displacement is economical on resources, too.
The fact that we now have LOD morphs, which, thanks to ODF, can be created in virtually any sculpting software, can be previewed real time and also mixed in real time via dials to arrive at results that aren't specifically planned (eg. Improv) is just additional flexibility in such a workflow. It's gravy. Besides the mixing and real-time aspects of HD morphs, there's also the cool option of releasing sets of such morphs very easily via a pmd INJ pose. I'll be honest, I'm not a big fan of Zbrush's GUI (which is ironic considering it's one of the first 3D programs I started with, back in version 2), but I really like some of its plug-ins, and I much prefer to sculpt in MudBox or Blender because of the more "conventional" interface. Never thought I'd say that about Blender ;)...
Anyway, just wanted to give a glimpse into my perspective on this in light of your comments, for clarification.
The fact that we now have LOD morphs, which, thanks to ODF, can be created in virtually any sculpting software, can be previewed real time and also mixed in real time via dials to arrive at results that aren't specifically planned (eg. Improv) is just additional flexibility in such a workflow. It's gravy.
Exactly how I see it.
I thought maybe I'd summarize the method I ended up with for making the morph, just in case someone's interested.
1) Starting with preparations: in addition to loading the hi-res OBJ file with the morph, I grab two "base" meshes from Poser, both at subD 0, an unwelded one with all the relevant actors and a welded one with isolated vertices intact, so that the numbering between the two meshes matches. Then I do some renumbering on the hi-res mesh to ensure it has the isolated vertices at the same indices as the welded base mesh.
2) Next I bake down the morph to subD 0. Let's say the morphed hi-res mesh is at subD level N; I then subdivide the welded base mesh N times and compare the positions of the subD 0 vertices in the two subD N meshes I now have. These differences provide my level 0 deltas (sketched in code below).
3) The next step is just a regular FBM construction using the deltas from step 2 (also sketched below). I get the actor information from the unwelded base mesh. That's not the optimal way of doing it, but since I started with a command-line script that gets its actor information from an actual mesh loaded from an OBJ file, it was the most straightforward way.
(... and my lunch just arrived, so I'll leave step 4 - the actual subd part of the process - as a cliffhanger.)
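To make step 2 concrete, here's a rough sketch in the same spirit (not the actual script). It leans on two assumptions of mine: a hypothetical subdivide() helper (think of it as a thin wrapper around OpenSubdiv) and the convention that the first len(base_vertices) rows of a refined vertex array are the level-0 cage vertices, which is how I read the description above.

import numpy as np

def bake_level0_deltas(base_vertices, base_faces, morphed_hires_vertices, level):
    # Refine the *unmorphed* welded base mesh up to the sculpt's level N.
    # subdivide() is a hypothetical helper, e.g. wrapping OpenSubdiv.
    ref_vertices = subdivide(base_vertices, base_faces, level)

    n = len(base_vertices)
    # Both meshes are now at level N; compare only the rows that correspond
    # to the original subD 0 vertices. The differences are the level 0 deltas.
    return np.asarray(morphed_hires_vertices)[:n] - np.asarray(ref_vertices)[:n]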
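And a similarly rough sketch of the per-actor split in step 3. Here actor_of_vertex and weld_map are hypothetical stand-ins for whatever the script actually derives from the unwelded base mesh: the actor each unwelded vertex belongs to, and the welded vertex it maps to.

import numpy as np

def split_deltas_per_actor(level0_deltas, actor_of_vertex, weld_map, threshold=1e-8):
    # Returns {actor name: {actor-local vertex index: delta}} with near-zero
    # deltas dropped, which is roughly the shape of a full-body morph.
    per_actor = {}
    next_local = {}                      # running local vertex counter per actor
    for unwelded_index, actor in enumerate(actor_of_vertex):
        local = next_local.get(actor, 0)
        next_local[actor] = local + 1
        delta = np.asarray(level0_deltas)[weld_map[unwelded_index]]
        if np.linalg.norm(delta) > threshold:
            per_actor.setdefault(actor, {})[local] = delta
    return per_actor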
-- I'm not mad at you, just Westphalian.
Okay, I have achieved nourishment. Onward.
4) In order to get higher level deltas, I first apply my subD 0 deltas to the welded base mesh and subdivide it once. That gives me a partially morphed subD 1 mesh. I repeat the baking down process from step 2 with that new mesh as my base mesh to obtain "raw" level 1 deltas. From these I construct proper, but sadly only approximate level 1 deltas by taking only the displacement along the surface normal into account. Applying the "corrected" deltas to my partially morphed subD 1 mesh gives me a fully morphed subD 1 mesh, and I repeat the process to construct level 2 deltas and so forth until I've reached level N.
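The normal-projection part of step 4, as I read "taking only the displacement along the surface normal into account", could look like the sketch below; again my own code and naming, not the script's.

import numpy as np

def project_onto_normals(raw_deltas, vertex_normals):
    # Keep only the component of each raw delta that lies along the vertex
    # normal of the partially morphed mesh at this level.
    normals = np.asarray(vertex_normals, dtype=float)
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    # Guard against zero-length normals on isolated vertices (see above).
    unit = np.divide(normals, lengths, out=np.zeros_like(normals), where=lengths > 0)
    amounts = np.einsum('ij,ij->i', np.asarray(raw_deltas, dtype=float), unit)
    return amounts[:, None] * unit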
-- I'm not mad at you, just Westphalian.
Some notes:
- Maybe it's good to be aware that the subD 0 deltas live in the body part actors, whereas all higher level deltas live in the BODY actor. According to some highly unscientific experiments I did, higher level deltas in body parts as well as subD 0 deltas in the BODY seem to be ignored. This makes sense to me since subdivision as such requires a unimesh figure, but who knows if there isn't some secret trick to stuff, say, a subD 1 morph into the head actor.
- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations.
- While writing down the steps, I remembered that there is a different, possibly better "baking down" method that I had meant to try, but completely forgot about. I don't want to get into too much detail about it, but the general idea is that one can reverse the subdivision process to produce a lower-res mesh from a higher-res one. I have an old Scala implementation of that which I'd used to "downsample" Antonia morphs, but haven't gotten around to porting it to Python yet.
-- I'm not mad at you, just Westphalian.
"- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations."
I'll do some close-scrutiny comparisons and let you know if I notice any significant differences between the sculpt and the morph output; I'll PM you if I see anything unusual. Forum interactions give me generally negative vibes.
"- As mentioned before, the morph I end up with is an approximation. Experience will have to show how good it is at recovering the original sculpt in practical applications. Then again, normal maps and displacement maps are also just approximations."
I'll do some close-scrutiny comparisons and let you know if I notice any significant differences between the sculpt and the morph output; I'll PM you if I see anything unusual. Forum interactions give me generally negative vibes.
That would be very helpful. I can't promise to fix whichever issues you may find, but knowing what can go wrong will certainly help with the motivation to look deeper into this. My feeling is that things like finer skin folds will be among the features most likely to be a little off. Or any fine details that don't displace along the surface normal, really.
-- I'm not mad at you, just Westphalian.
"That would be very helpful. I can't promise to fix whichever issues you may find, but knowing what can go wrong will certainly help with the motivation to look deeper into this. My feeling is that things like finer skin folds will be among the features most likely to be a little off. Or any fine details that don't displace along the surface normal, really."
I see. You did have me concerned for a second. I don't foresee ever using Poser subdivision beyond 3 levels, so I probably won't encounter such minuscule differences. I think it's best, or most practical, live in Poser for "gross" anatomical details such as bony protrusions, musculature details, large veins perhaps, etc. Everything else seems like the practical realm of various height maps. Glad you clarified that. Your MD efforts look good btw...
As an aside, even though it's not advertised as such, the script can also load regular FBMs from .obj files. I just had a case where Poser's "Load FBM from file" produced an exploding mesh - probably caused by missing isolated vertices or some such - but my script loaded the morph perfectly.
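For anyone curious what loading a regular FBM from OBJ files boils down to, here's a bare-bones sketch (again not the script's code): read the vertices, difference the arrays, and refuse to guess when the counts don't match - the kind of mismatch that, for example, dropped isolated vertices would cause, and that tends to produce exploding meshes.

import numpy as np

def read_obj_vertices(path):
    # Collect only the 'v' records of a Wavefront OBJ file.
    vertices = []
    with open(path) as f:
        for line in f:
            if line.startswith('v '):
                vertices.append([float(x) for x in line.split()[1:4]])
    return np.array(vertices)

def fbm_deltas(base_obj_path, morphed_obj_path):
    base = read_obj_vertices(base_obj_path)
    morphed = read_obj_vertices(morphed_obj_path)
    if len(base) != len(morphed):
        raise ValueError("vertex count mismatch: %d vs %d" % (len(base), len(morphed)))
    return morphed - base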
-- I'm not mad at you, just Westphalian.
Well, at least this was one I could not have caught without actually running P11. Try again.
-- I'm not mad at you, just Westphalian.