odf opened this issue on Dec 21, 2021 · 11 posts
odf posted Tue, 21 December 2021 at 7:58 PM
Pretty self-explanatory topic, I guess... :smile:
I'm trying to write a script that makes PMDs for subdivision morphs. The tricky bit is that the deltas do not encode displacements in the x-, y-, and z-direction like regular deltas, but they are given with respect to a local coordinate system that is different for each vertex.
The first direction, let's call it n for normal, is an approximation of the vertex normal. It can be a bit off, which could be a hint that it's interpolated rather than re-computed from scratch.
Of the remaining directions u and v, it seems that one always goes along an edge that's incident to the current vertex. How that edge is chosen, and how it is determined whether its direction will be u or v, I don't know.
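If the deltas really are coefficients in a per-vertex frame (n, u, v), then turning one into a world-space displacement is just a linear combination of the frame vectors. A minimal sketch, assuming that interpretation; the function name and the example frame vectors below are made up for illustration:

```python
# Hypothetical sketch: a subdivision delta (dn, du, dv) expressed in a
# per-vertex frame (n, u, v) would map to world space as dn*n + du*u + dv*v.

def local_delta_to_world(delta, n, u, v):
    """Combine a (dn, du, dv) delta with per-vertex frame vectors n, u, v."""
    dn, du, dv = delta
    return tuple(dn * ni + du * ui + dv * vi for ni, ui, vi in zip(n, u, v))

# Example with an axis-aligned frame: a pure-normal delta of 0.01 along
# n = (0, 1, 0) becomes a straight displacement in y.
disp = local_delta_to_world((0.01, 0.0, 0.0), (0, 1, 0), (1, 0, 0), (0, 0, 1))
```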
I'm still experimenting and hope to find out more in time, but if anyone happens to know the exact method that's used for this in Poser, I beg you to have a heart for a poor confused mathematician and tell me.
-- I'm not mad at you, just Westphalian.
odf posted Wed, 22 December 2021 at 1:55 AM
Okay, after some more experimentation, it looks as if the u- and v-directions are actually aligned with the texture coordinates. Knowing that is going to help a lot, although I'm not sure it's quite enough.
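For what it's worth, the textbook way to derive texture-aligned tangent directions from a triangle's positions and UVs looks like the sketch below. Whether Poser uses this particular construction is pure speculation on my part; `triangle_tangents` is a made-up name:

```python
# Standard per-triangle tangent/bitangent computation from texture
# coordinates. That Poser derives its u/v delta directions this way is an
# assumption -- this is just the textbook construction.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangle_tangents(p0, p1, p2, t0, t1, t2):
    """Return (tangent, bitangent) for a triangle with positions p* and UVs t*."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = t1[0] - t0[0], t1[1] - t0[1]
    du2, dv2 = t2[0] - t0[0], t2[1] - t0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    tangent = tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))
    bitangent = tuple(r * (du1 * b - du2 * a) for a, b in zip(e1, e2))
    return tangent, bitangent

# A flat triangle in the xz-plane with UVs matching x and z: the tangent
# comes out along +x and the bitangent along +z.
tan, bitan = triangle_tangents(
    (0, 0, 0), (1, 0, 0), (0, 0, 1),
    (0, 0), (1, 0), (0, 1),
)
```

Note that nothing in this construction forces the tangents to be orthogonal to the vertex normal, which would be consistent with what I'm seeing.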
-- I'm not mad at you, just Westphalian.
adp001 posted Thu, 23 December 2021 at 10:24 AM
How I proceeded:
First, I loaded the standard prop "one sided square" into Poser. It consists of just four vertices forming a rectangle. I set the prop to subdivision level 1, then used Poser's morph brush to move the upper middle vertex (generated by Sub-D), creating a morph called "test". With "save binary" turned off, I saved the prop, loaded the result into a text editor and searched for the entry "test". There I found this:
targetGeom test
	{
	name test
	initValue 0
	hidden 0
	enabled 1
	forceLimits 1
	min -100000
	max 100000
	trackingScale 0.02
	masterSynched 1
	keys
		{
		static 0
		k 0 1
		}
	interpStyleLocked 0
	indexes 0
	numbDeltas 4
	targetExtendedInfo
		{
		subdivDeltas 1
			{
			level 1
				{
				indexes 1
				numbDeltas 9
				d 7 0.01 0.0 0.0
				}
			}
		}
	blendType 0
	}
If you change the entry in targetExtendedInfo, the newly loaded prop behaves just like any standard morph. And you can add entries if you like.
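In case someone wants to generate such entries from a script, here is a rough sketch of how the inner level block could be emitted. The exact whitespace and float formatting Poser expects is an assumption based on the listing above, and `format_subdiv_deltas` is my own made-up helper:

```python
# Sketch: emit the "d <index> <dn> <du> <dv>" lines of one subdivDeltas
# level block, in the style of the saved prop above. Formatting details
# are guessed from that listing, not from any Poser documentation.

def format_subdiv_deltas(level, num_deltas, deltas):
    """deltas: dict mapping vertex index -> (dn, du, dv) tuple."""
    lines = ["level %d" % level, "{", "indexes %d" % len(deltas),
             "numbDeltas %d" % num_deltas]
    for i in sorted(deltas):
        lines.append("d %d %g %g %g" % ((i,) + tuple(deltas[i])))
    lines.append("}")
    return "\n".join(lines)

# Reproduce the single-delta block from the saved "one sided square" prop.
block = format_subdiv_deltas(1, 9, {7: (0.01, 0.0, 0.0)})
```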
You can download my test file here and try it yourself: https://adp.spdns.org/xyz.pp2
adp001 posted Thu, 23 December 2021 at 10:36 AM
Apparently you can load Sub-D morphs indirectly like any other morph, without having to go via a PMD. It's a pity that you can't reach the morphs via Python: that prevents the use of tools (e.g. to clean, correct, adjust or mask morphs), and it makes handling quite slow, since you first have to create an intermediate file that Poser can process instead of working on an OBJ file directly.
adp001 posted Thu, 23 December 2021 at 10:50 AM
Seems like Poser stores the SubD-0 data first and then the hires information right after that.
Example: the geometry has 100 vertices. Indices 0-99 are reserved for the Sub-D-0 morph deltas; from index 100 on, the information for Sub-D levels 1 to n follows.
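Assuming Catmull-Clark vertex counts (one new point per face and one per edge at each step), the index ranges could be computed like this. `level_ranges` is just my own sketch, not anything from Poser:

```python
# Sketch of the index layout described above, assuming Catmull-Clark
# counts: one subdivision step yields V + F + E vertices (original
# vertices, one point per face, one point per edge). Indices 0..V-1 would
# then address the level-0 deltas, and everything from V on the next level.

def level_ranges(num_verts, num_faces, num_edges):
    """Return (level0_range, level1_range) as half-open index intervals."""
    level1_count = num_verts + num_faces + num_edges
    return (0, num_verts), (num_verts, level1_count)

# The "one sided square": 4 vertices, 1 face, 4 edges -> 9 vertices at
# level 1, which matches the numbDeltas 9 in the saved prop.
base, level1 = level_ranges(4, 1, 4)
```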
odf posted Thu, 23 December 2021 at 4:02 PM
adp001 posted at 10:24 AM Thu, 23 December 2021
How I proceeded:
First, I loaded the standard prop "one sided square" into Poser. It consists of just four vertices forming a rectangle. I set the prop to subdivision level 1, then used Poser's morph brush to move the upper middle vertex (generated by Sub-D), creating a morph called "test". With "save binary" turned off, I saved the prop, loaded the result into a text editor and searched for the entry "test". There I found this:
That's more or less what I did for my experiments. I create empty morphs from within the morph tool and save the CR2 with binaries turned off. Then I set the subdiv deltas to some defined values, load the result into Poser and see where the vertices go. I've also exported OBJs of the mesh with and without the morph to verify that the first delta direction is the normal.
The remaining big problem is to figure out how the u and v directions are computed. From visual inspection (on Antonia low-res) they seem aligned with the texture u and v but they are not generally orthogonal to the normal direction. Without knowing the exact method for determining those directions, the morphs will be off wherever the surface is not completely flat.
If the directions are computed on the base mesh and then interpolated (rather than recomputed from scratch) for the higher subdivision levels, one could potentially just take the measurements once for each mesh and then feed them into the script.
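The measurement itself would be simple: put a small delta in a single slot of one vertex, export the mesh with and without the morph, and normalize the difference of that vertex's position. A sketch of just the arithmetic (file handling omitted, names and the delta magnitude made up):

```python
# Sketch of the probing idea: recover one frame direction from how far a
# vertex moves under a small known delta. Also reports the ratio of the
# observed displacement to the delta, i.e. the frame vector's length.

import math

def probe_direction(pos_before, pos_after, delta_magnitude):
    """Recover a unit direction from one vertex moved by delta_magnitude."""
    diff = [a - b for a, b in zip(pos_after, pos_before)]
    length = math.sqrt(sum(d * d for d in diff))
    return [d / length for d in diff], length / delta_magnitude

# If the vertex moved by (0, 0, 0.002) under a 0.001 delta, the direction
# is +z and the frame vector is not unit length (scale factor 2).
direction, scale = probe_direction((0, 0, 0), (0, 0, 0.002), 0.001)
```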
ETA: I hadn't tried to save just the prop. It's good to know that that also works.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 23 December 2021 at 4:04 PM
adp001 posted at 10:50 AM Thu, 23 December 2021
Seems like Poser stores the SubD-0 data first and then the hires information right after that.
Example: Geometry has 100 vertices. For Sub-D-0 morphs index 0-99 is reserved, from index 100 the Sub-D-1 to n level information starts.
Yes, I think it's original vertices, then face centers, then edge points. Face centers and edge points seem to go in the order they occur in the face data (or the OBJ, which is what I looked at).
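If that ordering is right, the indices of the new points at one subdivision step could be reconstructed like this. This is my own sketch, with edge order taken as first occurrence in the face data, which is still a guess:

```python
# Sketch of the numbering described above for one Catmull-Clark step:
# original vertices first, then one face point per face (in face order),
# then one edge point per edge (in the order edges first appear in the
# face data). The edge-ordering rule is an assumption.

def subdiv_indices(num_verts, faces):
    """Return (face_point_index, edge_point_index) lookup dicts."""
    face_point = {f: num_verts + f for f in range(len(faces))}
    edge_point, next_index = {}, num_verts + len(faces)
    for face in faces:
        for a, b in zip(face, face[1:] + face[:1]):
            edge = (min(a, b), max(a, b))
            if edge not in edge_point:
                edge_point[edge] = next_index
                next_index += 1
    return face_point, edge_point

# One quad (the "one sided square"): the face point gets index 4, the four
# edge points indices 5 to 8, consistent with numbDeltas 9 at level 1.
fp, ep = subdiv_indices(4, [[0, 1, 2, 3]])
```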
-- I'm not mad at you, just Westphalian.
odf posted Thu, 23 December 2021 at 4:31 PM
I'm taking a little break from this now to work on other parts of the script. But the plan for the next step is to map the texture space left, right, up and down directions at a vertex onto the embedded mesh and compare them with the displacements Poser produces when I use something like [0, 0.001, 0] or [0, 0, 0.001] as the delta.
-- I'm not mad at you, just Westphalian.
odf posted Thu, 23 December 2021 at 4:32 PM
I create empty morphs from within the morph tool and save the CR2 with binaries turned off.
Sorry, that was my scatterbrain talking. I meant PZ3, not CR2.
-- I'm not mad at you, just Westphalian.
adp001 posted Thu, 23 December 2021 at 7:29 PM
odf posted at 4:31 PM Thu, 23 December 2021 - #4432342
I'm taking a little break from this now to work on other parts of the script.
Ok. I did the same :)
odf posted Fri, 24 December 2021 at 8:17 PM
Just a quick note: it seems that the numbering of the vertices of a subdivided mesh is a bit more complicated than I thought. I had a go with Antonia, and not even the numbering of the original vertices is stable under subdivision. Possibly the numbering is done per actor, or maybe the infamous ghost vertices play a role. A bit off-topic for this thread, though; better to discuss it in the other one.
Happy Isaac Newton's birthday!
-- I'm not mad at you, just Westphalian.