Cage opened this issue on Dec 20, 2006 · 1232 posts
Spanki posted Thu, 11 January 2007 at 2:24 AM
Quote - The vert-to-poly comparisons currently result in a memory leak, the cause of which is still uncertain.
Yeah, but isn't the memory leak due to trying to store lots of lists of data about the polygons/edges? It seems like you could just compute that info on the fly as you needed it.
Quote - I have some Big Questions right now which keep getting lost in discussions of theory concerning the details of comparing meshes with incompatible shapes. I'm not surprised my questions are the least interesting in the thread. :-P Unfortunately, they're also the questions most pertinent to the actual script, until and unless someone else takes over. :( If I can't resolve some of these questions, I can't move the project forward to the point where details of mesh comparison will even be an active consideration.
Right now the two biggest questions are: why does this RAM leak occur, and what can be done about it? And: is there a way to calculate vertex weights using vertex-to-vertex comparisons, so as to avoid the RAM leak which accompanies the vert-to-poly comparisons?
Yeah, sorry for all the side-tracking. But you seem to be trying to hang on to (or salvage) a method that's causing you trouble. I still need to find the time to study the existing scripts better to see where and why the storage is needed, but personally, I think I'd just compute the data as needed and not worry about storing it.
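To make that concrete, here's a minimal sketch of the "compute it as needed" idea. This is illustrative only, not the actual script's code, and it uses plain Python lists of vertex-index tuples rather than Poser's geometry objects:

```python
# Hypothetical sketch: find the polygons that use a given vertex on
# demand, instead of building and storing a vert-to-poly lookup table
# for the whole mesh.
def polys_using_vert(polys, vert_index):
    """Yield the index of each polygon that references vert_index.

    'polys' is assumed to be a list of vertex-index tuples,
    e.g. [(0, 1, 2), (1, 2, 3), ...].
    """
    for i, poly in enumerate(polys):
        if vert_index in poly:
            yield i

# Usage: each query's result is consumed and discarded, so nothing
# accumulates in memory between vertices.
polys = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
hits = list(polys_using_vert(polys, 2))   # -> [0, 1, 2]
```

Scanning every polygon per query is slower than a precomputed table, of course; the trade is CPU time for flat memory use.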
I understand that you'd like to preprocess the meshes and come up with a reusable vertex correlation using some automated process, but (from my perspective) I've already discounted that as a valid thing to do (given my math abilities and due to shape differences). Or at least... wait until you have some methodology for solving the differences issue - as mentioned earlier, none of the vertex-correlation methods being discussed really do anything to address the larger shape differences issue.
Quote - I'm trying the Spanki weight-calculating method for triangles within the context of svdl's suggestion about weighting using a vertex approach. I've tested using a spherical range check to find MeshA verts which should be weighted for a given meshB vert. The radius of the sphere is substituted for the triangle edge in the Spanki code. So far, this works to give me weights, but it is obviously incorrect, returning no correlations in many cases and returning groups of correlations which have a total weight above or below one in many other cases. But something along these lines may go somewhere.
I've thought about that a little bit, but don't have any answers yet. The method I proposed always gives you 1.0 for the sum of the weights for the 3 vertices of the triangle. But if you're just grabbing N close vertices (within some radius), you lose the context of the surface above the point being checked, so the math at least is different (I need to think about how you can do the averaging). As you noted, doing a simple average based on the number of vertices within range doesn't work.
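For what it's worth, a corner-weighting scheme that always sums to 1.0 across a triangle's three vertices is barycentric coordinates computed from sub-triangle areas. Assuming that's essentially what the triangle method boils down to (a guess on my part, not Spanki's actual code), a minimal sketch:

```python
# Sketch of triangle-corner weights that sum to 1.0: barycentric
# coordinates from sub-triangle areas. An assumption about the method
# under discussion, not the actual implementation.
def _area2(a, b, c):
    # Twice the triangle area: magnitude of the cross product (b-a)x(c-a).
    ux, uy, uz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    vx, vy, vz = c[0] - a[0], c[1] - a[1], c[2] - a[2]
    cx, cy, cz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    return (cx * cx + cy * cy + cz * cz) ** 0.5

def triangle_weights(p, v0, v1, v2):
    """Weights of v0, v1, v2 at point p; they sum to 1.0 whenever p
    lies inside the triangle. Degenerate (zero-area) triangles are
    not handled here."""
    total = _area2(v0, v1, v2)
    w0 = _area2(p, v1, v2) / total   # sub-area opposite v0
    w1 = _area2(v0, p, v2) / total   # sub-area opposite v1
    w2 = _area2(v0, v1, p) / total   # sub-area opposite v2
    return w0, w1, w2

# e.g. the centroid of any triangle gets equal thirds:
w = triangle_weights((1.0/3, 1.0/3, 0.0), (0, 0, 0), (1, 0, 0), (0, 1, 0))
# -> (0.333..., 0.333..., 0.333...)
```

Seen this way, the sphere-radius substitution drops the enclosing-triangle constraint: there's no longer a fixed total area for the sub-areas to divide into, which is one way of understanding why those weights stop summing to one.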