Forum Moderators: Wolfenshire | Forum Coordinators: Anim8dtoon
Animation F.A.Q (Last Updated: 2024 Nov 13 3:03 pm)
Characters, motion graphics, props, particles... everything that moves!
Enjoy, create and share :)
Remember to check the FAQ for useful information and resources.
Animation learning and resources:
11 Second Club: Monthly character animation competition.
Animation Mentor: Online school. Learn from the animation masters.
Rigging 101: Maya rigs and rigging tutorials.
AnimWatch: Showcasing the best of independent animation.
FlashKit: The best place to learn Flash.
Armaverse: Stop-motion armatures for animation.
60+ great Character Animator's sites: Get inspired.
It's ingenious! My experience with IR video cameras indicates that 8-12 feet is the optimal range for those LEDs, which might be enough to cover an entire human body, depending on the focal length of the lens. I'm assuming they're using cameras with moderate light sensitivity (e.g. 0.5 lux) so that they get good contrast between the balls and the rest of the scene (greyscale).
From what I saw, it looks like you get an SDK, one camera, and a set of IR-reflective spots; there doesn't seem to be any program FOR it. The thing that catches my attention most is that, since it's optical, you're GOING to need at least three cameras (there's a quick sketch of the triangulation math right after this post). By my calculations, by the time you're done developing the 'ware and getting more cameras, you might as well go with something more developed... something with its own software. Lou.
"..... and that was when things got interestiing."
Even if you buy an additional three cameras ($900), it still would be very cheap. Investigating further, I find that you will indeed have to write the program that records and uses the data this produces. That is a major problem, or a major opportunity, depending on how you look at things. If you were a programmer and could write a front end to get the data into Life Forms or MotionBuilder, you would have produced a much-needed program and could cash in $$$ on it. For somebody like myself this product is useless, even though very cheap, until someone does this and markets it. I'm sure the market would be huge for a low-cost personal motion capture device. Hmmm... I think I'm going to pick up that QBasic book again and start learning... this could be that million-dollar product I was looking for... now where did I put that book...
Hello,
I am developing exactly such an application.
http://www.geocities.com/mocap_is_fun/
It can capture a single person's full-body motion and output it to BVH format. It is still in development; tracking fails when the motion is too quick.
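For anyone wondering what the BVH stage involves: a BVH file is just a plain-text skeleton header (HIERARCHY) followed by one line of channel values per frame (MOTION). A minimal sketch in Python, with a toy two-joint skeleton and dummy numbers rather than anything from the actual tracker:

import textwrap

def write_bvh(path, frames, frame_time=1.0 / 120.0):
    # HIERARCHY section: a toy skeleton with one root and one child joint.
    # A real capture would emit the full body here.
    header = textwrap.dedent("""\
        HIERARCHY
        ROOT Hips
        {
            OFFSET 0.0 0.0 0.0
            CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
            JOINT Chest
            {
                OFFSET 0.0 10.0 0.0
                CHANNELS 3 Zrotation Xrotation Yrotation
                End Site
                {
                    OFFSET 0.0 10.0 0.0
                }
            }
        }
        MOTION
        """)
    with open(path, "w") as f:
        f.write(header)
        f.write("Frames: %d\n" % len(frames))
        f.write("Frame Time: %f\n" % frame_time)
        # One line per frame: 6 root channels then 3 chest channels.
        for channels in frames:
            f.write(" ".join("%.4f" % v for v in channels) + "\n")

# Two dummy frames (9 values each: root position + ZXY rotation, chest ZXY rotation).
write_bvh("capture.bvh", [[0, 35, 0, 0, 0, 0, 0, 0, 0],
                          [0, 35, 1, 0, 5, 0, 2, 0, 0]])

Both MotionBuilder and Life Forms import BVH, which is part of why it is a convenient target for a home-made front end.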
Any chance you can show a "behind the scenes" kind of thing?
nemirc
Renderosity Magazine Staff Writer
https://renderositymagazine.com/users/nemirc
https://about.me/aris3d/
Those cameras have no software support. SynthEyes has mocap support and is only $400.
Yoshi, I would love to help you with your programming, but I have no "Class".
Have you tried strobe lights, filtered with a lens to match the color of the markers? That might remove most of the false readings and allow faster tracking.
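The software half of that idea is cheap to test, too: threshold each frame to the marker colour and treat the surviving blob centroids as candidate markers. A rough sketch with Python and OpenCV (the HSV range is a placeholder you would tune to your lights and markers):

import cv2
import numpy as np

frame = cv2.imread("frame_0001.png")            # one captured video frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Keep only pixels near the marker colour (bright green here; values are guesses).
lower = np.array([50, 100, 200])
upper = np.array([70, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Each surviving blob's centroid is a candidate marker position (OpenCV 4.x API).
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
markers = []
for c in contours:
    m = cv2.moments(c)
    if m["m00"] > 0:
        markers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
print(markers)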
Dale
Quote - Those cameras have no software support. SynthEyes has mocap support and is only $400.
Yoshi, I would love to help you with your programming, but I have no "Class".
Have you tried strobe lights, filtered with a lens to match the color of the markers? That might remove most of the false readings and allow faster tracking. Dale
Yeah, I meant that the tracking fails when the motion is too quick and markers travel a greater distance between frames than my algorithm expects, not that the OptiTrack camera fails to detect the markers.
"similarity" may refer to the problem that the algorithm confuses nearby markers with each other, if the distance traveled by a marker between frames is >= to the distance between nearby markers. hence dale's technique. it might be tedious at 120 frames per second (or faster) and a minute of motion per file, but at least an human can tell which marker is which with less difficulty than a computer, if all the markers are identical in size, shape, labelling, colour, etc. one might need "smart markers" to more easily automate that, or else make sure the frame rate was fast enuff to keep marker movement small, compared to marker spacing.
Quote - Sorry, just trying to help. SynthEyes has the same problem with very fast tracking, so the program allows you to track points by hand.
Dale
Have you actually tried to use SynthEyes to capture human motions? It looks difficult, according to some of the posts in their forum.

Quote - Hence Dale's technique. Tracking by hand might be tedious at 120 frames per second (or faster) and a minute of motion per file, but at least a human can tell which marker is which with less difficulty than a computer, if all the markers are identical in size, shape, labelling, colour, etc. One might need "smart markers" to more easily automate that, or else make sure the frame rate is fast enough to keep marker movement small compared to marker spacing.
It is already possible to identify the markers manually, but as you say, that gets tedious fast if you have to do it often.
Attached Link: http://www.naturalpoint.com/optitrack/index.html
Has anyone seen this advertisement? It certainly is cheap enough, but I believe you need to program a front end for it to work with MotionBuilder or Life Forms. Hmmm... looks interesting!