Optical motion capture

Christopher Vaughan ✭✭✭✭✭
This is a proof of concept of an under-$1000 optical motion capture rig that I built. I used two cameras and a motion tracker to feed data into Maya, then interpreted the tracking points along the surface of the 3D skin. There are no skin weights on the model.

Comments

  • Tom Stewart ✭✭✭
    Wow! You should do a class on this!

  • Christopher Vaughan ✭✭✭✭✭
    thanks!

  • David Boccabella, Moderator, Brisbane, Australia
    Really good. Why two cameras, though?
  • This is really interesting; I'd definitely like to see more about how you went about doing point tracking and actually mashing the data into Maya.
  • Christopher Vaughan ✭✭✭✭✭
    Thanks! The point tracking was done with SynthEyes for post tracking (much more accurate) and SimpleCV for realtime rehearsal tracking, and since SimpleCV is written in Python it was no problem to port it to PyMEL.

    The tracking points produced null locator data, and I wrote an algorithm (several, actually) in MEL to take that locator movement and parent it to bones, which I could adjust at a granular animation level. The bones are point-constrained to the surface with follicles from the hair system, and voilà. (See the sketches below.)
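For readers trying to picture the realtime rehearsal step, here is a minimal sketch of a SimpleCV-to-PyMEL bridge in the spirit of what is described above, not the actual rig. It assumes SimpleCV is importable from Maya's Python interpreter and that the markers read as the largest blobs in frame; the camera index, blob size threshold, pixel scale, marker count, and fixed depth plane (single camera, no triangulation) are all placeholders.

```python
import pymel.core as pm
from SimpleCV import Camera

CAM_INDEX = 0         # placeholder webcam index
PIXEL_TO_UNITS = 0.1  # placeholder pixel-to-scene-unit scale
NUM_MARKERS = 4       # placeholder marker count

cam = Camera(CAM_INDEX)
locators = [pm.spaceLocator(name='track_%02d' % i) for i in range(NUM_MARKERS)]

def update_locators():
    """Grab a frame, detect marker blobs, and push their centroids onto locators."""
    img = cam.getImage()
    blobs = img.findBlobs(minsize=20)
    if not blobs:
        return
    # Largest blobs first; a real rig needs stable marker IDs, not area order.
    blobs = blobs.sortArea()[::-1]
    for loc, blob in zip(locators, blobs):
        x, y = blob.centroid()
        # Flip Y (image origin is top-left) and park everything on a Z=0 plane.
        loc.setTranslation([x * PIXEL_TO_UNITS,
                            (img.height - y) * PIXEL_TO_UNITS,
                            0.0])

# Call update_locators() from a timer or per-frame callback during rehearsal.
```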
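And a minimal PyMEL sketch of the last step, pinning a joint to a mesh surface through a hair-system follicle and point-constraining it. The connections shown are the standard Maya follicle setup rather than the exact MEL described above; the mesh name, UV value, and joint name in the usage comment are placeholders.

```python
import pymel.core as pm

def pin_joint_to_surface(mesh_shape, u, v, name='trackBone'):
    """Create a follicle riding mesh_shape at (u, v) and point-constrain a joint to it."""
    # Build a follicle and attach it to the mesh surface.
    foll_shape = pm.createNode('follicle', name=name + '_follicleShape')
    foll_xform = foll_shape.getParent()
    mesh_shape.outMesh.connect(foll_shape.inputMesh)
    mesh_shape.worldMatrix[0].connect(foll_shape.inputWorldMatrix)
    foll_shape.outTranslate.connect(foll_xform.translate)
    foll_shape.outRotate.connect(foll_xform.rotate)
    foll_shape.parameterU.set(u)
    foll_shape.parameterV.set(v)

    # The joint rides the surface; offsets can still be layered on for adjustment.
    pm.select(clear=True)
    jnt = pm.joint(name=name)
    pm.pointConstraint(foll_xform, jnt, maintainOffset=False)
    return jnt

# Example usage (assumes a skin mesh shape named 'bodyMeshShape' exists):
# pin_joint_to_surface(pm.PyNode('bodyMeshShape'), 0.5, 0.5, name='cheek_L')
```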