
2d facial tracking converted to morph animation


splinterD
01-08-2011, 07:31 PM
hey everyone
first let me say how excited i am to post this little test, because i am nowhere close to being a rigger, and i was struggling with this idea i had for a while until i decided to just try it.
so here is the flow chart

http://lh4.ggpht.com/_-L4FGjeiWdQ/TSjD7sc19_I/AAAAAAAAAP0/g037CjX6sak/tut.jpg

i made a helmet cam from a baseball cap,
tracked the footage in AE, opened it in 3dsmax with max2ae, and connected the
trackers to the face gui with reactions. that way i can control the deformation, and add layers of animation to the gui as i please.

you can watch the test render here (http://danieldulitzky.blogspot.com/2011/01/2d-face-traking-converted-to-morphs.html)
i hope you like it

pooby
01-08-2011, 09:08 PM
Nice work. It works well.

I recently did a similar-ish test with SynthEyes and XSI. The main pain I found is in the tracking, as I had a lot more markers.
Zigntrack2 might soon be able to do the tracking automatically and export the 2d data.

splinterD
01-09-2011, 03:11 AM
sweet!!!
thanks for the reply, love your work by the way.
i'm trying to keep the trackers to the bare minimum,
and in the setup get some extra information from the distance/angle of some trackers relative to
other trackers. so far some work and some are causing artifacts.

splinterD
01-09-2011, 11:39 AM
here is the video in case you missed it

http://www.youtube.com/watch?v=aDNWBP0BZwU

splinterD
01-11-2011, 05:37 AM
hey, i was wondering if someone can give me a simple way to get the distance between two objects
and have that value plugged into the reaction manager.
what i did so far, for example, to get the distance between the bottom lip track and the chin: i
created a dummy, offset it to the side of the chin marker, linked it to the chin, and dropped a look-at constraint on it pointing at the bottom lip marker. then i exposed the dummy's rotation value with an "expose tm" helper, so when the lip got close to the chin the rotation value would drop,
and i connected that to the "lip roll" morph controller in the gui.

i tried the same process to get the angle between the outer eyebrow marker and the inner marker,
and connected that to the "sad" and "mad" morphs of the eyebrow,
but that gave some jerky results.
is there a more precise way of getting distance and rotation values out of objects to connect to the reaction manager?
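Outside of Max, the values being fed to the reaction manager here are just a point-to-point distance and a 2d angle. Below is a minimal Python sketch of that math (function names are mine, not from the thread; inside Max this would normally live in a script controller or come from an Expose Transform helper). It also includes a simple moving average, which is one cheap way to damp the kind of jitter that causes jerky morph results:

```python
import math

def distance(p, q):
    # straight-line distance between two 3d points (Python 3.8+)
    return math.dist(p, q)

def angle_deg(p, q):
    # angle of the vector p -> q against the horizontal axis, in degrees
    # (e.g. inner eyebrow marker -> outer eyebrow marker, in screen space)
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def smooth(values, window=3):
    # trailing moving average to damp tracker jitter before it hits the morphs
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        out.append(sum(values[lo:i + 1]) / (i + 1 - lo))
    return out
```

Driving a morph from `distance(lip, chin)` directly, instead of from the rotation of a look-at dummy, avoids one layer of indirection and tends to be more predictable.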

spacegroo
01-11-2011, 01:48 PM
Sounds similar to what ImageMetrics offers, though they apparently track video footage with no markers. What process did you use to derive your morphs from the 2d point cloud? Lastly, did you attempt to drive a bone rig with the 2d point data? I'm curious if you'd get more nuanced results (assuming a robust enough rig).

splinterD
01-11-2011, 04:07 PM
i transferred the position data from the tracking markers to solid layers in After Effects,
then imported them into max with "max2ae". i had the gui setup ready with reaction connections to the morph modifier, so when the controller for the jaw was pulled down, the "open" morph would go to, let's say, 100.
then i did the same thing with the tracking points: i found the extreme positions of the jaw track, and used the position y value to tell the jaw controller to pull down to the point i wanted.
i didn't try bones because i'm really no good at rigging, and i hate skinning :)
it just seemed more efficient to use morphs, because i can make the open mouth look
just like i want and play with the values as necessary. once i get to a gui setup i'm happy with, it can stay the same no matter the shape of the face, and i might even be able to write a script to streamline the connection process, although i'm no good at scripting either. oh, and i got some extra detail by measuring the distance between tracked points and connecting that to extra morph targets, without needing a gazillion tracking points. this process was made for non-realistic faces, not a Benjamin Button style effect, but for more stylized cases.
if you have any more questions i'll be happy to answer, and maybe someone will help me turn it into a nice script :)
i hope this helps.
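The "find the extreme positions of the jaw track, then use the y value to drive the controller" step described above is a linear remap with clamping, which is also what a pair of Reaction Manager states approximates. A hedged Python sketch (the function name is mine, not from the thread):

```python
def remap(value, src_min, src_max, dst_min=0.0, dst_max=100.0):
    # linearly map a tracked value from its observed range (e.g. jaw-closed y
    # to jaw-open y) onto a morph range, clamping so tracker overshoot can't
    # push the morph past its limits
    if src_max == src_min:
        return dst_min
    t = (value - src_min) / (src_max - src_min)
    t = max(0.0, min(1.0, t))
    return dst_min + t * (dst_max - dst_min)
```

For example, a jaw y value halfway between the closed and open extremes would drive the "open" morph to 50, and frames where the track drifts past either extreme stay pinned at 0 or 100 instead of breaking the face.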

thehive
01-14-2011, 01:02 PM
that's pretty cool, man.

eek
01-15-2011, 04:02 PM
Sounds similar to what ImageMetrics offers, though they apparently track video footage with no markers. What process did you use to derive your morphs from the 2d point cloud? Lastly, did you attempt to drive a bone rig with the 2d point data? I'm curious if you'd get more nuanced results (assuming a robust enough rig).

They actually do shape analysis - and software that will 'learn' what action units the face is hitting. Plus *cough* lots of manual work.

ziomalbanan
01-17-2011, 09:47 AM
i once worked with that kind of face tracking, but the face control was not good. you did this very well! a perfect connection between face tracking and face control :) this saves a lot of time :)
the head and blend shapes also look great.
great job! :thumbsup:

splinterD
01-18-2011, 07:37 PM
thanks a lot!!!
it was a lot of fun to work on, making the hat and everything.
i hope i'll get to do another test soon, just a better one.
any thoughts about what i should look for in the next test to push it up another level??
anyone???

CGTalk Moderation
01-18-2011, 07:37 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.