
Using custom rigging with marker data...


IchI
01-04-2009, 08:11 PM
I did some motion capture a while ago for a small project using my university's facilities. I took the easy option, used the built-in 3ds Max feature and mapped the marker data to a Biped. This required a specific marker set-up and naming convention, but once I knew the locations and names it was fine.

I'm now doing my final-year project at university and have recently pushed into custom rigging. With that knowledge I want to really dig into how to do good motion capture; the problem is I'm stuck on workflows and typical industry standards.

To me it seems obvious that the industry must use custom rigs for motion capture, since they use them for facial animation. The problem is I don't understand how, or what application they typically use. I currently have access to Evart at my university, but once you have captured and cleaned your marker data, what does the industry typically do with it from there, specifically with facial and hand marker data? How do they map it to a skeletal rig, and what application do they typically do this in?

I'm currently trying to use MotionBuilder, but I was wondering: is it possible to make a custom rig inside 3ds Max, complete with custom controllers, scripts and reactions, and then map the data onto that rig?

I have a feeling I could strip my custom rig down to the bones, import that into MotionBuilder and then map my marker data onto it. But what happens when I have to make changes? Do I simply have to make them manually? This is more of a problem when it comes to facial animation, as making a slight change without controllers could become... time consuming...

Sorry to ask so many jumbled-up questions. I just want to be able to leave university with the confidence to say "I know what I'm doing with motion capture", and not end up being fired when asked a simple question.

To sum it up: what are the typical pipelines for getting marker data onto a skeleton, and what are the limitations and typical exceptions?

PEN
01-06-2009, 08:41 PM
I have answered this question several times; have a search around and I think you will find information about it.

I have created a system where I can apply TRC data directly to any rig: I convert it by building matrix values from the markers and applying those to the controls.

You can also go the route of using BVH or HTR data, taking the rotation data from those and applying it to your custom rig.
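
To make the "matrix values from markers" idea concrete, here is a minimal Python sketch. It is not Paul's actual system, and the marker names and positions are made up; it just shows the general trick of deriving a segment's orientation and position from three markers, which you could then key onto the matching rig control each frame.

# Minimal sketch (not PEN's system): build a segment's world-space transform
# from three marker positions, e.g. markers taped to the upper arm.
# Marker names, layout and units are hypothetical.
import numpy as np

def matrix_from_markers(origin, aim_marker, plane_marker):
    """Build a 4x4 transform from three marker positions (3-vectors)."""
    origin = np.asarray(origin, dtype=float)
    x_axis = np.asarray(aim_marker, dtype=float) - origin
    x_axis /= np.linalg.norm(x_axis)
    temp = np.asarray(plane_marker, dtype=float) - origin
    z_axis = np.cross(x_axis, temp)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)            # already unit length
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2] = x_axis, y_axis, z_axis   # rotation columns
    m[:3, 3] = origin                                        # translation
    return m

# One frame of three upper-arm markers (hypothetical positions, in cm).
shoulder, elbow, offset_marker = [0, 150, 0], [0, 120, 5], [5, 150, 2]
print(matrix_from_markers(shoulder, elbow, offset_marker))

The rotation part of that matrix can then be converted to whatever Euler order or quaternion the control expects before it is keyed.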

IchI
01-08-2009, 04:24 PM
I see, thank you very much. In the games industry, is it common practice to use these types of workflows? I suppose the main question I was hoping to get answered is whether what you describe is the mainstream way of doing it.

Dimich
01-09-2009, 05:04 PM
There really isn't a clear-cut "industry mainstream" way; at every company there are people figuring out the set-ups that work best for their animation needs. When it comes to mocap, you have to answer a few questions first: do you need a custom rig? Are you short on time and resources? Depending on those answers, you will be able to decide the best way to go for you.

If you decide to use Biped, there are many ways of applying mocap to it, most of them built in.

If you are using a basic skeletal set-up for a human, MotionBuilder can spare you the need to create complex rigs. It has a very complex control rig but handles mocap better than any other app so far. The animators will generally hate your guts for it, as it is a pain to use and far from intuitive, but for editing mocap it, once again, works wonders.

If you go the custom rig way, like Paul said, you will need some scripting to retarget the data from the mocap. Companies find different ways to do this. The guys at Naughty Dog did a triple-skeleton set-up: one for the mocap, one for the animators (the control rig), and one that has all the animations baked onto it and exported into the engine. Ubisoft used MotionBuilder for Assassin's Creed. Here at Crytek we mainly use Biped and MotionBuilder for these purposes. You can also apply the mocap data to just the controls on the control rig and add layers to overwrite and adjust that animation. But that will require you to understand quite well what you are doing, and it will take you quite some time, especially if you want that system to work with a game engine and be usable by others besides yourself.
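
As a rough illustration of the "apply mocap to the controls and adjust with layers" idea (purely conceptual, with curves represented as plain frame-to-value dicts), the baked mocap curve stays untouched and an additive adjustment layer is evaluated on top of it:

# Conceptual sketch of layering adjustments over baked mocap; the curve
# representation (frame -> value dicts) is only for illustration.
def evaluate(base_curve, layers, frame):
    """Final value = baked mocap value + sum of additive layer offsets."""
    value = base_curve.get(frame, 0.0)
    for layer in layers:
        value += layer.get(frame, 0.0)
    return value

# Baked mocap rotation on a control (degrees per frame), plus a hand-keyed
# adjustment layer that only touches a couple of frames.
mocap_rot_x  = {1: 10.0, 2: 12.5, 3: 15.0, 4: 14.0}
adjust_layer = {3: -3.0, 4: -3.0}   # nudge the pose without editing the bake

for f in sorted(mocap_rot_x):
    print(f, evaluate(mocap_rot_x, [adjust_layer], f))

The point is that the dense mocap keys never get edited directly; the sparse layer keys carry the fixes, which keeps things manageable when new mocap comes in.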

Hope this helps you out:)

IchI
01-10-2009, 11:18 AM
Thanks very much, that answered many of the questions in my head. I have been using my university's facilities to get experience in motion capture, and I was tempted to see if I could make something of it. Although my main skill is environment art, motion capture is something I enjoy doing a lot. Hopefully your wise words will help me prepare for any possibilities. Thank you :D

CarlosA
01-11-2009, 09:44 AM
Hi ichi,
It's funny that you ask this question (I dealt with this a few months ago). When I was developing the animation pipeline for the Tron 2 teaser at DD, I was asked to use Biped because the show was 90 percent mocap. I knew there would be a need for some keyframing, so I also made custom rigs for the characters so that the animators doing keyframes (Maya animators using Max for the first time) wouldn't have to learn Biped.

Sure enough, before we knew it the show took a turn, and what was supposed to be mostly mocap became mostly keyframe animation. That meant animating with Biped was out of the question, and whatever mocap we had made for the Bipeds needed to be transferred to the custom rigs.

This actually ended up being a really easy task.
These are the things I have learned from working at different studios where we had to deal with mocap:

1) Have FK in your production rig.
You can transfer mocap to an all-IK rig, but it's more work,
and your tracker data will end up as rotation keyframes on an FK skeleton anyway.

2) Build a simple skeleton with your final joint placement and naming
before you retarget and clean up the capture data.

I set up the Biped rigs for the mocap shoot before the actual session,
which meant the mocap company had the proportions of our characters to work with.
When I set up the custom rigs, all I had to do was bake the animation onto the custom rig, and since the joints were in the same place there were no sliding feet or offsets to fix.

Naming is very important too.
If I take the time to give the mocap company a skeleton with the proper naming,
I can build my deformation rig on top of it while the mocap is being done.
When I get back the skeleton with the mocap, it will have the same names as the one I'm currently using, which means I can use "Save Animation" tools to transfer the curves from the mocap skeleton to my production rig (no scripting needed). There's a rough sketch of this name-based transfer after the list.

3) The best delivery format you can get from a mocap company is a scene file containing an FK skeleton with the animation already baked onto it.

While these companies can deliver all sorts of fancy formats, all you need is a simple structure to build your deformations on, which they need as well in order to convert the marker data into something they can hand back to the client. They will usually just spit a simple FBX file out of their software, which is then opened and resaved in whatever software they are delivering in (Maya, Max, whatever).

Most companies that deal with mocap follow these simple rules.
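
As a rough sketch of the name-based transfer mentioned under point 2 (this is the idea only, not the actual Max "Save Animation" tool; the joint names and curve layout are invented):

# Illustration of why matching names matter: baked curves are represented
# here as joint_name -> {frame: value} and are copied onto any production
# joint that shares its name with a mocap joint.
def transfer_by_name(mocap_curves, production_joints):
    transferred, missing = {}, []
    for joint in production_joints:
        if joint in mocap_curves:
            transferred[joint] = dict(mocap_curves[joint])  # copy the keys
        else:
            missing.append(joint)   # naming mismatch, nothing to transfer
    return transferred, missing

mocap = {"spine_01": {1: 0.0, 2: 1.5}, "l_upperarm": {1: 45.0, 2: 47.0}}
rig_joints = ["spine_01", "l_upperarm", "l_forearm"]
curves, unmatched = transfer_by_name(mocap, rig_joints)
print(curves)      # lands on identically named joints with no retarget logic
print(unmatched)   # ['l_forearm'] would need renaming or manual handling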

I hope this helps some.
Cheers,
Los

PEN
01-12-2009, 01:25 PM
The fact that there wasn't a standard is exactly why I created my character system that converts the mocap data from TRC markers rather than from rotational joints. Carlos made the point that converting to IK can be a problem, and since most people are converting from BVH or HTR, the data is already rotational. Using TRC lets me get more accurate results with IK solutions if you need them: the position of the hand can be targeted instead of the angles of the joints. The same goes for the legs, and since animators almost always want to work with IK legs when a character is on the ground, this makes it far easier to clean up and continue hand-animating on. What I have done is create a system somewhat like MotionBuilder in some ways, except I can build any rig and have the character system drive it much more easily. It can also work however the animator would like, and I'm not constrained to specific rigs like with Biped or CAT.
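
Here is a tiny Python sketch of that positional idea, not Paul's system itself (marker names, offsets and numbers are made up): because TRC data is positional, a wrist marker's trajectory can be keyed straight onto an IK hand goal instead of being converted into joint rotations first.

# Sketch: drive an IK goal from cleaned TRC-style marker positions.
# marker_frames maps marker_name -> list of (x, y, z) per frame.
def ik_goal_keys(marker_frames, marker="l_wrist", offset=(0.0, 0.0, 0.0)):
    """Turn a marker's positions into world-space position keys for an IK goal."""
    keys = []
    for frame, (x, y, z) in enumerate(marker_frames[marker], start=1):
        # Offset because the marker sits on the skin, not at the joint pivot.
        keys.append((frame, (x + offset[0], y + offset[1], z + offset[2])))
    return keys

markers = {"l_wrist": [(10.0, 95.0, 2.0), (11.0, 94.5, 2.4), (12.2, 94.0, 3.0)]}
for frame, pos in ik_goal_keys(markers, offset=(0.0, -1.5, 0.0)):
    print(frame, pos)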
