Facial Rig Script - Would This Be Useful?


AdamEisfeld
02-15-2011, 06:33 PM
Alright, I've been reading up on all the material I can about facial rigging (I've gone through both Stop Staring books by Jason Osipa), and I got to thinking that I might be able to write either an external program or a MEL script that would greatly speed things up. Please let me know if something like this already exists, if you'd find it useful, or if the functionality is already built into Maya and I've just missed it:

Essentially, from what I've seen, the best facial rigs include a few things. The first is the typical blend shape setup that any facial rig should have, controlling different portions of the face; that's pretty standard these days. The second is hooking those blend shapes up to a 2D GUI built from curves so the animator can combine shapes more easily, something akin to the fairly well-known setup Jason Osipa teaches in Stop Staring. The third aspect of a good facial rig is the ability to fine-tune specific areas of the face by moving controls placed directly on its surface, to tweak things like the lips, brows, cheeks, etc.

Now, from what I understand and have experimented with, getting this third part set up takes some time, so I was thinking of writing a script to automate the process. Essentially, you get your face set up with all of the blend shapes you'll need, then you place control points on the surface of the face wherever you'd like the ability to fine-tune (as previously mentioned, places like the eyebrows or lips). Once you've finished placing these points, you go through each one and specify which vertices you would like it to affect (I haven't decided how I'll do this yet: painting weights with Artisan or my own system, selecting verts, or maybe just adjusting a fall-off radius per point). When that's done, you press a button and the script goes through all of the blend shapes applied to your face, isolates each blend shape's effect on each control point, and creates new blend shapes for each control point. It then deletes the initial blend shapes you made for the face (because all of their deformation has now been divided up among the control points) and creates attributes that turn each original blend shape back on by adjusting the weights of every control point's blend shapes at once. For example, if you had a smile blend shape, you would now have a smile attribute on your face that, when set to 1, turns on the smile blend shape for all of the control points, reproducing the full smile.
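
A rough MEL sketch of what that splitting step could look like, with every name hypothetical (a base mesh "head", a sculpted target "smileTarget", a blendShape node "faceShapes" already driving the head, and a hand-picked list of vertex indices for one control point):

// Hypothetical inputs: the base head, the full smile target, the verts owned
// by one control point, and the blendShape node that already drives the head.
// Assumes the head is sitting at its rest pose when this runs.
string $base = "head";
string $target = "smileTarget";
string $bsNode = "faceShapes";
int $affectedVerts[] = {120, 121, 122, 145}; // example indices for one control point

// Start from a copy of the base so unaffected verts keep their rest positions.
string $dup[] = `duplicate -n "smile_ctrlPoint01" $base`;

// Copy the target's positions onto only the affected vertices.
for ($v in $affectedVerts)
{
    float $p[] = `xform -q -os -t ($target + ".vtx[" + $v + "]")`;
    xform -os -t $p[0] $p[1] $p[2] ($dup[0] + ".vtx[" + $v + "]");
}

// Add the isolated shape as a new target at an unused index on the blendShape node.
blendShape -e -t $base 50 $dup[0] 1.0 $bsNode;

Looping that over every blend shape / control point pair is essentially the whole splitting pass.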

Finally, the system will automatically generate six additional blend shapes for each control point, representing positive and negative translation of that control point's verts in the X, Y, and Z directions. When this is complete, the system will connect the XYZ translation of each control point to the weights of its corresponding XYZ blend shapes.

With all of these blend shapes automatically reapplied to your face mesh, you can now turn on an entire blend shape (for example a smile) by setting the associated attribute to 1 on your face, with the added benefit of being able to select the individual control points on the surface of the mesh, move them in 3D space, and adjust how much the current blend shapes affect them. So if the corner of the mouth is giving you issues when mixing a smile blend shape with a mouth-open blend shape, and you have a control affecting the mouth-corner verts, you can grab that control and tone down the smile blend to remove the problem without affecting the rest of the face. You can also, if desired, parent these facial control points to a 2D GUI as well.
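
For the translation hookup itself, a minimal sketch might look like this, assuming a control transform named "lipCorner_ctrl" and a blendShape node "faceShapes" whose +X and -X targets for that control use the weight aliases "lip_posX" and "lip_negX" (all of those names are made up; the same pattern repeats for Y and Z):

// Hypothetical names: a face blendShape node "faceShapes" with target weight
// aliases "lip_posX" and "lip_negX", and a control transform "lipCorner_ctrl".
string $ctrl = "lipCorner_ctrl";
string $bs   = "faceShapes";

// Keep the control's translation in a sensible range.
transformLimits -tx -1 1 -etx 1 1 $ctrl;

// +X: clamp translateX to [0,1] and drive the positive-X target with it.
string $posClamp = `createNode clamp -n "lip_posX_clamp"`;
setAttr ($posClamp + ".minR") 0;
setAttr ($posClamp + ".maxR") 1;
connectAttr ($ctrl + ".translateX") ($posClamp + ".inputR");
connectAttr ($posClamp + ".outputR") ($bs + ".lip_posX");

// -X: negate translateX, clamp to [0,1], and drive the negative-X target.
string $neg = `createNode multiplyDivide -n "lip_negX_invert"`;
setAttr ($neg + ".input2X") -1;
connectAttr ($ctrl + ".translateX") ($neg + ".input1X");

string $negClamp = `createNode clamp -n "lip_negX_clamp"`;
setAttr ($negClamp + ".minR") 0;
setAttr ($negClamp + ".maxR") 1;
connectAttr ($neg + ".outputX") ($negClamp + ".inputR");
connectAttr ($negClamp + ".outputR") ($bs + ".lip_negX");

Repeating this wiring for Y and Z gives the six positive/negative connections described above.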


So... for those who actually read through all of that: would this be useful? I don't claim to know everything about Maya, so maybe I'm way off and something similar to this is already possible, but I don't know of any easy way to accomplish this without doing all of the blend shape splitting and connecting manually. Any advice on whether I should attempt to program this would be greatly appreciated, as I don't really want to waste my time working on it if it already exists or wouldn't be appreciated by many people.

Thanks,
- Adam Eisfeld

thematt
02-15-2011, 06:46 PM
Of course it would be useful! If you feel like doing it, I clearly encourage the initiative :)
cheers

nottoshabi
02-15-2011, 08:17 PM
Great idea Adam. Something like this would totally be useful. What I would recommend is building a toolbox that helps speed up the process of building the rig; some of the ideas you have are great. People have different pipelines for building these complex rigs, and always remember that things change in production: adding and subtracting elements always happens. What if the rig is built with just joints, or blendShapes, or wire deformers, and you need a quick way to dive in, rip stuff apart, and put it back together? I don't think you can code for all those possibilities. I'm not saying you shouldn't; I'm saying you should build a tool for one way of doing it, and a different one for a different system, if that makes any sense. Over the years I have built myself a toolbox of different scripts and setups that help me do certain things to speed up the process. Think about what takes the most time, and try to create a tool that cuts that time in half.

For example:
Blend shapes for me can take almost 80% of the time it takes to build a facial rig. The other 20% is connecting it all together, adding the finishing touches, and adding all the extra controls on the geo. So I built myself some tools to help with that.

To help build blendShapes, I build a temp rig made out of curves and use that to pull a face shape, then disconnect it and adjust the rest of the way by modeling. On some shapes like "Uhh" and "Ohh" it really cut my time in half. I also have pre-made controls of all sorts of shapes, colors, and sizes that I just import and rename, and a pre-coded UI window where I just adjust the names and go. Adding the cluster controls is a bit of a pain because I have to paint the weights by hand, but with smoothing it goes pretty quickly. Adding the sticky lips, well, that gets sticky, because I have not found a way in MEL to select a bunch of edges, create locators attached to the right edges, and connect them to the controller. So I have to do all of that by hand.
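
For what it's worth, that edge-locator part can be scripted; the well-known rivet.mel approach builds a small lofted surface between two mesh edges and pins a locator to it. A rough sketch of that idea (the mesh name and edge indices here are just placeholders):

// Rough sketch in the spirit of the classic rivet.mel technique:
// pin a locator to a mesh using two of its edges.
string $meshShape = "headShape";
int $edgeA = 310;
int $edgeB = 324;

// Turn each edge into a curve that follows the deforming mesh.
string $cfmeA = `createNode curveFromMeshEdge`;
setAttr ($cfmeA + ".edgeIndex[0]") $edgeA;
connectAttr ($meshShape + ".worldMesh[0]") ($cfmeA + ".inputMesh");

string $cfmeB = `createNode curveFromMeshEdge`;
setAttr ($cfmeB + ".edgeIndex[0]") $edgeB;
connectAttr ($meshShape + ".worldMesh[0]") ($cfmeB + ".inputMesh");

// Loft a small surface between the two edge curves.
string $loft = `createNode loft`;
setAttr ($loft + ".degree") 1;
setAttr ($loft + ".uniform") 1;
connectAttr ($cfmeA + ".outputCurve") ($loft + ".inputCurve[0]");
connectAttr ($cfmeB + ".outputCurve") ($loft + ".inputCurve[1]");

// Sample the middle of that surface and drive a world-parented locator with it.
string $posi = `createNode pointOnSurfaceInfo`;
connectAttr ($loft + ".outputSurface") ($posi + ".inputSurface");
setAttr ($posi + ".turnOnPercentage") 1;
setAttr ($posi + ".parameterU") 0.5;
setAttr ($posi + ".parameterV") 0.5;

string $loc[] = `spaceLocator -n "stickyLip_loc"`;
connectAttr ($posi + ".positionX") ($loc[0] + ".translateX");
connectAttr ($posi + ".positionY") ($loc[0] + ".translateY");
connectAttr ($posi + ".positionZ") ($loc[0] + ".translateZ");

Wrapping that in a loop over pairs of selected edges would give the select-and-go workflow for sticky lips; the original rivet.mel also adds an aim constraint so the locator picks up rotation.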

AdamEisfeld
02-15-2011, 08:29 PM
Thanks for the input, to both of you.

I'm a little rusty on my MEL, but I'm trying my best to work my way through the script and hopefully get a beta version running by tomorrow night. To simplify things for myself, I'm cutting down on some of the automation and flexibility to begin with, and then I'll enhance it from there once I have a proof of concept working.

So right now I'm aiming for something along the lines of:

- Import your base head shape
- Import all of your blend shapes / apply them to the base head shape
- Create a controller (could be a NURBS curve or polygonal object) and position it on the surface of the face where you want the control
- Click a "create control point" button
- With the new control point created, select a vertex on the face to be the main vertex this control point will affect / follow.
- Select surrounding verts by hand and add them to the control point's "affected verts" list.
- Click "Build Control Point"

Clicking "Build Control Point" will then start the automated process: the script runs through all of the blend shapes applied to the selected mesh and cuts them up among the control points. Any verts on the head mesh that aren't controlled by a control point will be left alone; any that are will be zeroed out to remove their blend shape deformation, since that deformation is shifting to the individual control points instead.

With that in place, the system will then continue on to build the additional six XYZ blend shapes for the control points.

Next up, the system will add attributes to the head mesh to control each of the initial blend shapes the user created before running the script. These attributes will be connected to all of the control points to turn those shapes back on and off.
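
That re-connection step is pretty small in MEL. A sketch, assuming the per-point copies of a smile shape ended up on a blendShape node "faceShapes" with weight aliases like "smile_ctrl01" (all hypothetical names):

// Hypothetical setup: per-control-point smile targets were added to the
// "faceShapes" blendShape node with weight aliases "smile_ctrl01".."smile_ctrl03".
string $head = "head";
string $bs = "faceShapes";
string $pointTargets[] = {"smile_ctrl01", "smile_ctrl02", "smile_ctrl03"};

// One master attribute on the head turns the whole smile back on.
addAttr -ln "smile" -at "double" -min 0 -max 1 -dv 0 -k 1 $head;

// Fan it out to every per-point copy of the shape.
for ($t in $pointTargets)
    connectAttr ($head + ".smile") ($bs + "." + $t);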

The script will then create an expression that will a) constantly snap each control point's CVs to its associated main vertex (so the points follow along with the face as it deforms) and b) cause each control point's affected vertices to move in 3D space, via their blend shapes, in relation to the control point's translation (clamped between -1 and 1 on each axis).
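
One way to get part (a) without dirtying the control's own translate values would be to drive a padding group above the control from the main vertex's world position inside the expression; the group absorbs the follow, and the control's local translate stays free to drive the blend weights. A sketch, assuming a control "lipCtrl" parented under a world-level group "lipCtrl_grp" and head.vtx[42] as its main vertex (all hypothetical):

// Build the follow expression as a string, then create it.
// Names are placeholders; calling pointPosition every frame is slow but simple.
string $e = "float $p[] = `pointPosition -w head.vtx[42]`;\n";
$e += "lipCtrl_grp.translateX = $p[0];\n";
$e += "lipCtrl_grp.translateY = $p[1];\n";
$e += "lipCtrl_grp.translateZ = $p[2];";
expression -n "lipCtrl_follow_expr" -s $e;

A node-based attachment (follicle or rivet-style) would evaluate faster, but the expression keeps the idea readable.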

If I can get this base system set up, I think I'll be in good shape to keep working on it and add things like soft modification and whatnot.

AdamEisfeld
02-15-2011, 08:45 PM
One question I do have, however, is this:

All of these kinds of setups, be it wire deformers, blend shapes, or clusters on the face, come with the problem of the deformation controller not moving with the surface of the geometry. What I mean is, say I have a wire deformer set up for the eyebrow of a character, and I also have a blend shape that pushes the eyebrow down into an angry shape (just an example). Using either of these on its own would work fine, but if I were to turn on the angry-brow blend shape, the wire deformer would not follow along with the surface geometry of the eyebrow as it pushed down, correct?

Essentially, the entire point of my system at its most basic level is to solve this: the ability to have multiple deformers interacting while the deformers still follow along with the surface of the mesh. Currently the only way I can imagine doing this is to have the control vertices of the control point follow the surface of the mesh, since that makes the visual representation of the control stick to the face without actually translating the controller's transform (which would introduce additional deformation). Is there some other way to accomplish this that I'm unaware of?

nottoshabi
02-15-2011, 09:26 PM
LOL. Welcome to facial rigging; it's a cluster $%^& of a job. It's challenging and exciting and totally fun, sometimes. LOL. As for the controls moving as the face deforms: you have to use a dummy geo and attach the controls to that geo. That's why in my first post I said to make tools that help you create one aspect of the rig, then another and another. You will not be able to create an auto-rig like we do for bodies, because you have to deal with the geo directly, and all geo is different. For bodies, a biped is a biped; it just depends on whether you want stretchy arms or not, an IK/FK switch or not. Faces are totally different. I recommend reading this http://forums.cgsociety.org/showthread.php?f=54&t=20832 before you do anything, or even think in a facial rig direction.

Good luck.

AdamEisfeld
02-16-2011, 03:34 AM
Just a quick update: I've figured out everything I needed to get this system working, and I did a quick brute-force test to make sure. From my tests I shouldn't have any problems; I'm able to apply base blend shapes to a face model, create control points (out of polygonal or NURBS objects), attach these controls to verts of the face, have the controls follow along as the face deforms, and have the controls translate to further deform and tweak the face.

However, I'm writing an external program to handle the majority of the work. I need access to the individual verts of the mesh along with some other features. I know some of what I need can be done via MEL, but the time spent working out the MEL code with my rusty background in the language would be better spent writing an external 3D program in a language I'm more comfortable with.

So, this is how I'm planning on having things work:

- Launch my external program
- Import your base face mesh
- Import a blend shape representing the deformation you want the control point to apply to the mesh when it is translated along the surface of the face (for example, the left corner of the mouth raising up)
- The program then automates the process of creating individual XYZ blend shapes for the imported blend shape
- When the program is done creating the blend shapes, the control point is placed on the vert with the largest difference in position between the imported blend shape and the base face mesh (a MEL version of that search is sketched after this list)
- The user then simply specifies things such as which axes to enable for movement of the control (for example, you could limit the control to only translate up and down for something like the previously mentioned lip control), and clicks an export button.
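
The largest-delta search mentioned above is small enough to sketch in MEL, assuming a base mesh "head" and an imported target "lipCorner_target" with matching vertex order (both names hypothetical):

// Find the vertex that a target shape moves the furthest from the base --
// a reasonable spot to drop the control point. Assumes the head is at rest pose.
string $base = "head";
string $target = "lipCorner_target";

int $counts[] = `polyEvaluate -vertex $base`;
int $bestVert = 0;
float $bestDist = 0.0;

for ($i = 0; $i < $counts[0]; $i++)
{
    float $a[] = `xform -q -os -t ($base + ".vtx[" + $i + "]")`;
    float $b[] = `xform -q -os -t ($target + ".vtx[" + $i + "]")`;
    float $d = mag(<<$b[0] - $a[0], $b[1] - $a[1], $b[2] - $a[2]>>);
    if ($d > $bestDist)
    {
        $bestDist = $d;
        $bestVert = $i;
    }
}
print ("Largest delta at " + $base + ".vtx[" + $bestVert + "] (" + $bestDist + " units)\n");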

Upon export, a file is generated that a simpler MEL script I'm writing can then read, using the data to create the necessary controls and blend shapes within Maya automatically.


Current Progress on the external program:

Currently I have programmed a 3D viewport with camera controls that let the user rotate, pan, and zoom around their model. I'm now working on importing .obj models into the scene. (I'm working with a DirectX wrapper, so if I can't figure out how to import .obj files I'm going to have to stick with the native DirectX .x model format, meaning users would need to convert their models to .x files before importing and reconvert to .obj on export. However, I think I'm close to getting .obj import working, so no worries.)

Hopefully I can finish this and it'll be of some use!
- Adam Eisfeld

AdamEisfeld
02-17-2011, 01:35 AM
Update: (Almost done!)

Well, I scrapped my initial work on the external program, mostly because I was having issues with loading .obj files (the structure isn't too complicated, but I just don't have the time to keep fiddling with it), but also because I figure this will be more useful if it can all be done inside Maya.

I'm very close to finishing the first version of the script; currently this is what I have done:

- Import a base mesh
- Create a blend shape for this mesh representing the area you want to place a control on and the general movement of the surface you want the control to... control :P
- Create a control (can be a mesh or a NURBS object)
- Select the blend shape, then the base mesh, then the control
- Click a button and the script automatically creates three blend shapes from the original blend shape, representing vert movement on the X, Y, and Z axes respectively (sketched after this list)
- The script then performs the necessary grouping and constraints, creates the expressions for the control, and ties everything together. You can now move the control around in the viewport to deform the mesh, and the control sticks close to the surface of the mesh even when other deformations are applied; for example, you could hook up a blend shape that opens a character's mouth, place one of my controls on the lower lip, and the control would follow along as the mouth opens while still letting you translate the lip control across the surface of the face to further fine-tune the lip itself.
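
The X/Y/Z split mentioned above boils down to copying one axis of each vertex's offset at a time. A minimal sketch, again with hypothetical names ("head" as the base, "lipCorner_target" as the sculpted shape):

// Split one sculpted target into three single-axis targets.
// Assumes matching vertex order and the head at its rest pose.
string $base = "head";
string $target = "lipCorner_target";

int $counts[] = `polyEvaluate -vertex $base`;
string $xShape[] = `duplicate -n ($target + "_X") $base`;
string $yShape[] = `duplicate -n ($target + "_Y") $base`;
string $zShape[] = `duplicate -n ($target + "_Z") $base`;

for ($i = 0; $i < $counts[0]; $i++)
{
    float $b[] = `xform -q -os -t ($base   + ".vtx[" + $i + "]")`;
    float $t[] = `xform -q -os -t ($target + ".vtx[" + $i + "]")`;

    // Each axis shape keeps the target's offset on one axis only.
    xform -os -t $t[0] $b[1] $b[2] ($xShape[0] + ".vtx[" + $i + "]");
    xform -os -t $b[0] $t[1] $b[2] ($yShape[0] + ".vtx[" + $i + "]");
    xform -os -t $b[0] $b[1] $t[2] ($zShape[0] + ".vtx[" + $i + "]");
}

Negative directions can then be handled either by letting the target weights run below zero or by building mirrored copies of each shape.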


Oddly enough, I've been doing a lot of testing, and it seems my system fails if the blend shape you select has had its vertices moved around with Soft Select turned on. I'm not sure why, but I'm looking into it and hopefully I can work out a fix soon.

Other than fixing the Soft Select issue and cleaning up the code a bit, all that's really left is to create a more intuitive GUI and allow for better customization by the user, and I should be done with the beta version of this script.

- Adam Eisfeld
