WIP: Head, phonemes, animated


i’m almost finished now. i’ve added asymmetry to most of the shapes that needed it and consolidated some of the controls.

i’ve uploaded a .blend, if anyone wants to take a look.



Whoa! Fantastic job! I’m gonna play around with it later when I have more time.


if the .zip file doesn’t work then try this link: http://www.merit-display.com/blendshapes/Asymmetry_Head_Rig.blend

windows sometimes has problems with mac os x generated zip files.


Zip file worked OK.

I was wondering, once you have your facial rig set up as you do, do you then record the location of the manipulators for your animation or do you somehow record the actual morph targets?


in this setup, one would keyframe the location of the controllers. under the old system, before blender gained the ability to have ‘driven’ IPOs, one would have to keyframe the RVKs’ IPO manually.

it is still possible to keyframe that way, but it’s a cumbersome method of working, and the reason why there hasn’t been much character animation done in blender to date.
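The driven-IPO idea can be sketched in plain Python. This is purely an illustration, not Blender's actual IPO code: it assumes the simplest case of linear interpolation between keyframes, which is what an IPO curve does between keys in its most basic interpolation mode.

```python
def ipo_value(keys, frame):
    """Evaluate a keyframed channel at a given frame, using linear
    interpolation between keys (the simplest IPO interpolation mode).
    `keys` maps frame number -> channel value."""
    pts = sorted(keys.items())
    if frame <= pts[0][0]:          # before the first key: hold first value
        return pts[0][1]
    if frame >= pts[-1][0]:         # after the last key: hold last value
        return pts[-1][1]
    for (f0, v0), (f1, v1) in zip(pts, pts[1:]):
        if f0 <= frame <= f1:       # interpolate between surrounding keys
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# keyframe the jaw controller's location: closed at frame 1, open at frame 11
jaw_loc = {1: 0.0, 11: 1.0}
print(ipo_value(jaw_loc, 6))  # 0.5 - halfway open at the midpoint frame
```

With drivers, you only keyframe curves like `jaw_loc` on the controllers; the shape values are then derived automatically, instead of being keyed by hand on every RVK curve.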


first of all, sorry for my English.
I’m working on a similar project,
which is why I can really appreciate your work.
I have applied the drivers directly on the face;
here you can see an example



gillan, this is quite interesting. it looks like a custom python interface. i would have no idea how to go about creating such an interface as my coding skills are nonexistent.

but i am interested in the use of objects on the face to act as controls. i’ve seen orange do this, but no real explanation of how to go about it.


no script
the script that made the shape driver just does a set of operations that can be done directly, without a script, too.
you select your mesh
in the IPO window
you go to the “Shape” section
select a key shape
press “n”
in the panel, look at:
OB: the object that drives the target (you can now use any object you want)
Loc: the axis that drives it
[Min] and [Max]: the range of the effect - now it can be 0.5 or 2 as well, not only 1
look at the shape of the IPO curve too
now you can create whatever drivers you want (model them directly as meshes)
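The OB / Loc / Min / Max setup described above boils down to: read the driving object's location on one axis, map it through the driver curve, and clamp the result to the key's [Min, Max] range. A minimal sketch in plain Python (the function name and the 1:1 linear curve are assumptions for illustration, not Blender code):

```python
def shape_key_value(controller_loc, axis, key_min, key_max):
    """Map a controller object's location on one axis to a shape key
    value, clamped to the key's [Min, Max] slider range.
    A 1:1 linear driver curve is assumed for simplicity."""
    raw = controller_loc[axis]               # the driving channel, e.g. Loc X
    return max(key_min, min(key_max, raw))   # clamp to the slider range

# moving the controller 0.35 units along X gives a key value of 0.35
print(shape_key_value((0.35, 0.0, 0.0), axis=0, key_min=0.0, key_max=1.0))  # 0.35
# past Max the value is clamped - here Max has been raised to 2.0
print(shape_key_value((2.5, 0.0, 0.0), axis=0, key_min=0.0, key_max=2.0))   # 2.0
```

Raising Max above 1 (or Min below 0) is what lets a driven shape overshoot its modeled extreme, as mentioned above.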



This is very good! How do you get the Min/Max to show up? I tried the I-key, but it didn’t show the Min/Max fields.


in the IPO window, Shape section, select your key shape
press n

click “add driver”
press i

select the curve
press tab and set the min and max values

another tip:
I’ve found it’s better to use bones instead of objects or the shape sliders;
that way you can use them as actions in the non-linear editor.



What are the green and red cross objects that you are using to change the face while in object mode? How do you set them up?


they’re just objects I modeled previously



Would you mind sharing your blend file? I’d like to look at how you set up the facial rig.

In particular, I’d like to know how you set up your green and red objects to only affect certain vertices as shown in your video. Are they hooks or just IPO drivers? They act like proportional editing (O-key), but in object mode.

TroutMaskReplica - I hope you don’t mind me asking gillan so many questions in your thread. I’ve learned quite a bit from both of you so far, and I appreciate all that you are sharing.


i’m enjoying the discussion. most of the questions you are asking are ones that i’ve had myself but haven’t had the time to post about.

i tried setting up on-face controls, but the problem i have is: say i have a controller for the jaw_open shape, and i put that on the chin, and i have another controller for lowerLip_Dn, which i place on the lower lip.

now, if i grab the jaw_open controller and move it down to open the jaw, the lowerLip_Dn controller stays where it is, and is no longer positioned over the area it is supposed to affect. but in the video gillan posted, the controllers move properly. (edit: i guess i didn’t watch the video closely enough the first time, because i just watched it again, and the controllers don’t move).

i kind of solved it by vertex parenting the controllers to the mesh, sort of like pinning them to the skin, but the vertex parenting seems to be incompatible with my armature, as after i did this the eye bones no longer had any effect on the eyelids.

i have a feeling this is not actually the best way to do this.


only drivers! the project I’m working on is not only mine, but I think in the end it will all be shared

look more closely!
I have the same problem you do

I’m trying to resolve it with bones (e.g. the jaw_open controller is the parent of the lowerLip_Dn controller)
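The parenting fix can be sketched abstractly: if the lowerLip_Dn controller is a child of the jaw_open controller, its world position is the parent's world position plus a local offset, so it follows the jaw automatically when the jaw controller moves. A toy sketch in plain Python (translation only; real parenting in Blender also handles rotation and scale):

```python
class Controller:
    """Toy transform node: world position = parent's world position + local offset."""
    def __init__(self, name, local_offset, parent=None):
        self.name = name
        self.local = list(local_offset)
        self.parent = parent

    def world(self):
        if self.parent is None:
            return tuple(self.local)
        px, py, pz = self.parent.world()
        return (px + self.local[0], py + self.local[1], pz + self.local[2])

jaw = Controller("jaw_open", (0.0, -1.0, 0.0))
lip = Controller("lowerLip_Dn", (0.0, -0.2, 0.1), parent=jaw)

print(lip.world())   # (0.0, -1.2, 0.1) - lip controller sits near the jaw controller

# open the jaw: move its controller down 0.5 units along Z
jaw.local[2] -= 0.5
print(lip.world())   # (0.0, -1.2, -0.4) - the lip controller followed automatically
```

This is why an unparented lip controller gets left behind when the jaw opens, while a parented (or bone-based) one stays over the area it affects.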


aarrgh! finally had time to do a sync test. this is a ridiculous first attempt. nothing is right, but i thought i’d post it anyway to show this project is still alive.

the line of dialogue is from the ricky gervais podcast. (yes, i am going to redo it).

http://www.merit-display.com/blendshapes/Terrible_sync.mov (quicktime, 100kb)



Thanks for posting the blend file !

Fun to play with

Just wondering about the ‘lines’ that show up on the head in shaded view. Is there a way to turn them off ?



select the head mesh in object mode, go to the object subcontext buttons, deselect the ‘wire’ button.


Thx, I looked for ‘wire’ on the panels … didn’t look hard enough :slight_smile:


