Technique behind the Neckling


Finally I got around to finishing up this post to present to you the techniques behind the Neckling. In this post I keep closest to the m:studio2.0c part of the Neckling; if you want to see more about the ZBrush involvement, have a look at my post on ZBrushCentral.

For more images of the Neckling visit my website:

First a look at the Shaderflow of the skin:

As you may notice, the shaderflow itself is very simple and consists of only three major sections: the displacement section, including the ZBrush displacement maps and “TextureDeform” with its attached image maps; the skin shading, done with the “BasicShader” with a color map and a camera fresnel attached; and, additionally, the “Anisotropic” highlights.

The first section of the BasicShader is very common; besides the typical parameters like diffuse, specularity, glossiness and so forth, it has additional tinting parameters to control not only the tinting of reflections and highlights, but also the tinting and sharpness of radiosity. The skin shader has a camera fresnel on the tinting of the radiosity to simulate the oily properties of skin more convincingly by reflecting more of the original color of the surroundings at glancing angles.
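The fresnel-weighted tinting described above can be sketched roughly like this. This is only a conceptual illustration in Python using a Schlick-style approximation; the function names, the bias and power values, and the blending scheme are my assumptions, not messiah's actual internals:

```python
def fresnel_weight(view_dot_normal, bias=0.04, power=5.0):
    """Schlick-style approximation: weight rises toward 1.0 at glancing angles."""
    return bias + (1.0 - bias) * (1.0 - abs(view_dot_normal)) ** power

def tint_radiosity(radiosity_color, surround_color, view_dot_normal):
    """Blend the shaded radiosity toward the raw surrounding color at glancing angles."""
    w = fresnel_weight(view_dot_normal)
    return tuple((1.0 - w) * r + w * s
                 for r, s in zip(radiosity_color, surround_color))

# Facing the camera: almost entirely the shaded skin color.
print(tint_radiosity((0.8, 0.5, 0.4), (0.2, 0.3, 0.9), view_dot_normal=1.0))
# Glancing angle: dominated by the surrounding color, giving the oily sheen.
print(tint_radiosity((0.8, 0.5, 0.4), (0.2, 0.3, 0.9), view_dot_normal=0.05))
```

The key point is simply that the mix weight depends on the viewing angle, so edges pick up more of the environment than surfaces facing the camera.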
The second section of the BasicShader deals with translucency. It has several different modes, ranging from linear shading (ordinary) through powered shading to extended shading types (volumetric translucency), and even SSS that rolls in radiosity for fast performance.
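A very rough idea of the difference between the linear and powered modes mentioned above, purely as illustration (the names and the simple power curve are my assumptions; the actual modes are certainly more involved):

```python
def translucency_linear(light_through):
    """Ordinary mode: transmitted light scales linearly with what passes through."""
    return light_through

def translucency_powered(light_through, power=2.0):
    """Powered mode: a power curve concentrates the glow where transmission is strong."""
    return light_through ** power

# Compare the two response curves at a few transmission values.
for x in (0.25, 0.5, 1.0):
    print(x, translucency_linear(x), translucency_powered(x))
```

The powered variant darkens weakly lit areas while keeping strongly backlit areas bright, which is why it reads as a softer, more skin-like glow than a plain linear falloff.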
To explain the “shady” nature of all of the Xtended modes I prepared a little sheet, going through the parameters with very brief descriptions:

Obviously the most unusual element of the Neckling is the animated displacements. With the coming release of the new Studio2.0c, several new features will find their debut, amongst them: “TextureDeform”!
TextureDeform allows you to manipulate textures, image maps as well as procedurals, by either deforming space, relocating space or even remapping space, with another addition that even generates its own textures specifically designed to work with skeleton systems.
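As a rough illustration of the “deforming space” idea (a conceptual sketch in plain Python, not messiah's API; the checker texture and the `bulge` offset field are invented for the example): a deformer shifts the coordinate at which a texture is sampled, so the pattern appears to slide or bulge without the image itself ever changing.

```python
def checker(u, v, scale=8.0):
    """A simple procedural texture: returns 0.0 or 1.0 in a checkerboard."""
    return float((int(u * scale) + int(v * scale)) % 2)

def deformed_sample(texture, u, v, offset_map, strength=0.1):
    """Sample 'texture' at a coordinate displaced by 'offset_map' (space deformation)."""
    du, dv = offset_map(u, v)
    return texture(u + du * strength, v + dv * strength)

# A hypothetical offset field: pushes lookups away from the center.
def bulge(u, v):
    return (u - 0.5, v - 0.5)

print(deformed_sample(checker, 0.3, 0.7, bulge))
```

Animating `strength` (or the offset field itself, e.g. by driving it from bones) is what makes the texture move over the surface frame by frame.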
In order to show you some of the most significant uses of TextureDeform on the Neckling I prepared a little step-by-step sheet…it’s rather coarse, but it should bring the basics across:

To render these quick frames I reduced the radiosity’s GI samples to 4 and came to a rendertime of 32 sec per frame, including the full shading. Pentium 4 at 2.6GHz, 2GB RAM.

Just to mention it, in case you haven’t seen any other posts on the Neckling: what you are looking at is 340 polygons, displaced at rendertime, with maps generated in ZBrush 2.0.

This should give a good overview of the messiah section, but feel free to ask me if you’d like to know more specific things.

I’m excited that so many people are realizing the power of all of this, helping me to believe that I wasn’t insane after all when I started dancing after the development of TextureDeform and the discovery of the endless power between ZBrush and messiah:Studio.





Thanks for sharing your knowledge, that’s… well, very cool


hi taron,

what’s the min/max shader for in the bump map?

what is the workflow of your bones?

(how can I send you a private message? It says your inbox is full; can I email you somewhere?)


Hey Pelos, I think I received an email from you, wasn’t that…I thought I replied; excuse me if I didn’t…I cleaned up my private messages…it would surprise me if they were full again!

However, min/max allows you to remap 0.0 - 1.0 values to anything you want. In our case we want the displacement to go out AND in, so we remap it to -1.0 - 1.0. This puts 50% gray at 0.0 displacement; black becomes -1.0 and white becomes 1.0. Since every node in a shaderflow takes time to compute, the min/max node has ten independent channels, taking care of more remapping than anyone is likely to need without adding another node. To save even more time, each channel can be deactivated…it’s quite nice! :slight_smile:
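The remapping described here is just a linear transform; a minimal sketch in plain Python (not the actual node, whose internals I don't know):

```python
def min_max_remap(value, new_min=-1.0, new_max=1.0):
    """Remap a 0.0-1.0 input linearly into the [new_min, new_max] range."""
    return new_min + value * (new_max - new_min)

print(min_max_remap(0.0))   # black    -> -1.0 (displace inward)
print(min_max_remap(0.5))   # 50% gray ->  0.0 (no displacement)
print(min_max_remap(1.0))   # white    ->  1.0 (displace outward)
```

This is exactly why mid-gray in a ZBrush displacement map ends up meaning “no displacement” after the remap.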

I don’t understand the bone question. I add a skeleton system and then joyfully bone away. Depending on the current needs I’ve come to a bunch of different approaches for bone setups, but because it is so easy and flexible in messiah I love to experiment from time to time…the fully integrated muscle bones really added another level of convenience for setting up the most wicked constructions with great ease. That is particularly great ever since TextureDeform…setting up one side with whatever amount of bones and simply mirroring it, with the targets mirroring automatically too, is a very comfortable power to have. I had a total blast setting this guy up, and it took roughly half an hour for it to be completely finished, and I mean done-done! :smiley:

With TextureDeform it also becomes a lot more pressing to make use of the groups, which work gloriously. It’s actually a total blast all the way through, I have to admit, because it feels so neat and clean. Despite the impression that wicked little image showing all the bones at once may make, it actually is the easiest thing in the world to weed through them…eh…wrong word…no need to weed through them…just select the group and nothing else gets in the way. Good naming is recommendable, however…on this one I have to step up and down sometimes to make sure I’m on the right bone…but then, once the muscles are rigged for the shading, you kinda don’t have to touch them anymore anyway…so that’s twice as cool…no problem if you have to, but you don’t need to…ha! :smiley:


You have quite the beast there… I’ll now soak it all in.

Thank You … JediMaster


You should put this on the messiah website. And when you render your images, is radiosity always on?


Thanks Taron, I think I can mimic the neck muscles with Maya and PRMan now. The technique seems really nice from a skin-sliding point of view; in this regard its effectiveness is tied to how the muscles are linked to the rig. The TextureDeform map can probably modulate this.

However, my concern is that the look of the creature is once again being split off from ZBrush. The great thing about ZBrush is that we can model independently of the topology that is going to be animated; with the TextureDeform neck setup, however, the ZBrush sculpt can’t take the neck muscles into account, as they have to be added later.

It would be nice if we could somehow derive the neck muscles from a fully ZBrushed head.


Great stuff Taron! Glad you are on the pmG team. Looking forward to 2.0c.



Thank you very, very much for the detailed explanations :buttrock:

As soon as I have some more free time I need to start experimenting with this stuff in messiah and really get into its renderer, and you are making me wish for lots more free time to play hehehe :smiley:


thank you Taron for this amazing explanation !!


You have inspired me more than I can ever express. Thanks for all this great work and explanation.

Mark Johnson


I have a million questions… Taron, do you think you and pmG could include this scene with the release of 2.0c? Your explanation is great, but there is nothing like playing with the settings yourself to really understand what is going on.


That would be great! I’m a new user of messiah, and studying a scene like that would be great!

(sorry for my english)



:scream: great !!!

Thanks for the info, maks ; )


Hehe, you’re welcome :slight_smile:


I e-mailed you but I don’t know how often you check your e-mail, so I’ll ask it here as well. I am very interested in applying this technique in gaming at a realtime level. Do you think it produces a big hit on resources? I am looking at 2007/8 DX10 hardware and 2048x2048 texture and displacement maps. I don’t know how much you know about it, but if you can share anything it would be greatly appreciated.

It largely resolves a problem I’ve been trying to solve for a while now: how to animate textures and normal/displacement maps to give the effect of muscle tension when you move your arm or chest, and of wrinkles in faces when you laugh or frown. With an advanced body-language and facial animation system it could produce stunning animations that would look very much alive. Imagine bones with this that can be manipulated in realtime by physics, for example bellies or tentacles that create folds as they bend, or steel pipes that bend outwards when hit by something heavy, creating realistic-looking metal folds. It could really raise the level of realism and detail you could apply to characters in games.

Exciting technique that has, in my opinion, quite a bright future


Didn’t understand a thing, but then, that’s me… lol.

So you work out separate parts like the Adam’s apple, export only that specific part from ZBrush, and then overlay it onto the proper part of the UV??? This is way beyond me. hahah…


He basically connects the new map to the bones, so when you move or stretch the bones in relation to the static ones, it gradually morphs the pixels of the original map into the new map, which gives the illusion that the neck is putting tension into the muscles on the left side when you tilt your head to the right.

correct me if I am mistaken, Taron
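If that reading is right, the bone-driven blend could be sketched like this (purely illustrative Python with invented names; how messiah actually measures stretch and mixes the maps is not stated in the thread):

```python
def stretch_factor(rest_length, current_length):
    """0.0 at rest, approaching 1.0 as the bone stretches (clamped)."""
    return max(0.0, min(1.0, current_length / rest_length - 1.0))

def blend_maps(relaxed_pixel, tensed_pixel, factor):
    """Linearly mix the relaxed-muscle map toward the tensed-muscle map."""
    return (1.0 - factor) * relaxed_pixel + factor * tensed_pixel

f = stretch_factor(rest_length=2.0, current_length=3.0)  # bone stretched by 50%
print(blend_maps(0.5, 1.0, f))  # displacement value halfway toward the tensed map
```

Applied per pixel across the displacement map, this kind of mix would make the muscle detail fade in smoothly as the neck bones stretch.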


Hmm, thanks… Is it a “morph” or just a fade? If it is a fade, it could be emulated via Expressions in LightWave…

-I also wondered about the isolated displacement maps out of ZBrush: are they really separate parts, or do they cover the whole model, with everywhere other than the displaced part being 50% gray?