Technique behind the Neckling


#26

I know Taron has mentioned the possibility of this becoming a tutorial to play with. Would this be for newbies or advanced users? Would it cover the blank head stage, the rig stage, and then the application of the material?

It would be nice, because it could sway a decision to upgrade from Animate 5 to messiah 2.

Of course, I hope someone will review this tutorial if and when it arrives.

Thanks


#27

I guess that's one of the advantages of having the render engine linked at the basic level to the geometry engine, i.e. that you can access the neck muscles when displacing the skin.

Trickier to implement in something like PRMan, where you only have access to the local surface data.


#28

Great work as usual Taron.

Thanks for breaking down the Basic Shader properties. Very helpful and enlightening.

You should post a before and after pic of the mesh for those who have not seen it. :slight_smile:


#29

This work is simply breathtaking.

I thank you for the detailed explanation of your technique. Looking forward to more of your experiments, studies and findings.

Truly inspiring.


#30

Thanks for all the information :smiley:

Yay… I watched the movie at zcentral… Amazing!

@Taron: One little suggestion though: the skin slides over the stylomastoid muscle and Adam's apple perfectly, but when the eye closes, the eye-rims move down along with the eyelids. It would be really nice if the eye-rim (created by the frontal bone of the head) stayed put and only the eyelids moved, with the skin sliding over the eye-rim. Maybe if you just added some polygons over the top of the eye this would be easier to achieve, but I guess you deliberately over-simplified the base mesh to emphasize the power of displacement mapping…

@Evil: Do you know for sure that DX10 does displacement of real-time, on-die subdivided vertices?
Currently, at the DX9 level, ATI does support subdivision, but it is much too limited (skinning has limitations, and vertex shaders cannot do any texture fetches). Nvidia's latest hardware can do texture fetching in vertex shaders, but I don't know whether it would displace the on-die generated vertices; I suppose it would only displace the main vertices. I don't know whether the DirectX 10 specs will require hardware that can displace subdivided surfaces, though. Maybe it is better to try things with manipulating normal maps for facial expressions.


#31

Taron, do you think render shaders on polys would work, or would they kill render times, if they were to be baked? I know environments in games and architectural rendering have improved because of baking, but what about SSS, would that help? Maybe four image maps that would alter themselves according to the angle of the camera, blended with a nice gradient, so you wouldn't get a constantly lit side. Just babbling!
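The camera-angle blending idea floated here can be sketched quickly. This is purely a toy illustration of the babbling, not anything from messiah:Studio or Max: it assumes four maps baked for the front, right, back, and left views, and computes normalized blend weights from the camera's azimuth so that the visible side always gets the dominant map, with a smooth gradient in between.

```python
import math

def view_blend_weights(camera_azimuth_deg):
    """Toy sketch: blend weights for four baked light maps
    (front, right, back, left) based on the camera's azimuth.
    Each map contributes by the cosine of the angle between the
    camera direction and the direction it was baked for,
    clamped to zero and normalized so the weights sum to 1."""
    map_azimuths = [0.0, 90.0, 180.0, 270.0]  # front, right, back, left
    raw = []
    for a in map_azimuths:
        delta = math.radians(camera_azimuth_deg - a)
        raw.append(max(0.0, math.cos(delta)))  # back-facing maps drop out
    total = sum(raw)
    return [w / total for w in raw]
```

At azimuth 0° the front map dominates completely; at 45° the front and right maps each contribute half, which is exactly the gradient blend the post is asking about.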

and lets not get started on normal baking.


#32

Hey hesido, you’re perfectly right!

Hey Julez, baking SSS would be just as useful as baking radiosity, you're perfectly right. As long as people don't bake moving objects, there should be no problem. SSS is very sensitive to lighting changes, more than radiosity I would say, because the impression it makes really depends heavily on the lighting situation, while radiosity is a bit more forgiving due to its spread. But that's a really funky statement…in other words: sure!

Thing is, if you have a radiosity lit scene and shine a light across it, it may still work almost fine, but an SSS object would totally not work, I’m afraid.

Thanx for all the replies, I’m really happy to see such a reception. That’s better than what I could have wished for…great! :slight_smile:


#33

Taron, is SSS generated by lightsources only, or by irradiance AND lightsources ?


#34

Panikos, hey, the ordinary “Extnd” lights (volumetric translucency) only gather info from lights, but Extnd radiosity not only incorporates irradiance but actually reads the scattered illumination, including lights, as well.

So: YES (if you choose the proper mode ( Extnd radiosity ))

The obvious drawback is the additional rendering time, although it is stunningly fast considering what it does. I also need to give it some more attention eventually to add even more possibilities, as I can think of plenty of substantial responses that would make it even more attractive. Although I do love the simplicity we’ve got with it now: results are super fast and control is extremely comprehensive.

Now, I shall wake up and start my day… :wink:


#35

Taron, thanks for the reply.
/me excited :bounce:


#36

Bounce… I am covering the Extnd lights now.


#37

Great technique. Can this be transferred over to Max?


#38

I hope Xmas finds us all very happy with 2.0c.
Waiting with excitement :slight_smile:


#39

Ambiguous wording – you’ve asked two different questions!

Understanding that I’m a Max user with no experience in Messiah, take these answers with a grain of salt…

1 - Can the same techniques be used in Max instead of Messiah?

Depends. Are you a scripting god? It’s doable, but it wouldn’t be easy. If your time is worth money, you’d get better value from buying Messiah.

Plus, assume that something enabling this will be built into Max 8, and weigh your efforts now accordingly – is it worth your time to pre-invent the wheel? Likewise, understand that the Max implementation will be less intuitive to use and set up, whether you build it or Discreet does.

2 - What of Messiah-as-Plugin? Can the animation you’ve created be brought into Max?

This requires either a material plugin (reading in Messiah’s displacement at every frame), or that your dynamically generated displacement itself be rendered to sequential images and applied in that form. Assuming one of those abilities is provided, it shouldn’t be too hard.
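The second route, baking the dynamically generated displacement to sequential images, can be roughed out as follows. Everything here is hypothetical (the `displacement_fn` callback stands in for whatever Messiah evaluates per frame, and is not a real API): the sketch simply samples a per-frame displacement function over UV space and emits one 8-bit grayscale image per frame in plain PGM format.

```python
def bake_displacement_sequence(displacement_fn, frames, size=64):
    """Sketch of baking dynamic displacement to an image sequence.
    `displacement_fn(u, v, frame)` is a stand-in callback returning a
    displacement value in [0, 1]; each frame is sampled over a
    size x size UV grid and stored as a plain (P2) PGM string."""
    images = []
    for frame in range(frames):
        rows = []
        for y in range(size):
            row = []
            for x in range(size):
                u, v = x / (size - 1), y / (size - 1)
                d = min(1.0, max(0.0, displacement_fn(u, v, frame)))
                row.append(str(int(round(d * 255))))  # quantize to 8 bits
            rows.append(" ".join(row))
        header = f"P2\n{size} {size}\n255\n"
        images.append(header + "\n".join(rows) + "\n")
    return images
```

In practice each string would be written to disk as a numbered file (e.g. `disp_0001.pgm`) and the sequence loaded in Max as an animated displacement map.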


#40

Very cool technique.
Any way you could use the same technique in Maya?
I am not familiar with other software.

cheers,
chia


#41

Guys, try to understand that you’re asking one of Messiah:Studio’s developers how these techniques he’s pioneered and personally built into that software can be replicated in a competing package. You mean well, but that’s a little disrespectful.

If you want to reverse-engineer this in other software, it’s going to be a challenge. See two comments above this, and substitute “Alias” for “Discreet”.

I mean, you're welcome to try.  Just, don't ask for help with that in the pmG Messiah forum.  See "disrespectful" note.

(and again, if your time is worth money, it’ll be less wasteful to just buy a copy of Messiah:Studio –the workstation edition’s only $299, and I believe it comes with the Neckling work file for you to play with)


#42

Thanx AP, but essentially this type of question almost makes better advertising for messiah:Studio, if you think about it. The reason is the difficulty this would mean for a Maya user, while it is incredibly easy to do in Messiah. From all I’ve learned about Maya, it would actually be impossible for someone who doesn’t program plugins, because MEL script can only go so far, and even if someone figured out a way to do it with MEL, it would be incredibly slow and most likely unbelievably uncomfortable to hook up. In Messiah it is all part of the way it works and therefore extremely natural to hook up. No fancy weird steps that require extensive research in several directions; instead you can simply use all you already know about the works (Bone deform and the Shaderflow…that’s all)…and more so, you can take this and go however far you want to go with it…your creativity is what opens the gate to, well, anywhere you wanna be, anything you wanna see! :slight_smile:
Well, that sounds a little grandiose, but essentially it’s not wrong at all. The concept is that everything’s equal and therefore you can use anything with everything, and if logic permits (sometimes even beyond that, hoho), you can just do it. Yeah, yeah, of course that just goes for what we’ve got in there so far, but there’s more to come in a few weeks from now. I have to finally get my act together regarding tutorials and demos and such, but I’ve actually been working on projects (music videos and commercials) with messiah:Studio, which turned out instantly successful and are yet another proof that we are on the right track. None of those projects even required me to write anything new, which also bothered me a little bit, because I really love it when new things come through a direct purpose. But in the meantime we actually began to finish up another great section, which I will keep secret for just a bit longer…hehe…it’ll be loads and loads of fun and I can’t wait to put that into the next projects as well…you’ll see…it’s beyond characters, that’s all I’m gonna say. :smiley:

However, for our Maya users: all you need to do is create some kind of handle (like a bone, for instance), make it a muscle bone (with a target to point to and something that maintains the virtual volume of that handle), blend between the coordinate systems of the handle and the target, map any texture to it, and have this drive the displacement. Then you’ve got it! :thumbsup: …kind of…then it’s a bit more blending and a bit more finesse with the mapping principles, the combined blending between instances, and of course instancing itself as well…but yeah…that’s basically all…took me three days to figure out and finish up. :slight_smile:
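The recipe above can be sketched in package-neutral pseudo-Python. Every name here is made up for illustration (a real Maya version would live in a deformer or plugin node, not a loose function): a "muscle" axis runs from the handle to its target, a flex value blends between the two frames (reduced here to blending a single point on that axis), and a texture sampled by proximity to the blended point drives displacement along the vertex normal.

```python
import math

def blend(a, b, t):
    """Linear blend of two 3-vectors."""
    return tuple(a[i] * (1 - t) + b[i] * t for i in range(3))

def muscle_displace(vertex, normal, handle_pos, target_pos, flex,
                    bulge_texture, radius=1.0):
    """Toy sketch of the muscle-bone displacement idea (all names
    hypothetical): `flex` in [0, 1] blends between the handle and
    target frames; `bulge_texture` maps the falloff to a bulge
    amount; vertices inside `radius` are pushed along their normal."""
    center = blend(handle_pos, target_pos, flex)   # blended frame origin
    d = math.dist(vertex, center)                  # distance to the muscle
    falloff = max(0.0, 1.0 - d / radius)           # influence fades with d
    amount = bulge_texture(falloff) * flex         # flexing bulges more
    return tuple(vertex[i] + normal[i] * amount for i in range(3))
```

Note this simplifies "blending between coordinate systems" to blending one point on the muscle axis; the setup Taron describes blends the full handle and target transforms, and layers several such instances with their own mappings.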


#43

Hey Taron, will the new basic shader be part of the next rev, or stand-alone?
I know the last one had some boo-boos.


#44

Hi Taron,

this is some of the most inspiring work I’ve seen.
I come from some other software backgrounds, and the lack of readily available SSS is a major drawback.
It seems everyone is behind m:S when it comes to implementing new innovations in animation/rendering.

Thanks for sharing all the knowledge, details and concepts. I’m sure TARON is going to be one of the major keys to messiah:Studio’s increased popularity in the near and (I’m sure) far future!


#45

Anybody know how to do this for Maya?