Shader/lookdev/mymentalray


#5

JESUS CHRIST there has to be a better way!..

Julian…Miguel…HELP!

I got the bright idea that it would make a nice feature in the book …if the texture artist after completing the highly detailed map …easily whip out ZBrush…make a new layer for high frequency disp…and gently inflate the precious surface details from an alpha mask…

until I tried to export the ****ing thing…

  1. I get the alpha depth factor…to correct for the gamma (because ZBrush centers displacement on mid-gray)

  2. I use some ****ing thing called the Alpha Displacement Exporter …to put it into 32 bit disp…that Maya likes…typing in some bullshit serial code…(my porn site has less protocol than this)

  3. Then I set some MR node for the subdivisions…

a couple other steps…and I'm praying that something renders…
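For what it's worth, the math behind step 1 is just recentering the map around mid-gray. A rough Python sketch of the idea (the function name is made up, and the exact encoding ZBrush writes can vary with export settings; only the conversion formula matters here):

```python
# Hypothetical illustration of the "alpha depth factor" correction:
# ZBrush stores zero displacement at mid-gray, so a 16-bit pixel has to be
# recentered and scaled before Maya can treat it as signed float displacement.

def to_float_disp(pixel16, alpha_depth_factor):
    """Map a 16-bit gray value (0-65535) to signed displacement.
    Mid-gray maps to zero; the alpha depth factor sets the overall scale."""
    normalized = pixel16 / 65535.0            # 0.0 .. 1.0
    return (normalized - 0.5) * alpha_depth_factor

# mid-gray stays put, white pushes out, black carves in
print(to_float_disp(65535, 2.0))   # full white -> 1.0
print(to_float_disp(0, 2.0))       # full black -> -1.0
```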

by this time I need a drink…

****ing A…

need some help…there has got to be a better way…this used to be about reading a ****ing black and white map…how hard is that?


#6

hey paul,
that’s a great idea you have. definitely do this! now, by the time you get to that section of the book, mudbox2 will be out and this won’t even be a problem (you can do this in mudbox now, actually, but mb2 is already proving to be a big upgrade)…i know i’m a mudbox slut.

so the workflow will go somefin’ like this:
i’ll send you my highest res mudbox file, you can apply your textures on to it (preexisting or not) add another level of subdivision or two (depending if you’re workin’ with heaps of ram, like 8 gigs and a 64 bit OS), use your greyscale map to displace the mesh/carve in, then inflate your tertiary level skin pores and wrinkles, etc. for that last added sweetener. when that’s all done, we can extract the displacement maps from mudbox2.

NOTE: i have yet to extract displacement maps from mb2. i’m pretty sure it’ll be good. last resort is we extract 2k 32bit floating pt displacement maps from cyslice. i’m wishing on mb2 being a one-stop-shop for sculpting and extracting…but there’s always cyslice, which does a beautiful job extracting maps.


#7

**** I hate Z…

I didn't know you could take a map and pull out the high-frequency detail…from the map…in Mudbox…

I will upgrade my box to 64 bit…just to do this…

we should really stick it to ****ing ZBrush…Aaron Sims can have it…

let's promote Mudd!

any documentation on how to make a mask of a loaded texture map and pull out a disp? (in Mudd?)


#8

ZBrush is not that bad. But either way the mental ray problem is a pain. I never use the mr subdivision node thing, never could get it to work. Also the fact that most machines I've had couldn't handle such high displacement.

I prefer to use a technique I found at Headus ages ago. I export a lower subdivision level out of ZBrush or whatever package you're using. For example, if my mesh has 6 levels, I'll generate a displacement map and normal map from level 3 and export that level 3 into an obj for rendering in Maya.

Seems you can get faster renders if, on the imported level 3 mesh in Maya, you turn off feature displacement and let Maya only push the points according to the map for the silhouette of the model, and let the normal map render all the missing fine detail that would otherwise cause Maya to go nuts tessellating. Maybe not the best method, but it works great for stills, speeds render times greatly, and comes very close to the original sculpt…
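A rough sketch of what turning off feature displacement buys you: no adaptive tessellation, the existing vertices just get pushed along their normals by the sampled map value. Everything below (names, data) is made up to show the idea, not Maya's actual API:

```python
# With feature displacement off, displacement degenerates to a simple
# per-vertex push along the normal; the normal map fakes the rest at
# shading time. Illustrative only.

def displace_vertices(vertices, normals, heights):
    """Move each vertex along its normal by the sampled displacement height."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), h in zip(vertices, normals, heights):
        out.append((vx + nx * h, vy + ny * h, vz + nz * h))
    return out

verts   = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
norms   = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
heights = [0.25, -0.1]                 # sampled from the level-3 disp map
print(displace_vertices(verts, norms, heights))
```

Only the silhouette moves, which is why the render stays cheap while the normal map carries the pore-level detail.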


#9

Do people have all these problems in Mudbox?

I pretty much gave up on 32 bit disp out of Z-brush…

  1. Do you do anything for the Gamma correction? (the -2.2 / 1.1 rule?)
  2. For 16 bit maps do you have to do that conversion thing?
  3. For 16 bit maps do you have to do a node correction for subdivisions? (I remember this being easier years ago)

#10

as far as the high frequency detail? it's your models…your concepts…so …tell me what you want to do …where bump and disp meet…

It would be much easier to keep the high frequency in the bump…no? …do what you guys do and I will follow your lead for whatever creature…

Paul


#11


So I've been out of town on travel and I meant to put this up about your mental ray rendering problem. As I said before, I never use the subdivision node thing MR gives you to approximate the mesh for rendering. Instead I'll bring in a lower rez of my mesh (in this example it was level 3) as the base to start my rendering on. Already, since I'm using a base from the original sculpt, I have some silhouette info there, so I won't have to displace too much of the mesh. First I'll turn feature displacement off on the mesh so it won't cause Maya and mental ray to tessellate. This will only push your vertices out depending on the displacement map that you provide, which is really all we need. Also, I just used an 8-bit displacement texture; it seemed to work just fine.


As for the displacement map texture itself, it needs the correct alpha gain number so you don't push the vertices way too far out. Usually I get this number from ZBrush when it makes my displacement map. But the formula goes like this: your main number goes into the alpha gain, and the alpha offset is the same number divided by 2, and it's always negative.
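That rule is quick to sanity-check in plain Python (the function name is made up; "adf" is just the scale number ZBrush reports with the exported map, and this is presumably where the "-2.2 / 1.1" numbers earlier in the thread come from):

```python
# The alpha gain / offset rule from above: gain = the exported scale,
# offset = minus half of it. Illustrative helper, not a Maya call.

def maya_file_node_settings(adf):
    """Return (alphaGain, alphaOffset) for Maya's file node."""
    alpha_gain = adf
    alpha_offset = -adf / 2.0
    return alpha_gain, alpha_offset

gain, offset = maya_file_node_settings(2.2)
print(gain, offset)                 # 2.2 -1.1

# why it works: a mid-gray pixel (0.5) lands exactly on zero displacement
mid_gray = 0.5
print(mid_gray * gain + offset)     # 0.0
```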


Here is a quick example I made with this method I like to use, which by the way has its origins here: Headus Example. It goes into much further detail and has examples to download so you can see the shading networks. This is the lower base mesh, which I simply subdivided once in Maya with a subdivision node; the poly count is 34,272 polys. The left side is this base level plain on its own, and the right side adds the generated normal map, which deals with all the high-frequency detail we need from the mesh, plus it still uses the displacement map to push out the silhouette to complete the reconstruction of the sculpt. This renders much faster than trying to let Maya subdivide the mesh for displacement, captures those really fine details, and still gives room to include more bump maps or normal maps if needed for other fine details, all handled in the Hypergraph shading network.


High rez screen snap for comparison. I tried to render it in Maya but of course mental ray crashed on me. This sculpt is 548,352 polys, compared to the top's 34,272 polys. You're always gonna lose some detail, but not too much.


Screen snap of the shading network. It's a very simple shading network; mainly the normal map is capturing the fine detail…


#12

Thanks Miguel! Will give it a shot this weekend…btw Joe Alter hasn't gotten back to me yet…Daniel is trying to get a hold of him…

pf


#13

No prob! Hey that's cool, guess it's still not needed yet. Models are still being developed. Maybe we should have backup ideas for hair in case that doesn't go through…

Speaking of, I was rebuilding some topology on a model of mine and I was wondering if anyone is gonna do a section in the book about retopology. This might make a good section. I'm sure Stefano would agree with me in saying that using the NEX plugin for Maya to rebuild topology is pretty sweet!

BustRetopNEX <-- slow but you get the point…

Some other tools of interest…

:Normal Mapping:
Photoshop Normal map Filter
Xnormal
Crazy Bump

:Maya UV Tools:
RoadKill
Zebruv
Pelting Tools


#14

ah, we will get a license for Shave or I will freaking buy one (or Daniel…can ya help a brother out)


#15

Maybe we can pass a donation plate around and buy it…

Found this free screen capture software for anyone interested, works pretty well.
Cam Studio


#16

Good Eye color reference images for look dev…
Eye Ref

Here is also a siggraph video detailing a method for eye rendering…
Eye Rendering


#17

Joe Alter has donated a license of Shave for us…!

Big thanks Joe…

(We need to plug him)

Miguel …Daniel will hook you up with a license…

paul


#18

that is great news, Paul! Definitely we need to plug him… Thanks, you guys rock…


#19

quick question about Mudbox…(was playing around with it this weekend)

I know MB can make a stencil from a photo…a free floating mask (like you would in real life airbrushing)…

but

can it make a stencil derived from a UV map?..ideally load a texture and pull an alpha matte of that?

can MB 2009 pull this off?


#20

Sounds like you're trying to pull that ZBrush trick where you take the greyscale bump and inflate the mesh for details? I've used Mudbox to a certain point, but from my knowledge I don't think you can alpha mask from a texture using the UV map coords. There might be some sort of workaround or trick to do something similar, or the new 2009 might be able to… Although I'm pretty sure it might not be able to do that. I could be wrong!


#21

I was trying to do exactly that…

we are doing some crazy rock/earth stuff at work…

I would take my bump…inflate…and prep a high frequency displacement on a low level mesh…

the ZBrush artist could blend his deep sculpt with the hi-frequency on steroids through a morph target…literally picking what was dominant in spots…some spots like dirt would have a high frequency grainy dirt leaning towards bump…and for the deep rocks and chunks of dirt…the ZBrush sculpt would be dominant…
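One way to read that blend: a painted mask picks, per point, how much of the deep sculpt vs. the inflated high-frequency pass wins. A tiny sketch of the idea (all names and data here are made up for illustration, not any package's actual blend):

```python
# Per-point lerp between two displacement passes, driven by a painted mask:
# mask = 1.0 -> deep ZBrush sculpt dominates, mask = 0.0 -> hi-freq pass.

def blend_sculpts(deep, inflated, mask):
    """Blend two per-point displacement lists by a per-point mask."""
    return [m * d + (1.0 - m) * i for d, i, m in zip(deep, inflated, mask)]

deep     = [0.0, 1.0, 2.0]     # displacement from the deep sculpt
inflated = [0.3, 0.1, 0.0]     # bump-derived high-frequency displacement
mask     = [1.0, 0.5, 0.0]     # painted: big rock -> 1, grainy dirt -> 0
print(blend_sculpts(deep, inflated, mask))   # [0.0, 0.55, 0.0]
```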

In essence it gave the modeler the control

in turn I would blend a hint of a cavity map in…to marry the deep sculpt to the bump(ish) sculpt…

all of a sudden you could have a complex colormap…and harmonize it with z-brush sculpt…


#22

Nice… this kind of workflow would go well with some kind of procedural rock/earth shader in Maya pumping out the textures. Since a lot of earth/rock organics can be mimicked with procedural textures, you simply need some sort of mask map to dictate where you want certain procedural textures.


#23

Need to see this example of procedural dirt…

in the movie we have large jutting pieces of earth that come shooting up (kind of like the crystals in Superman)…(camera Hero)

We gave a broad range of dirt samples…ranging from various topsoil (with roots, pebbles, and rocks embedded), blending that into compressed clay, which then turns into bedrock…which we could art direct to the director's desire…

Procedural dirt looked OK from a distance…but we have Dylan Cole doing spot mattes for the wides…and anything close to camera gets my maps on it…(which is dead tight, down to roots embedded in the dirt)…

I know programs like GeoControl and Terragen aren't bad at distances…would love to see a good example of the procedural shaders…

…my name is mudd


#24

sounds cool, you would think that procedural textures would look good up close. I do agree, from doing a small test, that it would take much more R&D to fully realize a full blown procedural solution. Although I'd rather use it as a base to start from and then mix in the other techniques you're talking about. Maybe a procedural tweak to start a base texture, and maybe that ZBrush trick along with plain old modeling and even some tileable texture samples could give quick dirty solutions.

Yea, GeoControl and Terragen are those kinds of programs that simply look good from a distance, but I've seen programs like Dark Tree Textures and even Maya's Hypershade procedurals produce good results when mixed and tweaked properly. Plus if you guys have a super nerd that could program even more procedural textures, a la RenderMan… I did sit down and do a little test; here is the result. Not the greatest, would need a lot more research and sample matching, but you get the point…

Here is the Hypershade view of the shader setup; it could be a lot more complex. This was just a quick and dirty example…