The right way to render a depth pass


Hey guys…

I'm working on a project at the moment and I'm having trouble rendering out a decent depth pass. I looked around on the net and a lot of people recommended the “luminance depth” render layer preset.
It sounded good so I tried it out, but I'm struggling to change the depth falloff (white to black) settings to customize the DOF; whatever value I try in any field doesn't seem to do anything…
Can someone please help me out with this, or does anyone know of any good tuts on DOF?
(using Maya 2011)

many thanks


The luminance depth render layer is indeed very easy to set up and use. The new shader it creates actually takes the near and far clipping planes from your default camera, so if those values are very large while the scene is very small, the gradient in the render will likely appear all white, or with only very faint gradients.

Personally, I found that the best way to get the most values into the gradient is to change your camera's near and far clipping planes, watch that they don't cut off your objects, then manually copy those values into the Old Min and Old Max attributes of the setRange node that is connected to the new surface shader. Obviously, you'll have to break the connections on those two attributes first.
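To make the remap concrete, here is a plain-Python sketch of what the setRange node computes. The clipping-plane numbers are made up, and depending on the preset the ramp may run white-to-black rather than black-to-white, but the arithmetic is the same either way:

```python
def set_range(value, old_min, old_max, new_min=0.0, new_max=1.0):
    """Mimic Maya's setRange node: linearly remap value from
    [old_min, old_max] into [new_min, new_max], clamped."""
    t = (value - old_min) / (old_max - old_min)
    t = max(0.0, min(1.0, t))  # setRange clamps outside the old range
    return new_min + t * (new_max - new_min)

# Tight clipping planes that hug the scene give a full 0-1 gradient:
near, far = 5.0, 50.0                 # hypothetical clipping planes
print(set_range(5.0, near, far))      # object at near plane -> 0.0
print(set_range(27.5, near, far))     # halfway -> 0.5
print(set_range(50.0, near, far))     # at far plane -> 1.0

# With the default 0.1 / 10000 planes the same object barely moves
# off one end of the ramp, which is why the render looks flat:
print(set_range(27.5, 0.1, 10000.0))  # roughly 0.00274
```

This is exactly why tightening the planes (or fixing Old Min / Old Max by hand) suddenly reveals a usable gradient.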


thanks for the advice…:slight_smile:

This DOF seems to be very linear… it only really ramps from one focus point to the other. How would I get the effect where I can isolate an area that I want in focus, with everything in front of and behind that area out of focus? Is this possible with this shader?
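For what it's worth, the usual trick in comp is to key the blur off the distance from a chosen focus depth rather than off raw depth. A hedged plain-Python sketch of that idea (all numbers hypothetical; this is not any compositor's actual API):

```python
def blur_amount(depth, focus_depth, focus_range, max_blur=1.0):
    """Turn a linear depth value into a blur weight: zero inside the
    in-focus band, ramping up in front of and behind it."""
    distance = abs(depth - focus_depth)
    if distance <= focus_range:
        return 0.0                  # inside the band: sharp
    return min(max_blur, (distance - focus_range) / focus_range)

focus, band = 100.0, 20.0   # hypothetical: sharp from 80 to 120 units
print(blur_amount(100.0, focus, band))  # 0.0 (dead centre, sharp)
print(blur_amount(60.0, focus, band))   # 1.0 (foreground, fully blurred)
print(blur_amount(130.0, focus, band))  # 0.5 (behind, half blurred)
```

So the depth pass itself stays linear; the "in-focus band with blur on both sides" comes from how the compositor remaps it.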

many thanks


The answer is “Yes, you can…”, but just how easy it will be depends on your compositing program of choice.

With the data in your depth pass you can do all sorts of depth effects. Some compositing programs (or plug-ins) have a depth range…allowing for this foreground and/or background blurring. If your program of choice doesn’t have this functionality, you can probably still use it to remap the color values of the depth pass to give you a blur map that will give you what you are looking for.

Let us know what compositing program you are using and maybe someone can help you out…or you could check in those forums.


Hey there

I'm going to be using Nuke as my compositing app of choice. Would you be able to guide me through your workflow for doing this in Nuke, if possible?

many thanks


If you're using Nuke, you're going to want a floating-point depth channel.

You can either use the render passes system in Maya to create a “camera depth” pass and be done with it, or you can make a custom surface shader that outputs depth in floating point… 1 cm = 1.0 luminance.

The shader is pretty easy: create a surface shader, a samplerInfo node, and a multiplyDivide node. Connect samplerInfo.pointCameraZ --> multiplyDivide.input1Z, then set multiplyDivide.input2Z to -1. This is because cameras in Maya always face down the negative Z axis, so when pointCameraZ says that something is -123 cm away from your camera, the multiply will return 123. Then connect multiplyDivide.outputZ --> surfaceShader.outColor. That's it. The render will look completely white, but if you look at it in Nuke and turn your exposure down, the values are all there, without any banding or clipping plane settings or anything.
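For anyone who'd rather script it, here's a minimal MEL sketch of that network (node names are arbitrary; the scalar outputZ is wired into each colour channel here, since a scalar can't drive the whole outColor plug directly):

```mel
// Build the depth shader network described above.
string $ss   = `shadingNode -asShader  surfaceShader  -name "depthShader"`;
string $info = `shadingNode -asUtility samplerInfo    -name "depthInfo"`;
string $mult = `shadingNode -asUtility multiplyDivide -name "depthFlip"`;

// pointCameraZ is negative in front of the camera, so flip the sign.
connectAttr ($info + ".pointCameraZ") ($mult + ".input1Z");
setAttr ($mult + ".input2Z") -1;

// Drive all three colour channels with the flipped depth.
connectAttr ($mult + ".outputZ") ($ss + ".outColorR");
connectAttr ($mult + ".outputZ") ($ss + ".outColorG");
connectAttr ($mult + ".outputZ") ($ss + ".outColorB");
```

Assign `depthShader` to everything on your depth layer and render to a float format.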


Ahhhh… that's where I was going wrong. I forgot that I should check the actual pixel data rather than just trusting what I see on screen…
Thanks for the reply, I will try it out as soon as I'm at my desk.

Many thanks


There are plugins for Nuke that will make it easy for you to compile the DOF effect.


Awesome, I will check that out…


Hmm, I tried fiddling around with these but still no luck. It doesn't seem to be a 2011 bug, though, because when I open an old 2009 scene everything works fine.

I tried breaking the connection to the samplerInfo and the multiplyDivide node (it still pops back to min 0.1 / max 10000 when I reconnect).
I made a whole new scene with some primitives, spreading them out, scaling them up and down, and tried different renderers and settings, and I still only get pure white (like an alpha) with no gradient. This is really driving me insane, because the old luminance depth preset used to work fine for me; dunno what's going on.

Edit: I opened Maya 2009, made a few primitives, and rendered with the luminance depth preset. It worked fine :shrug:. Strangely, I can actually import a 2009 scene into 2011 and that works as well. Does anyone know what's been changed between those versions and how to fix it? I haven't tried the other presets besides ambient occlusion (but that one worked fine in 2011).

I tried importing the working z-depth 2009 scene into 2011 and it worked fine. Thinking I could outsmart Maya, I then tried importing my original scene into the 2009 z-depth scene, but as soon as I did that, the objects turned pure white again, with no gradient… Beginning to think that someone intentionally did this to annoy me :argh:


If you are using 2011, just add (and associate) a ‘camera depth remapped’ pass in the Passes tab. There is nothing more you need to do.



Nick is right, there is a pass for that. You can set the near and far distance (which is why it's called remapped) in the Attribute Editor. Note that the simple camera depth pass can also do it; you just have to check “Remap depth values”.

But since you need to render Z-depth at twice the size, with no anti-aliasing, the best approach is to create a new render layer, assign a surface shader to everything (so the RGB render won't take ages, since the beauty is always rendered), override the render settings for resolution and sampling (fixed, 0), turn off Framebuffer -> Interpolate Samples, and render with the camera depth pass in a 16-bit or 32-bit format, of course :slight_smile:


Thanks for the replies, guys.

Nick: Hmm, care to elaborate on what you mean / how to do that?

dot87: You lost me in between Framebuffer and 16/32 bit. :slight_smile:
I will try and google up on that a bit.

Thx again


In 2010 there is no right-click menu to override ‘Sampling Mode’ to Fixed. I've tried to do this a few times before and never gotten it to work. Do you use a piece of MEL code to override this, since the normal right-click functionality doesn't exist for that control?


You want your Z-depth pass to be floating point, so you have to render to a 32-bit float image format (EXR, for example). You have to set it in the Render Settings, Quality tab, under Framebuffer, and select a data type, e.g. RGB (Float) 3×32 bit.

And unchecking Interpolate Samples makes sure that your image stays aliased.

Redsand: you're right. The idea is to use Custom Sampling, set it at 2-0 for your master layer, and override the values to 0-0 for your z-depth render layer.
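On Redsand's question about scripting it: the per-layer override can be added from MEL with editRenderLayerAdjustment before setting the value. This is only a sketch; the attribute names assume mental ray's miDefaultOptions node exists and that the z-depth layer is the currently active render layer:

```mel
// Hypothetical sketch: add a per-layer override for mental ray
// sampling on the current render layer, then force fixed 0-0.
editRenderLayerAdjustment miDefaultOptions.minSamples;
editRenderLayerAdjustment miDefaultOptions.maxSamples;
setAttr miDefaultOptions.minSamples 0;
setAttr miDefaultOptions.maxSamples 0;
```

Switching back to the master layer should then show the original sampling values, with the 0-0 override applying only on the depth layer.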

Sorry I didn’t see your message since you posted when I was writing :slight_smile:


So do you reformat your depth pass back down to regular res before or after you apply your blur node of choice? In my experience, if I render the depth pass at double res with no AA, then reformat it back down to regular res and then do the blur, it looks horrific. The only thing I've been able to do is take my beauty render, reformat it up to the same res as the depth pass, do the blur, then reformat the result back down to normal res. It softens the edges of the beauty slightly, but that's the only way I've been able to get rid of nasty edge artifacts so far. Any other workflows for doing this would be greatly appreciated!


This always gets complicated… but it's not, really.

If you use the regular Camera Depth renderpass, switch off Filtering.

Render to at least RGBA 16 (half) under Framebuffer in the Quality tab to get the correct precision. (EXR filetype.)

If you need to know the depth, you can open the render pass from the Maya Render View: File > Load Render Pass. This opens imf_disp. Select Layer > depth (whatever the image name is with “depth” appended).

If you pass the cursor over the image you will get pixel values at the bottom. You can even change the Exposure to see the depth levels. If you drag your cursor over where you want focus, it will give you that exact pixel value you can use in Nuke.

Side note: Fcheck, when you hit ‘z’ on the keyboard, will usually print the depth near and far values in the terminal window. However, I'm having issues with 16-half and 32-bit EXRs. You'd think someone would have rewritten Fcheck by now…

I have no idea why you’re being told to render it separately or render it twice as big etc. It’s not necessary for correct Z-depth unless there’s a bug I am unaware of.

You don't want anti-aliased z-depth, because an edge pixel will contain “some” information from the object behind mixed with “some” information from the object in front. Once filtered, the result is a value that has nothing to do with the location of either object; it will fall somewhere in between and is no good.
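A tiny numeric sketch of why that filtered value is useless (the depths are made up):

```python
# An edge pixel that is half foreground, half background averages
# to a depth that belongs to neither object.
near_object = 100.0    # hypothetical depths in scene units
far_object = 5000.0

filtered = 0.5 * near_object + 0.5 * far_object
print(filtered)  # 2550.0: a point in empty space between the two

# A blur keyed off this value would treat the edge as if it sat
# 2550 units away, producing fringing around the silhouette.
```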


I've never gotten this far into it, but I think they double the resolution of the depth pass because it won't be anti-aliased while the other passes will. I think.


Ahh, ok.

Well, you don’t want it to be anti-aliased anyway for the reasons explained above.

The pixel should be the depth of the object that covers the majority of the pixel, even if it’s just 51%.

Fringing around the DOF is called leaking and is a general problem with post-process DOF. Pushing the DOF to an extreme level can exacerbate it.


I use the software renderer's environment fog to make a Z channel. My English isn't great, but maybe you can see what I mean.