This always gets complicated... but it really isn't.
If you use the regular Camera Depth render pass, switch off Filtering.
Render to at least RGBA 16 (half) to get the correct precision — this is set under Framebuffer in the Quality tab — and write out to the EXR file type.
If you need to know the depth, you can open the render pass from the Maya Render View: File > Load Render Pass. This opens imf_disp. Select Layer > depth (whatever the image name is with "depth" appended).
If you move the cursor over the image, the pixel values appear at the bottom. You can even change the Exposure to see the depth levels. If you hover over the point where you want focus, it gives you the exact pixel value you can use in Nuke.
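If you want to turn that raw sampled value into something a depth-based tool in Nuke can use, one common approach is a linear near/far remap. This is just a sketch — the `near`/`far` values and the 0-1 normalization convention are assumptions for illustration, not something Maya hands you:

```python
# Hypothetical helper: remap a raw Z value (as sampled in imf_disp)
# into a normalized 0-1 range. "near" and "far" are whatever your
# scene's depth range actually is -- assumed numbers below.
def normalize_depth(z, near, far):
    """Linearly remap z from [near, far] to [0, 1], clamped."""
    t = (z - near) / (far - near)
    return max(0.0, min(1.0, t))

# e.g. a point sampled at raw depth 45 in a scene spanning 10..110:
focus = normalize_depth(45.0, 10.0, 110.0)
print(focus)  # 0.35
```

Whether you need this at all depends on the tool; some Nuke nodes take the raw depth value directly.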
Side note: in Fcheck, hitting 'z' on the keyboard will usually print the depth near and far values in the terminal window. However, I'm having issues with that on 16-half and 32-bit EXRs. You'd think someone would have rewritten Fcheck by now...
I have no idea why you're being told to render it separately or at twice the size, etc. That isn't necessary for correct Z-depth unless there's a bug I'm unaware of.
You don't want anti-aliased Z-depth, because an edge pixel will contain "some" information from the object behind mixed with "some" from the object in front. Once filtered, the result is a value that corresponds to the location of neither object: it falls somewhere in between and is no good.
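A toy calculation makes the problem concrete. The depths and coverage below are made-up numbers, but they show what any filter does at an edge:

```python
# Why a filtered Z pixel is meaningless: assume a foreground object
# at depth 2.0 and a background at depth 10.0 (hypothetical values),
# each covering half of an edge pixel.
fg_depth, bg_depth = 2.0, 10.0
coverage = 0.5  # anti-aliasing blends the two depth samples

filtered_z = coverage * fg_depth + (1 - coverage) * bg_depth
print(filtered_z)  # 6.0 -- matches neither object

# Any depth-keyed effect (defocus, fog) reading 6.0 behaves as if
# there were geometry floating halfway between the two objects.
```

With Filtering off, the pixel holds one unblended sample, so it reports the depth of exactly one of the two objects.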