16 Bit zDepth pass for After Effects


julie.jenkinson
03-19-2006, 05:38 AM
Hello !

I've been looking everywhere on the internet and in this forum, but I couldn't find any solution to this.
Here is my problem: I'm using Maya 6.5 and I want to render a real zDepth pass of an animation, in 16 bit. I would use this pass in After Effects with the Lenscare depth of field plug-in.

So, when outputting the Z from mental ray:
- if I choose .iff as the file format, the Z gets stored in a dedicated Z channel. That's no use in After Effects, and if I use FCheck to convert the sequence, it saves in 8 bit!
- if I choose .rla, the Z is stored in the file as well and After Effects recognizes it, but then the only way to actually use it is the "3D Channel Extract" effect, which also converts to 8 bit.
- if I choose any other format, I get a separate .iff file where the Z seems to be stored as the main channel, but this time After Effects won't recognize the file at all ("error parsing file"). I've tried converting this type of .iff to another format using the imgcvt tool, but it didn't work.

Oh, and if I use the Maya renderer for the Z, I still have the problem of not being able to use the Z channel of the .iff file in After Effects.

At that point you could say, "why don't you use a depth shader and render a depth pass in any format you want?" Because, first, I want an aliased pass, but more importantly, I want the beginning and end of the white-to-black range to be calculated from the camera's near and far clip "auto" values, as happens with a normal zDepth pass.
Every depth shader I've tried (and I think I've tried them all) works differently from a true zDepth pass. I had hopes for one shader that has near and far clip parameters, but the problem is that those parameters can at best be keyframed; they are certainly not updated automatically at each frame the way the camera's "auto render clip planes" function is.

I sincerely hope I'm not the only one who has ever been in that situation !

Cheers,

j.

Randolph
03-19-2006, 11:47 AM
I can't give you an answer to your problem, but as for the anti-aliased Z channel: you will have problems (artifacts) if you apply a DOF effect. Check out this whole thread: http://forums.cgsociety.org/showthread.php?t=185492&page=1&pp=30

julie.jenkinson
03-19-2006, 03:35 PM
Mmm, yes... but where have you seen that I wanted an anti-aliased pass?

Randolph
03-20-2006, 03:17 PM
Simple misreading of "Because, first, I want an aliased pass, but more importantly..." Sorry! :banghead:

Well, couldn't you just connect the far clipping value of the camera to the depth shader (I mean the one from highend3d (http://www.highend3d.com/maya/downloads/shaders/788.html)), using the connection editor or an expression?
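Something like this, maybe (just a sketch; "depthShader1" and its nearClip/farClip attributes are placeholder names, so check what the highend3d shader actually calls them, and "cameraShape1" is whatever your render camera's shape is called):

// Hypothetical node/attribute names - adjust to the actual depth shader.
connectAttr -force cameraShape1.nearClipPlane depthShader1.nearClip;
connectAttr -force cameraShape1.farClipPlane depthShader1.farClip;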

julie.jenkinson
03-20-2006, 07:54 PM
Alright... I've found a command that can query the values of the "auto clips".

When I write this:
viewClipPlane -q -acp cameraShape;
I get, for example, 0.001 362.393493. The 0.001 is always there; the second value ranges from about 300 to 600, depending on the frame.
Now, this seems to be exactly the kind of info needed to set up a depth shader the same way as a zDepth pass, but life isn't that easy, I'm afraid.
Randolph, I've tried putting the second value into the "Depth" attribute of the "DepthShader 1.0" shader you pointed to, but it doesn't produce a wide range of greys, even if I multiply the value I get from the viewClipPlane command by a coefficient using an expression.

I've also tried modifying the cmDOF shader: it works with a ramp, so I multiplied the position of the black ramp entry by the far "auto clip" value, but again it doesn't give the same result as the zDepth pass. Maybe the relation between the ramp color positions and the auto clip values is not as simple as a * b or a * b * c and involves square roots, but then I have no clue what to do.

Please help if you can!! All I want is a proper 16-bit or float Z pass that I can use in After Effects (I think I wouldn't have all this hassle with Shake, as it seems to read the depth buffer with no problem, but I don't use it).

Julie

Sir-Avalon
03-20-2006, 08:19 PM
Hi,
Don't know if this helps, but try setting up a luminance depth shader.
This is how:

Create:
samplerInfo node
multiplyDivide
setRange
surfaceShader

Connect samplerInfo.pointCameraZ --> multiplyDivide.input1X
connect multiplyDivide.outputX --> setRange.valueX
connect samplerInfo.cameraFarClipPlane --> setRange.oldMaxX
connect samplerInfo.cameraNearClipPlane -->setRange.oldMinX

connect setRange.outColorR --> surfaceShader.outColorR
connect setRange.outColorG --> surfaceShader.outColorG
connect setRange.outColorB --> surfaceShader.outColorB

The multiplyDivide node should have -1 in Input2 (or the opposite, depending on the effect you want); a rough MEL sketch of this setup follows below.
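In MEL, that setup boils down to roughly this (an untested sketch; "cameraShape1" is just an example name for the render camera's shape, and the clip values are taken from the camera shape in case samplerInfo doesn't expose them):

// Rough MEL sketch of the luminance depth setup described above.
string $info = `shadingNode -asUtility samplerInfo`;
string $mult = `shadingNode -asUtility multiplyDivide`;
string $range = `shadingNode -asUtility setRange`;
string $shader = `shadingNode -asShader surfaceShader`;

connectAttr ($info + ".pointCameraZ") ($mult + ".input1X");
connectAttr ($mult + ".outputX") ($range + ".valueX");

// Take the clip planes from the camera shape (example name "cameraShape1"):
connectAttr cameraShape1.nearClipPlane ($range + ".oldMinX");
connectAttr cameraShape1.farClipPlane ($range + ".oldMaxX");

// setRange outputs live on outValueX/Y/Z:
connectAttr ($range + ".outValueX") ($shader + ".outColorR");
connectAttr ($range + ".outValueX") ($shader + ".outColorG");
connectAttr ($range + ".outValueX") ($shader + ".outColorB");

setAttr ($mult + ".input2X") -1;  // pointCameraZ is negative in front of the camera
setAttr ($range + ".maxX") 1;     // remap the depth into a 0-1 grey range

Then assign the surfaceShader to the geometry and render through the same camera.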


Hope this helps?
/Chris

julie.jenkinson
03-20-2006, 09:43 PM
Thanks for the reply, Avalon. I've just tried what you said.
The samplerInfo node doesn't seem to have "cameraFarClipPlane" and "cameraNearClipPlane" attributes, so I assumed you were referring to the camera attributes. Also, I couldn't find outColorR/G/B attributes on the setRange, so I used the outValueX/Y/Z ones instead.

And it didn't work at all :( All I get is a black image.

I'd like to know what I should do to make it work because it sounds like it could be a solution.

J.

pgraham
03-21-2006, 12:51 AM
- To render 16-bit color channels in mental ray:
In the mental ray render globals, go to the Framebuffer Attributes section and change the "Data Type". I know Shake can read .iff files with the 4x16-bit data type, and I would be surprised if AE couldn't.

- To render aliased images in mental ray:
In the mental ray render globals, set the max sample level to 0, then go to the Raytracing section and set Scanline to "Rapid". It doesn't matter whether raytracing is enabled or not. I don't know why, but Rapid rendering appears to disable the multi-pixel filter.
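For reference, roughly the same settings can be flipped from MEL (a sketch under a couple of assumptions: the mental ray globals nodes only exist once the Mayatomr plug-in is loaded, and the enum index used for "Rapid" below is an assumption that may differ between Maya versions):

// Aliased render: sample level 0 and Rapid scanline (enum index assumed; verify per version).
setAttr miDefaultOptions.minSamples 0;
setAttr miDefaultOptions.maxSamples 0;
setAttr miDefaultOptions.scanline 3;

// 16-bit framebuffer: the "Data Type" enum lives on miDefaultFramebuffer.datatype.
// List the enum names rather than guessing the index, then pick the 16-bit entry:
attributeQuery -node "miDefaultFramebuffer" -listEnum "datatype";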

I created Avalon's shader and it worked... I actually did it on my own before reading his post, but it's exactly the same after the corrections you mentioned. Make sure you have that -1 in the Input2 Z of the multiplyDivide; it's very important. Also, if your far plane is very far away, your whole scene could render with very dark (like, black) values.
EDIT: Oh yeah, you have to set the Max Z of the setRange to 1. That's important too.

Sir-Avalon
03-21-2006, 08:40 AM
Hi Julie,
I've attached a Maya 6 scene with the shader setup. Sorry, yeah, you have to add the camera clip plane attributes to the samplerInfo node yourself... but anyway, good luck!

/Chris

julie.jenkinson
03-21-2006, 01:53 PM
... and it was also necessary to set the min X value of the setRange to 1 instead of 0... ;)
Anyway, the shader does indeed work, but I now have the same problem as before concerning the auto clips: when my auto clip values are 0.001 and 350, I actually have to put 20 and 70 as the old min and old max values. I thought I could make it work by setting driven keys between the lowest/highest auto clip values and the old min/old max setRange values, so that one range of values would be converted into the other, but it didn't. Even weirder: although my "auto" near clip value (the one that is supposed to be the real one) is constant (always 0.001), I have to change the old min value from one frame to the next to get a good starting point for the white in the render.

I'm getting quite desperate now... and I'm still looking for a 16-bit solution that would work with just the normal zDepth function.

j.

Sir-Avalon
03-21-2006, 02:07 PM
Hmm, have you tried just unchecking "auto render clip planes" in the camera attributes and setting the planes manually?

/Chris

julie.jenkinson
03-21-2006, 02:53 PM
I have, but it has no effect on the way the shader works. It would affect the result if I were rendering a real Z pass, but then it would behave the same way as the depth shaders, and that is not what I want.
And in case it wasn't clear, I'm trying to make a depth shader work like a real Z pass because, if it doesn't, I'll have to set up loads of keys either in Maya or in Lenscare to keep the focus region in the same place in the image...
And in case anyone wants to have a try at setting up a shader that responds to the auto near and far clip values, here is the little bit of code that gets the values (the locator is just used to hold the attributes):

// Run this as an expression so it updates on every frame.
float $aaa[] = `viewClipPlane -q -acp cameraShape`;
locator1.autoNearClip = $aaa[0];
locator1.autoFarClip = $aaa[1];
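For anyone setting this up from scratch, the attribute and expression creation might look like this (a sketch using the same names as above, untested):

// One-time setup: add the two custom attributes to the locator.
addAttr -longName "autoNearClip" -attributeType double -keyable true locator1;
addAttr -longName "autoFarClip" -attributeType double -keyable true locator1;

// Wrap the snippet above in an expression node so it re-evaluates on every frame.
string $exprBody = "float $aaa[] = `viewClipPlane -q -acp cameraShape`;\n";
$exprBody += "locator1.autoNearClip = $aaa[0];\n";
$exprBody += "locator1.autoFarClip = $aaa[1];";
expression -name "autoClipExpr" -string $exprBody;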


Julie

pgraham
03-21-2006, 03:19 PM
"Sorry, yeah, you have to add the camera clip plane attributes to the samplerInfo node yourself..."
When I made the shader, I connected the cameraShape directly to the setRange in Hypershade. You can get the cameraShape into Hypershade by middle-mouse-dragging it from the Multilister.

"when my auto clip values are 0.001 and 350, I actually have to put 20 and 70 as the old min and old max values."
Well, at least the auto clip values contain all the geometry; that's all the feature is there for. If you want specific planes, you could put locators where you want near and far, and create expressions on the camera attributes that calculate how far the locators are from the camera. Then you're guaranteed that the clip planes will be where you want them.
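A quick sketch of that idea as an expression body (untested; "nearLoc", "farLoc", "camera1" and "cameraShape1" are hypothetical names, and "auto render clip planes" would have to be off so the expression isn't overridden):

// Expression body: drive the clip planes from the camera-to-locator distances.
vector $camPos = `xform -q -ws -t camera1`;
vector $nearPos = `xform -q -ws -t nearLoc`;
vector $farPos = `xform -q -ws -t farLoc`;
cameraShape1.nearClipPlane = mag($nearPos - $camPos);
cameraShape1.farClipPlane = mag($farPos - $camPos);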

"I'm trying to make a depth shader work like a real z pass"
A "real" Z pass is however you set it up; I don't think of auto clip as the "real" way. I personally avoid features called "auto" because they're generally unpredictable.

julie.jenkinson
03-21-2006, 04:10 PM
Yes... I know the zDepth you get from the Z buffer is no more "real" than other solutions; it was just a way of referring to it.

Mm... but maybe I didn't really understand how the Z buffer is calculated. I expected the auto near/far clip values to be used, but from what I see, that doesn't really seem to be the case. I've just remembered that in the case of a Z-buffer pass (let's call it that ;) ), there is an option in the camera called "furthest visible depth". Any idea how to translate that concept into the depth shader?

j.

Sir-Avalon
03-21-2006, 05:03 PM
"If you want specific planes, you could put locators where you want near and far, and create expressions on the camera attributes that calculate how far the locators are from the camera."

Exactly what I thought of when I read the post.

But I wonder if this is the sort of effect you want for your zDepth. It's what you would set up in post, but this way the depth focus follows a locator you control in front of the camera.


/Chris

julie.jenkinson
03-21-2006, 06:06 PM
Well, the Lenscare plug-in works with white-to-black images. White is supposed to be in the foreground and black in the background. So my images shouldn't go from black to white and then back to black again, as I can see in your image. And I don't see why I should use locators... I don't want manual control over where the depth should begin and where it should stop. I want it to be automatic and work the same way as the depth buffer: the closest visible point in the frame is white, and the "furthest visible depth", to quote the function found in the camera settings, should be black.

j.

julie.jenkinson
03-21-2006, 08:45 PM
And about the possibility of extracting the Z channel in After Effects: I've been looking for a plug-in capable of extracting the Z channel of an .rla file in 16 bit (as opposed to AE's 3D channel extract effect, which is 8-bit).
I've only found one plug-in, called "eFX Extract3D", but its website shut down a while ago and it now seems impossible to find the beta that was available when it originally came out.

Am I the only one who's ever wanted to use a 16-bit depth buffer in After Effects???? :(
Pleeease help if you can, I'm losing it!

j.

Sir-Avalon
03-21-2006, 11:55 PM
Hi again Julie,

I'll have to look into that tomorrow; I've never had any problems setting up my luminance depth and getting a good 0-1 greyscale range from the far clip plane...

Anyway, if you have the Z in the .rla, have you tried creating a batch action or droplet in Photoshop? As far as I know the Z is stored as the second alpha channel (this is just off the top of my head).
Make a script that copies Alpha 2, pastes it either into the color channels or into a new image, and saves the image out in the 16-bit format you want, as a separate, clean RGB sequence.
It should work (I hope)?

/Chris

julie.jenkinson
03-22-2006, 10:09 PM
"I'll have to look into that tomorrow; I've never had any problems setting up my luminance depth and getting a good 0-1 greyscale range from the far clip plane..."

I can get a good range when it is just for a still image, but in an animation I cannot find a way to make it work on every frame.



"Anyway, if you have the Z in the .rla, have you tried creating a batch action or droplet in Photoshop? As far as I know the Z is stored as the second alpha channel (this is just off the top of my head)."
/Chris
I've got Photoshop CS2, and by default it cannot open .rla or .iff files. I found plug-ins for these formats on Photoshop's installation CD, but the .iff one shows some kind of noisy grey thing instead of the Z, and the .rla one simply doesn't show anything; it doesn't even show the alpha.

Can you try opening an .iff or .rla in Photoshop and tell me if it works? I can send you an .rla file with a Z channel if you haven't got one around.

Thanks,

julie.

CGTalk Moderation
03-22-2006, 10:09 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.