Deep from point position pass?

  09 September 2013

Is this possible?

I've been playing with V-Ray 3.0's deep channel and it's pretty sweet how easy it is to comp stuff together.

Problem is that a few of my plugins aren't ready for V-Ray 3, and 2.4 won't output the deep channel.

So I'm trying to establish if/how this can be done without a deep channel. If it's not possible then no problem I'll create the appropriate mattes. If it is then I'd like to learn how.

So far I can get the point cloud from my position pass.

How would I position, for example, a sphere plugged into a ScanlineRender node between two objects in my render? Can it be an automatic process of just placing objects correctly in 3D space using the point cloud as a guide, or is user intervention required?
  09 September 2013
I don't think you can convert points to deeps - you might be able to do it via the ScanlineRender node, by re-rendering the points and outputting deeps, but even if you can, there is probably going to be some quality loss.

Have you tried the ZMatte node? It's a bit ol' fashioned, but it does occasionally produce acceptable results given a beauty pass with a depth channel.
  09 September 2013
There is DeepFromFrames or DeepFromImage. The problem is that a position pass suffers from the same issues as Z does: you lose the ability to deal with semi-transparency, motion blur and anti-aliased edges. There are a few tools on Nukepedia, though, for other methods of converting images to deep.
  09 September 2013
Sorry for delayed response, just moved house.

Earlyworm - Where did you find the zmatte node?

Beaker - You just confirmed what I was going to try, i.e. whether I can get good AA from the position pass (still trying to get my head around a lot of things). Turns out I can get a deep channel from a nightly build of V-Ray 2.0. Go V-Ray!

On a similar note - any ideas where I can learn how to rebuild my beauty pass from render elements but using deep? I assumed it would be as per oldskool methods and all I needed to do was:

DeepToImage > Unpremult > shuffle and comp everything > Premult > DeepFromImage > and then plug this into my DeepMerge node.

Note: The last DeepMerge node has a Card3D plugged into a DeepFromImage set to the correct depth. Plugging my original deep EXR 2.0 file into the DeepMerge node comps as expected, with the card between two teapots. But when I plug my render-elements comp in, the appearance is not correct.
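For anyone following along, the reason the card can slot in between the teapots automatically is that every deep sample carries its own depth: DeepMerge just pools the samples, and flattening composites them front to back. A minimal plain-Python sketch of that idea (hypothetical sample values, not Nuke's API):

```python
# Each deep sample: (depth, r, g, b, a), colour premultiplied.
# A deep pixel is a list of such samples.

def deep_merge(*pixels):
    """Merge deep pixels by pooling and depth-sorting their samples."""
    merged = [s for pixel in pixels for s in pixel]
    merged.sort(key=lambda s: s[0])  # nearest sample first
    return merged

def flatten(pixel):
    """Composite depth-sorted samples front to back with 'over'."""
    r = g = b = a = 0.0
    for _depth, sr, sg, sb, sa in pixel:
        r += sr * (1.0 - a)
        g += sg * (1.0 - a)
        b += sb * (1.0 - a)
        a += sa * (1.0 - a)
    return (r, g, b, a)

# A 50%-transparent red teapot at depth 5, an opaque blue teapot at
# depth 20, and a grey card slotted between them at depth 10 -
# no hand-built mattes needed, the depths do the sorting.
teapots = [(5.0, 0.5, 0.0, 0.0, 0.5), (20.0, 0.0, 0.0, 1.0, 1.0)]
card = [(10.0, 0.5, 0.5, 0.5, 1.0)]
print(flatten(deep_merge(teapots, card)))
# → (0.75, 0.25, 0.25, 1.0): card visible through the front teapot,
# back teapot fully occluded.
```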

Sorry for crappy screen grab - not hooked up to web at the moment.

Some notes about the script: I've set shuffle elements to rgb out, merges are all rgba, premults apply to all channels.

The problem with the final image is that the Card3D (with checkerboard node) looks as if it has been multiplied over the teapot render, i.e. not between them at the correct depth.

Edit: sent from my iPhone - hope it makes sense, I hate typing on these things!

Last edited by MisterS : 09 September 2013 at 01:27 AM.
  09 September 2013
You should use the DeepRecolor node. In essence it projects the colour data onto the deep data.
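Conceptually (a plain-Python sketch with made-up sample values, not Nuke's actual implementation), DeepRecolor unpremultiplies the flat 2D colour and re-premultiplies it by each deep sample's own alpha, so flattening the recoloured samples reproduces the 2D image:

```python
def flatten_alpha(pixel):
    """Accumulate sample alphas front to back ('over')."""
    a = 0.0
    for _depth, sa in pixel:
        a += sa * (1.0 - a)
    return a

def deep_recolor(pixel, flat_rgba):
    """Give every (depth, alpha) sample the flat image's unpremultiplied
    colour, re-premultiplied by that sample's own alpha."""
    fr, fg, fb, fa = flat_rgba
    out = []
    for depth, sa in pixel:
        w = sa / fa if fa else 0.0
        out.append((depth, fr * w, fg * w, fb * w, sa))
    return out

# A 50% sample in front of an opaque one; flat colour from the 2D comp.
samples = [(5.0, 0.5), (10.0, 1.0)]
print(deep_recolor(samples, (0.8, 0.4, 0.0, flatten_alpha(samples))))
```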
  09 September 2013
Aha! That worked thanks.

With deep images, do I still need to unpremult and then premult for colour corrections?
I'm getting a light halo (rendered against black background) but it goes away if I disable the unpremult and premult nodes.
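Edge halos always come down to where in the chain the colour is premultiplied. For reference, the standard 2D reasoning in a plain-Python sketch (made-up edge-pixel values): a non-multiplicative correction such as a lift overshoots antialiased edges when applied to premultiplied RGB, while the unpremult/correct/premult order does not:

```python
def offset(rgb, lift):
    """A simple lift/offset colour correction (non-multiplicative)."""
    return tuple(c + lift for c in rgb)

def unpremult(rgba):
    r, g, b, a = rgba
    return (r / a, g / a, b / a, a) if a else rgba

def premult(rgba):
    r, g, b, a = rgba
    return (r * a, g * a, b * a, a)

# 50%-covered antialiased edge pixel of a 0.8-grey object, over black.
edge = (0.4, 0.4, 0.4, 0.5)

# Wrong: offsetting the premultiplied RGB brightens the half-covered
# edge to about 0.6, but the correct value is (0.8 + 0.2) * 0.5 = 0.5.
wrong = offset(edge[:3], 0.2) + (edge[3],)   # ≈ (0.6, 0.6, 0.6, 0.5): halo

# Right: unpremult, correct the straight colour, premult again.
r, g, b, a = unpremult(edge)
right = premult(offset((r, g, b), 0.2) + (a,))   # → (0.5, 0.5, 0.5, 0.5)
```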
  09 September 2013
Another problem is that DeepRecolor is darkening my image compared to the beauty render.

Actually the real problem is trying to do some decent research with my iPhone screen for web browsing over my usual 30" monitors.

I've found that by disabling the alpha channel on any of my merge nodes, or by disabling my DeepRecolor node, I get the correct result. Not sure if there is a legitimate reason for this behavior.

Last edited by MisterS : 09 September 2013 at 03:02 AM.
  09 September 2013
Sorry, was thinking of ZMerge. Anyway, looks like you're sussed on deeps.

When adding your secondaries back together, just add the rgb channels together and then copy the alpha back in before your deeprecolor node.
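Per pixel, that advice - plus the element RGBs together, then attach the beauty alpha rather than whatever the merges produced - can be sketched in plain Python (hypothetical element values):

```python
def rebuild_beauty(elements, beauty_alpha):
    """Sum render-element RGBs channel-wise, then copy the beauty alpha
    back in instead of whatever the plus merges left in alpha."""
    r = sum(e[0] for e in elements)
    g = sum(e[1] for e in elements)
    b = sum(e[2] for e in elements)
    return (r, g, b, beauty_alpha)

# Hypothetical per-pixel values for three render elements.
diffuse = (0.25, 0.25, 0.0)
specular = (0.125, 0.0, 0.0)
reflection = (0.0, 0.125, 0.5)
print(rebuild_beauty([diffuse, specular, reflection], 1.0))
# → (0.375, 0.375, 0.5, 1.0)
```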
  09 September 2013
Sussed is a strong word, I'm getting there!

Copying my alpha did the trick thanks. I have another question though:

Out of interest - if you have the deep information in your EXR, then why can I not simply add a camera to my Card3D node, set the filmback and focal length to match my V-Ray camera, and have a match?

I've been using the NukeEm script on other occasions which works great but figured if everything is from camera space then I'd have thought you wouldn't need to export your Max camera.

I realized there was a mismatch between the DeepToPoints and Card3D compared to how it looks visually correct in my final image, i.e. the correct depth composite. Like I said, NukeEm will fix this, but I'm trying to understand things better.
  09 September 2013
I spoke too soon - it's still not quite right. I've attached another image.

Again, sorry for the image quality. There's a slight white halo which appears to come from the raw lighting (divide) node.

The shadows are identical whether the three nodes are enabled or disabled, so they're not the culprit.

  10 October 2013
One thing I'm still trying to get my head around is whether I can fully eliminate halos using deep workflow.

Example: I created a multimatte element in a test scene and rendered the scene in one pass. Objects are all the same distance from my camera, some are overlapping. I want to use the multimatte to CC my 2D elements, similar to the above post (I never resolved those AA issues either).

DeepRead > DeepColorCorrect (multimatte channel, 0 gain on the G and B channels) > DeepToImage > Shuffle the multimatte R channel to 'redshuffle.r'

The above isolates my red channel from the multimatte pass, which is then piped in as a mask to a Grade node in my 2D stream.

This appears to work to selectively allow drastic color correction to overlapping objects.
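Per pixel, feeding the matte into the Grade node's mask input amounts to a blend between the original and the corrected colour - roughly this, in plain Python (hypothetical values; a gain-only grade for simplicity):

```python
def masked_grade(rgb, gain, mask):
    """Blend a gained copy over the original by the matte value, as a
    Grade node does when a matte is piped into its mask input."""
    return tuple(c * (1.0 - mask) + c * gain * mask for c in rgb)

pixel = (0.5, 0.25, 0.25)
print(masked_grade(pixel, 2.0, 1.0))  # fully inside the matte → (1.0, 0.5, 0.5)
print(masked_grade(pixel, 2.0, 0.0))  # outside the matte → (0.5, 0.25, 0.25)
# At an antialiased matte edge the mask is fractional, so the grade is
# only partially applied - which is exactly where halos creep in if the
# RGB was still premultiplied when graded.
```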

However, I get a light halo on any objects that were rendered against a black background which have been CCed using this multimatte. All overlapping rendered objects appear to be halo-free.
  10 October 2013
Turns out that there's no difference between regular shuffle of multimatte and the over complicated method I tried. Also I had forgotten to unpremult before my grade node.

On closer inspection there are still halos around my CCs. Is there a workflow to eliminate these?
  10 October 2013
Thread automatically closed

This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.