View Full Version : Stereoscopic rendering problems


pixelranger
10-26-2004, 09:32 PM
Hi folks.

For an upcoming project I'm required to render out a cinema commercial with a stereoscopic effect like you've all seen in Spy Kids 3D. ;)
I've done a couple of tests with LW's internal stereoscopic rendering features, and although it looks great, it seems to leave out the "left" alpha in the resulting image. That makes it difficult to comp in post.

Is there a way to render out the composed stereoscopic images and keep both alphas?

Rei Ayanami
10-26-2004, 10:02 PM
You could do it manually and have two cameras parented to a null and targeted to the same null. That would allow you to tweak each separately too. I don't know if LW's native stereo is faster than rendering it twice, though.

scotttygett
10-27-2004, 03:43 AM
I know what you mean; I almost mentioned that in the "How to Improve: Renderer" thread, but red/blue is so archaic.

I heartily discourage full-color red/blue renders, since the full 3D effect is so compromised by the color leakage. That SHOULD mean that all I have to do is stack the black & white filter with the anaglyph filter and, voila, I get a finished red/blue movie. But this works about one frame in 100, and I don't know if it's a program bug, my ignorance, or a graphics card issue.

Here's what I wind up doing: first, I render the sequence in 3D. You can do it in separate folders if you like. Then I make a separate copy in black and white.

I create a scene file with Anaglyph Compose rendering to QuickTime, where the interocular separation of the two eyes is huge. I have two front-projected image cards (0 diffuse, 100 luminous), one for each sequence, small enough not to overlap but close enough to fill the view. I think the camera's default will be the position between the two, so it may take a little trial and error, or making a ruler out of nulls.

I suspect that DVD image quality makes possible better red/blue 3D than ever before. Ah, the lost aesthetic of watching a movie with red and blue filters...

pixelranger
10-27-2004, 08:33 AM
Thank you very much for the quick replies!

Will QuickTime fix the alpha problems? What about the front-projected cards? Do they have the prerendered sequences' alpha channels controlling the cards' transparency/alpha?

I don't completely see the difference between that and rendering the sequence once with the Anaglyph Compose filter applied.

geoff3dnz
10-27-2004, 09:52 AM
You could do it manually and have two cameras parented to a null and targeted to the same null. That would allow you to tweak each separately too. I don't know if LW's native stereo is faster than rendering it twice, though.

Actually, the cameras should be parallel, i.e. both pointing straight ahead, not targeted at the same null.

scotttygett
10-27-2004, 10:09 AM
It's a while since I've done it. Never mind about rendering to Quicktime. My bad. Render image sequences.

For a scene render file named scene.tga, the plug-in will create two batches of images: SceneL001.tga, which will apparently not be combined with the right image, and SceneR001.tga, which will be your red/blue combined sequence. Since the R and L are added before the numbers, this makes loading image sequences in or out of LightWave pretty simple.

I have to apologize; when I read your post initially, I thought it was much less technical. Did you rewrite it? As for enabling or disabling alpha images, I know very little about these hidden mask images -- don't ask me. I use zipped sequences when I can, which is supposed to be most of the time.

Adding a null at a convenient position for the camera to use as "target" makes a world of difference in 3D quality, as well, and there should be some threads about 3D at CGTalk and elsewhere. (If you don't use a Null, everything will be poking out of the monitor, and nothing will be inside it, unless you crop and composite. It pays to keep some stuff poking through; it seems to read better.) I'm going to skip this debate, since I don't know if you need it. For cinema projection, you may be surprised that the dimensionality can look much different from the monitor though.

I prefer to add a black & white desaturation stage for what should be a pretty obvious reason: the stereo effect suffers when either eye sees both eyes' images. Your plan to do a lot of compositing sounds on track.

It could be that the LightWave plug-in is more sophisticated than I realized, but I think that if you want one eye to see green/yellow as well as blue, you need to copy that violet/blue/green/yellow spectrum to the red eye. On conventional monitors, if I remember my photometry, this is a barely noticeable effect. (I think a man named Gibson had a lot more success with a film system, since film captures deep violet, which can be fluoresced or art-directed to yield false-red full color in the blue eye. This isn't an option with TV or projection TV, and really, it's something you have to force with RGB CGI software. And the Gibson method leans toward one eye knowing it's not seeing color, though one can try to art-direct around it.)
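Scott's desaturate-then-combine recipe can be sketched outside LightWave as well. Below is a minimal NumPy sketch (the `make_anaglyph` helper is hypothetical, not an LW plug-in; it assumes 8-bit RGB frames already rendered for each eye): the red channel comes from the left eye, green and blue from the right.

```python
import numpy as np

def make_anaglyph(left, right, desaturate=True):
    """Combine left/right RGB frames (H x W x 3 uint8 arrays) into a
    red/blue anaglyph: red channel from the left eye, green and blue
    from the right. Optionally desaturate first, as Scott suggests,
    so neither eye sees the other eye's colors."""
    left = left.astype(np.float32)
    right = right.astype(np.float32)
    if desaturate:
        # Rec. 601 luma weights; going to gray first avoids color leakage
        w = np.array([0.299, 0.587, 0.114], dtype=np.float32)
        left = np.tensordot(left, w, axes=([-1], [0]))[..., None].repeat(3, -1)
        right = np.tensordot(right, w, axes=([-1], [0]))[..., None].repeat(3, -1)
    out = np.empty_like(left)
    out[..., 0] = left[..., 0]    # red   <- left eye
    out[..., 1] = right[..., 1]   # green <- right eye
    out[..., 2] = right[..., 2]   # blue  <- right eye
    return out.clip(0, 255).round().astype(np.uint8)
```

Doing the mix yourself in a compositor also sidesteps the missing "left" alpha problem from the original post, since you keep both source sequences with their own alpha channels.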

What a bunch of 3D video films like "Spy Kids" have done is art-direct toward red/blue palettes: grays, beiges, and little accents that aren't caught in stark profile, with pale reds that are visible in the blue eye as green/yellow. Colors like blue go "metallic." I don't like the look. Some think it's fine.

Please yourself, and your audience. Chances are the image will look too dim anyway, so you'll process it lighter and the color won't be much of an issue. If you're going to film, the extra effort can be rewarding. Don't arbitrarily mix xenon screening rooms with tungsten projection, as the results will differ, 3D-wise. Some would argue for retiring monitors that drift too much during the workday, but I don't know that I'd go that far; the eyes adjust to a lot.

Have fun -- 3D is such a roar.

Rei Ayanami
10-28-2004, 11:51 PM
Actually, the cameras should be parallel, i.e. both pointing straight ahead, not targeted at the same null.

Yes, I was thinking of how my eyes work, not how computers work (I think my eyes aim toward a common target).

SplineGod
10-29-2004, 04:50 AM
Actually, I'm working on a project that's being done in 3D. You have to worry about two things:
1. Interocular distance (the distance between the eyes)
2. Convergence - where both cameras target. It's the focal plane. Anything between that null and the cameras will appear to come out in front of the screen.

You don't want the cameras to be parallel; you actually want them focused on that convergence point.

scotttygett
10-29-2004, 08:25 AM
Larry, you dog! Rock on!

I seem to have briefly seen a technical paper about toe-in. The gist probably had to do with lateral sliding and re-adjusting things. There's an area of optics where wizards show that a 5 mm lens on a 4x5" photo plate and a 25 mm lens on 35mm have the same depth of field when you crop the film -- really unexpected stuff. So if geoff3dnz says to keep them parallel, I'm open to that, though most of us choose to accept the proscenium as the "window" plane and settle on toe-in at the camera.

(I just tried parallel, and it looks great, though it requires cropping.)

www.misteranimation.com has one page with an anaglyph 3D background, but since nobody's supposed to know the page exists, it's just a cyber-parking-pylon. All the 3D is behind the window, but for a title sequence that comes after that image, I put as much as I could ahead of it.

If you don't have the proscenium at the window plane, you either need a really big screen like IMAX recommends -- they're rightly credited with showing that huge-screen 3D is more aesthetically successful -- or to make your peace with the edges looking a little odd when stuff appears cut off. The Stereoscopic Society of Southern California used to show 3D color slide presentations monthly and stuck with 1950s-style windows, and the realists used parallel lenses, but shifted like view cameras. The still-projection equipment lets you adjust each picture as you go along, if you want.

Art direction of stereo 3D is fun; when you're not adding scenic interest or sneaking Pulfrich effects in, you're contriving something goofy like morphing the landscape to tilt like a stage rake.

geoff3dnz
10-29-2004, 09:15 AM
Actually, I'm working on a project that's being done in 3D. You have to worry about two things:
1. Interocular distance (the distance between the eyes)
2. Convergence - where both cameras target. It's the focal plane. Anything between that null and the cameras will appear to come out in front of the screen.

You don't want the cameras to be parallel; you actually want them focused on that convergence point.

Well, you do want the cameras parallel if you're rendering out imagery for 3D lenticular printing (I was doing it for 6 years). Having the cameras converge on a common target doesn't produce decent results. Perhaps it's different for film.

SplineGod
10-29-2004, 10:24 AM
Thanks Scott! :)
Your eyes don't stay parallel unless you're looking at something quite far off, and then you don't get a 3D effect. A simple way to test this is to render left and right eye views and map them onto a couple of polys side by side. You can view them cross-eyed and get a nice stereo 3D effect. :)
Geoff,
I haven't done any lenticulars. Definitely for film you want the cameras to converge on a single spot. Usually you leave the interocular distance the same, but sometimes you have to cheat that too.
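The cross-eyed test above is easy to build as a single image too. A minimal NumPy sketch (hypothetical `crossview_pair` helper; assumes 8-bit RGB frames): for cross-eyed free viewing the right eye's frame goes on the left side, and vice versa.

```python
import numpy as np

def crossview_pair(left, right, gap=16):
    """Side-by-side pair for cross-eyed free viewing: the RIGHT eye's
    image goes on the LEFT (and vice versa), with a black gap between
    so the two frames are easy to fuse."""
    h = min(left.shape[0], right.shape[0])  # match heights defensively
    spacer = np.zeros((h, gap, 3), dtype=left.dtype)
    return np.concatenate([right[:h], spacer, left[:h]], axis=1)
```

Swapping the two arguments instead gives a parallel-viewing (wall-eyed) pair, which some people find easier to fuse at small sizes.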

Eugeny
10-29-2004, 10:53 AM
Well you do want the cameras parallel if you're rendering out imagery for 3d lenticular printing (I was doing it for 6 years). Having the cameras converge on a common target doesn't produce decent results. Perhaps it's different for film.

No, it's not different; the best results can be achieved with parallel cameras. The funny thing is that the eye separation depends on the size of the objects, so sometimes it works well at 6-10 cm and sometimes at 5-6 m. It's also possible to change the eye separation on the fly (we do that with parallel cameras) to achieve the best stereo effect.
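Since the right separation depends on the scene, a common starting point (not from this thread; the stereographer's rough "1/30 rule" of thumb) is an interaxial of about the distance to the nearest subject divided by 30. A tiny sketch, hypothetical helper:

```python
def suggest_eye_separation(nearest_dist, divisor=30.0):
    """Rough starting interaxial via the '1/30 rule': separation is
    about the distance to the nearest subject over 30. Only a starting
    value -- as noted above, you still tune (or even animate) it per
    shot, in the same units as the scene."""
    return nearest_dist / divisor
```

This matches Eugeny's observation qualitatively: a tabletop-scale scene lands in the centimeter range, while a landscape-scale scene can call for meters of separation.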

One thing I can suggest is not to use LW's native stereo render: make a camera, use the stereoscopic render to find the best eye separation, and then add left and right child cameras with X positions based on the eye separation (obviously each camera gets half of the eye separation number).
I can't say anything about Anaglyph Compose rendering to QuickTime because we use the Stereo Compose plugin (which comes with LW) and the QV image viewer (search Flay) for testing. Then we make two scenes with left and right cameras, with output to different folders. If we need a blue-and-red stereo pair, we compose it in After Effects (the cult 3D plugin for After Effects 5-5.5, or Channel -> 3D Glasses in After Effects 6). These two are better than LW's native stereo compose because you can exaggerate the eye separation and make the blue and red more or less saturated.

geoff3dnz
10-29-2004, 11:15 AM
Using 'converging' cameras produces a pseudo-rotational effect (most significant at the point they converge on), which is incorrect. Using parallel cameras will produce an infinite 3D effect, as opposed to one that exists only between the cameras and the focal point.
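With parallel cameras, the window plane is instead set afterwards by shifting and cropping the two frames toward each other (horizontal image translation, the cropping Scott mentioned earlier). A minimal NumPy sketch (hypothetical helper; assumes the left camera sits at -x, so near objects shift right in its frame):

```python
import numpy as np

def set_window_plane(left, right, parallax_px):
    """Horizontal image translation for a parallel-camera pair: crop
    `parallax_px` columns from opposite edges so that objects with that
    on-screen parallax land at zero parallax (the window plane),
    without the keystone distortion that toe-in introduces."""
    p = int(parallax_px)
    if p <= 0:
        return left, right
    # Drop p columns from opposite sides; both frames stay equal width,
    # effectively sliding the two eyes' images toward each other.
    return left[:, p:], right[:, :-p]
```

The trade-off, as noted earlier in the thread, is that you lose `parallax_px` columns of width, so parallel rigs are usually rendered slightly over-width to leave room for the crop.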

scotttygett
11-05-2004, 09:14 AM
I was just thinking, there might also be some approaches where you would plan a palette, since I think there are quite a few graphics filters based on the original GIF standard, where you would sweat over choosing/designing each of the 256 colors you got to use.

MPEG/DVD compression is lossy, so it couldn't hurt to do a little of this, and one either would or wouldn't go along with having the red-filtered eye see equal shading. The one peculiarity of cone vision is that red doesn't have a rod component, so your eyes don't adjust to it. This is why dark dials are red, so that one can switch the lights off or on: red is always the color red. You would think this would encourage 3D folks to make the green-sensitive portion part of the red, so that the full-color eye would be the red-lensed eye, but I've seen maybe one filter in a dozen that did this. Usually the blue eye is the full-color eye, despite the tendency of violet to give a black & white sensation.

CGTalk Moderation
01-19-2006, 05:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.