David, out of interest, where are you applying your view LUT?
Although I understand what you are saying, you’ve just left me more confused now.
Essentially I started this thread so we'd have a simple, step-by-step workflow when using MR and the new colour management feature for 2011.
Could you, or someone else working the same way you guys are, jot down the settings you are using? I think you've already explained the why.
(I imagine you use the Render View to see the corrected images but ultimately correct in post)
@DutchDimension: as RagingBull correctly guessed, I’m using the VrayFrameBuffer viewer LUT to preview my renders, which are linear 16-bit float EXRs; these are then composited in linear and only gamma corrected in the final stages, if the output format requires it.
@RagingBull: I understand your confusion. I’ve posted my “settings” on numerous occasions, and if I look back at those posts I can see inconsistencies. The problem with trying to distill the LWF idea into a few simple steps is that there are so many ifs, buts, and exceptions. Knowing the settings is not the main issue; understanding how LWF works is more important. Once you do, you can work out the settings using some simple tests and apply them in any application. Forgive me for not answering your question directly.
That’s the nail on the head right there.
I took MasterZap’s excellent MR class at fxphd when it originally started, and I’m going back through the lessons now but using Maya. So for me personally, I just want to get a workflow for CG/live action nailed down working at home, using Nuke/AE to comp and tweak later. I understand that working in a large facility is going to have its own method anyway, which is why I understand David’s point about ‘bad habits’.
Perhaps if someone has the time, a brief explanation of different uses for LWF, and suggested settings, would be more appropriate?
I think for now I’m going to stick with the ‘old’ way of doing it with gamma nodes and Justin’s script, as it’s outlined here:
One of the reasons I mention the bad habits part is that facilities indeed have their own way of doing things. (In some cases they feel anything BUT their way is “wrong” and that gets old.)
So it’s best not to get caught up in linear workflow being just a gamma issue. It can be more complicated than that. But the workflow is the same.
The point of linear workflow is that you get out of the process what you expected by putting the correct data into it.
Paint in sRGB (perceptual space) -> linearize -> render through LUT -> comp linear passes (or neutralize non-linear ones)
You are most likely going to paint a texture in Photoshop in sRGB space (be sure your Mac is set to gamma 2.2). You then need a linear texture to render with. You can do that in Nuke with a Colorspace node, or you can save out an EXR from Photoshop, which will linearize it for you (floating point is assumed to be linear, but it isn’t always; in this case you’re generally safe).
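The “linearize” step above is just the inverse of the piecewise sRGB transfer function, which is the same math a Colorspace node applies. A minimal sketch in Python of the two standard formulas (the function names are mine, for illustration only):

```python
# Piecewise sRGB transfer functions (per the IEC 61966-2-1 spec).
# This is the math behind "linearize"; function names are illustrative.

def srgb_to_linear(c):
    """Decode an sRGB-encoded value (0-1) to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear value (0-1) for sRGB display."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

# A painted mid-grey of 0.5 is only ~0.214 in linear light, which is
# why un-linearized textures render too bright under a linear pipeline.
print(round(srgb_to_linear(0.5), 3))
```

Note the two conversions are exact inverses, so a float texture survives the round trip without loss.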
Use those linearized textures to render. Now, your swatches will be in linear space in Maya. That’s a catch but you’ll get used to it. In most places we don’t have swatches at all.
Render through the correct LUT so you can view it correctly. Render to 16-half EXR.
Comp in Nuke (or whatever) while VIEWING through the LUT but working with linear files.
This is important because there will be cases where your destination colorspace isn’t a straight gamma change; your primaries, etc., might be different. With this workflow, when you linearize your textures you do so through the inverted LUT. But your workflow is still the same.
If you change packages or renderers, your workflow is… still the same. On top of that, you have a library of linearized files ready to go, paired with your originals, should you make a change.
In case you’re wondering how this might work: for Hereafter the colorspace was a specific LUT viewed on a Dreamcolor monitor in DCI P3. This meant we were viewing the images “exactly” how they would be viewed on a correctly calibrated movie screen. This way there were no surprises and all our detail would make it to the screen. But we also viewed the images on an HDTV with a Rec. 709 colorspace, so we had to take that into account as well. So the same workflow applied, but we had to make sure everything looked right depending on where it was going.
Many thanks for the clarification about gamma changes and workflow issues, much appreciated !
What didn’t help for me is that I haven’t touched any rendering for a while now, so I knew I had issues before but wasn’t sure exactly what I did. I’m happy to use what others have laid out pretty clearly with correction nodes for now.
At least there are some tools which speed that process up:
One thing I find with students is they break and exploit any shortcoming of a piece of software. Ultimate testers, in a way. It would be great to have at least a gamma 2.2 option in the color management system, so that if a linear EXR pipe is used, switching the lens shader on and off becomes one less hassle, as it would no longer be required.
In our productions, though, I usually write a script that switches the gamma correction off and on as pre- and post-render MEL when batching out an EXR.
This would also allow simpler 8-bit workflows where the gamma correction is rendered into the image, letting us change gamma (not de-sRGB) for textures from the management system and still use the lens shader’s gamma control for output.
Seeing as 2012 is now out… can anybody have a look into whether anything at all has changed? I’m still a few weeks off using it. Does this system now work in producing matching inputs and outputs for sRGB, or has a gamma LUT perhaps been added?
I thought I’d ask this here as it’s as good a thread as any to ask.
Have you or your colleagues made any changes in Photoshop’s ‘Color Settings’ manager?
No, in fact I turn color management off when possible. Photoshop does a lot of things behind the scenes that get confusing, so I remove the chance it will mess me up. You would then be painting under your system gamma, I believe, which should be 2.2 (make sure your Mac is set correctly).
If someone has different or more beneficial insight there, it would be great. But for now I just turn it off.
Bitter, by “turn off” do you mean setting it to monitor color?
I generally select “no management”
I might need to revisit that in a different circumstance, but this is what we did in a pipeline I worked in that seemed to be correct. (Where I am now has an… interesting idea of a linear color pipeline.)
I’m trying to get my head around what David said.
I understand the linear workflow in Maya 2012.
I’ve got my image set up as 32-bit EXR, and my output set to linear in the Color Management tab.
Now, if I’m going to do post work, in Nuke for example,
I need to make sure that I have the proper LUT for the screen that I’ll view on.
What if the output is going directly to a TV commercial?
Do I even bother rendering out EXR files?
To recap my workflow in 2012:
1. Select sRGB as your Default Input Profile.
2. Select Linear sRGB as your Default Output Profile.
3. For the Framebuffer, select RGBA (Float) 4x32 Bit.
4. For File Format, select OpenEXR.

While working in Maya, to view my linear renders:

1. Window > Settings/Preferences > Preferences, and under Rendering, select 32-bit floating-point (HDR) under Render View image format.
2. In the Render View window, select Display > Color Management.
3. Select Linear sRGB as the Image Color Profile.
4. Select sRGB as the Display Color Profile.

At this point:
A - If I want to go directly to TV (no post process):
Do I change the output back to sRGB? And have I now lost all my work?
B - If I go to Nuke, the images are “washed out” because they are linear; we can use the Colorspace node to view them “correctly”,
then write out as sRGB because we’re going to TV?
If you are going to do any post work then you are best with an EXR in linear colorspace.
Nuke also performs operations in linear colorspace just like a renderer does.
Then output to the desired colorspace from Nuke, not Maya. For TV it’s generally sRGB, but for HD it can be Rec. 709, which is similar but has a different gamma.
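To make the “similar, but a different gamma” point concrete, here is a small sketch comparing the sRGB encode with the Rec. 709 camera transfer function, using the published formulas from the respective specs (function names are mine):

```python
# sRGB (IEC 61966-2-1) vs. Rec. 709 (ITU-R BT.709) encoding curves.
# Both are roughly "gamma-ish" but not interchangeable.

def srgb_encode(L):
    """sRGB display encoding of a linear value (0-1)."""
    if L <= 0.0031308:
        return 12.92 * L
    return 1.055 * L ** (1 / 2.4) - 0.055

def rec709_encode(L):
    """Rec. 709 OETF for a linear scene value (0-1)."""
    if L < 0.018:
        return 4.5 * L
    return 1.099 * L ** 0.45 - 0.099

grey = 0.18  # linear middle grey
# sRGB pushes 18% grey to ~0.461, Rec. 709 to ~0.409 --
# close enough to look similar, far enough apart to mismatch.
print(round(srgb_encode(grey), 3), round(rec709_encode(grey), 3))
```

This is why an image encoded for one and viewed under the other looks slightly too dark or too bright, even though both are “gamma corrected”.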
BTW, you can output to EXR RGBA 16 half instead of 32 and get nearly the same depth as before but a smaller file.
I edited my post to better describe the Q
thanks for looking
You can change the output to sRGB and not lose the internal linear rendering if you do this. That should be fine if you are just rendering a beauty. Your output can be somewhat arbitrary if you aren’t doing post work, as long as the rendering is in linear colorspace.
If you rendered to floating-point EXR files but with sRGB output, then you can use the Colorspace node in Nuke to “neutralize” that. It’s not recommended, but it would be better than nothing at all.
Seems you have a pretty good idea of how this works.
Thanks a lot, David!
I guess my problem was trying to view the rendered 8-bit images in fcheck… and it cannot display linear!
I need to look for another viewer
Any recommendations ?
but it did not work for me; I think Windows Photo Viewer was good.
Here is my render and test file
TGA still came out as sRGB; it removed my light decay.
TIFF 16: same.
TIFF 32 was fine.
Viewing a single frame is easy enough in imf_disp that comes with Maya.
For motion, try djview like DJX said, Framecycler, or Nuke (a variation of Framecycler). Chaosgroup also has PDPlayer.