LWF in Maya 2011 - Ultimate FAQ ! ;)


Hi guys,

  Don't be scared by the length of this post. It isn't just a normal question; this thread is more of a FAQ summary on the topic "Linear Workflow in Maya". I know there are already some good tutorials on this topic, but this FAQ is meant to complement those articles.

  Also sorry for my bad English! :blush:


  Over the last few weeks I've read a huge number of articles, threads and postings about LWF. There are a few very good tutorials on how to simply set up Maya for LWF. But my goal isn't just the setup, like "click this and that"; I also want to understand the theoretical background of this issue.

  Today I can say: the whole issue is confusing and complex, and some of the threads and postings about this topic are simply wrong or inaccurate!

  Because of this, I want to build my own workflow with all the theoretical knowledge behind it, resulting in a correct implementation in Maya 2011 (+ mental ray).

  Despite the very good tutorials from Andrew Weidenhammer, Zeth Willie and Fredrik Averpil, some steps are still unclear to me.

  I hope you can all help me and other users "bring light into the last darkness" regarding LWF, both in theory and in its practical implementation in Maya/mental ray 2011.


  Ok now let’s begin

… For better understanding I created a simple scene in Maya, as you can see in the picture "scene" (zip file): five small spheres (red, blue, green, wood texture, chrome), lit with IBL and an HDRI from Dosch Design, plus a simple directional light that creates the shadows. On the floor there is a plane with a Use Background shader applied. The colors are meant to represent defined standards (like corporate-design RGB values); you can see these defined color values in the picture "color_values". The picture "colors" shows all the mia_materials (both uncorrected and gamma corrected).

  [b]Now the questions and the problems while working with LWF in Maya: [/b]

[li]Put simply, there are three options for doing LWF in Maya: 1. the framebuffer method, 2. the gamma-node method, 3. working with Color Management. The first is simple, but there are situations where problems can occur, and it's also a little unclear what's going on in the background. The last one (CM) is also very unclear and, I think, still buggy. I prefer the gamma-node method. OK, it's time-consuming, but I think it's still the best and most reliable Maya LWF for now. Is this summary correct?[/li]
[li]Look at picture "hdri_open". It shows a screenshot of the HDRI image just opened in Photoshop CS5. As you can see, the Levels adjustment is still set to 1; no changes at all! Why does the HDRI look washed out? It looks as if it has been gamma corrected twice. I thought an HDRI image should look too dark, because of its linearity, when shown on a normal "nonlinear" display.[/li]
[li]"A 32-bit HDRI is always linear and needs no gamma correction." I've read this sentence so often, but I don't understand why. I can do a gamma correction in Photoshop with Levels (look at picture "hdri_correct") or with the exposure control. Are these adjustments just previews on my monitor, so the linear character of the HDRI is preserved, or what?[/li]
[li]www.hdri-locations.com delivers high-quality HDR images. Their FAQ states: "all our hdri have gamma 2.2".

…What the hell…?!! HDRI = always linear? HDRI = sometimes gamma corrected?
…My mind is in a haze.
Is this the same problem with my Dosch Design HDRI?[/li]
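Part of the confusion above can be untangled with plain math: a gamma encode is just a power function, so it is fully reversible. Here is a minimal Python sketch (pure math on a made-up pixel value, no image I/O):

```python
def encode_gamma(linear, gamma=2.2):
    """Gamma-encode a linear-light value (exponent 1/2.2 ~= 0.4545)."""
    return linear ** (1.0 / gamma)

def decode_gamma(encoded, gamma=2.2):
    """Remove a gamma encoding, returning the value to linear light."""
    return encoded ** gamma

linear_pixel = 0.18                   # "middle grey" in linear light
encoded = encode_gamma(linear_pixel)  # ~0.46: looks brighter / "washed out"
recovered = decode_gamma(encoded)     # back to ~0.18

print(round(encoded, 3), round(recovered, 3))
```

So a vendor shipping "gamma 2.2" HDR files hasn't destroyed anything; the data simply needs a 2.2 de-gamma before it is linear again.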
[li]Look at picture "no_lwf". This is the "normal" look of an image with no LWF; the "incorrect amateur" look of a render image. No gamma nodes, no lens shader, no questions :)[/li]
[li]Look at picture "LWF". This is an 8-bit .jpg produced with a linear workflow: all the colors and textures are gamma corrected with 0.454 gamma_nodes, and a lens shader (mia_exposure_simple) with gamma 2.2 was applied. The colors look correct and the light travels correctly through the scene, but the HDRI looks washed out. Why?![/li]
[li]Now I batch-rendered an image in .hdr format, so I disconnected the lens shader (alternatively, set the gamma in the lens shader to 1) and set the framebuffer to 32-bit float. I opened this linear picture in Photoshop, and the result is screenshot "LWF_inPS". I expected a dark image (because a linear picture is being shown on a nonlinear TFT display), but the picture looks correct (except for the washed-out HDRI background)?! What is PS doing here?![/li]
[li]In all the very good tutorials from Zeth, Andrew and Fredrik, they render an image in .exr format and set the framebuffer to 32-bit float. Why? I thought .exr was 16-bit (half)?[/li]
[li]By doing tone mapping I bring a 32-bit image into "normal" 8-bit space. This is necessary for viewing it on a normal output device (TFT, television etc.). This is the step where a gamma correction / gamma curve (a power-law function with exponent 0.454) is applied. Is this correct? And what does "to bake the gamma in" mean?[/li]
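"Baking the gamma in" means the 2.2 curve is written permanently into the stored pixel values, typically during the float-to-8-bit conversion. A hedged sketch of that step (plain Python, input assumed already tone-mapped to 0-1):

```python
def bake_to_8bit(linear, gamma=2.2):
    """Clamp, gamma-encode (exponent 1/2.2 ~= 0.454), then quantize.
    After this the curve is 'baked in': the stored bytes are no longer
    linear light, and the precision lost to rounding cannot be recovered."""
    v = min(max(linear, 0.0), 1.0)
    return round((v ** (1.0 / gamma)) * 255)

print(bake_to_8bit(0.0), bake_to_8bit(0.5), bake_to_8bit(1.5))  # 0 186 255
```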
[li]To understand all the differences between bpp (bits per pixel) and bpc (bits per channel), 32-bit integer and 32-bit float images, OpenEXR and RGBE (.hdr) and so on, the best source would be The HDRI Handbook by Christian Bloch, right?[/li]
Feel free to extend this FAQ with your own questions!

  Many many many thanks to all of you for your help!!!

  Best regards



Ok, so. . .let’s simplify the workflow some.

In Photoshop you work in perceptual space (sRGB). You can save this file as EXR after changing it to 32-bit. This linearizes the file (most software assumes 32-bit data is linear).

Render with that as your texture. I hear people complain that this means the swatches aren't correct… honestly, I don't care. I've never used swatches as my final say on "look".

Render and view through your LUT (or a gamma-corrected viewer in your destination colorspace). Try not to have your 3D application correct your colorspace (no gamma nodes, no LWF switches, etc.). Feed and output linear information with your 3D package. This means your workflow is now software-independent. So of your three options… I choose option 4.

Your textures are usually normalized (0-1) since they are diffuse textures, which means they do not emit light. HDRs and emissive textures will have higher values (1+), which is why they can still appear "blown out" despite being linear.

Render to 16-half. You sacrifice some information but it is still “floating point” and much smaller than a 32-bit file. This means a lot when working on hundreds of shots with hundreds of frames. You run out of space fast.
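Two claims in the last paragraphs can be checked with Python's struct module, which supports the IEEE half-precision format: super-white values above 1.0 survive in 16-half, but precision drops to roughly three decimal digits (a sketch):

```python
import struct

def to_half(x):
    """Round-trip a Python float through 16-bit half precision ('e' format)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_half(4.0))   # an emissive "super-white" value survives exactly: 4.0
print(to_half(0.1))   # slightly off 0.1: half keeps only ~11 bits of mantissa
```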

Do not change your framebuffer gamma. This applies an inverse curve to everything going into the renderer. Bad idea. Pretend that setting isn’t there.

  1. Paint in Photoshop (sRGB)
  2. Save as EXR (linearized). You can now also use a utility to make the EXR tiled (cached) for memory purposes; an added bonus.
  3. Use as texture and render to 16-half.
  4. View through something that is in your destination colorspace, be it sRGB or a film LUT.
  5. Composite in linear, output to your destination colorspace. (Reminder, some file types have a specified colorspace, etc. DPX files are standard output for delivery and logarithmic. But you should still render to something linear like an EXR to work with.)
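One nuance worth adding to the steps above: the sRGB transfer function defined in IEC 61966-2-1 is piecewise (a linear toe plus a 2.4-exponent segment), not a pure 2.2 power curve, although the two are close. A sketch of the exact linearization:

```python
def srgb_to_linear(c):
    """Exact sRGB decode (IEC 61966-2-1) for a component in 0-1."""
    if c <= 0.04045:
        return c / 12.92                    # linear toe near black
    return ((c + 0.055) / 1.055) ** 2.4     # power segment above it

# Close to, but not identical with, the simple 2.2 approximation:
print(round(srgb_to_linear(0.5), 4), round(0.5 ** 2.2, 4))
```

For most rendering purposes the 2.2 approximation is fine, which is why gamma nodes at 0.454 work in practice.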


To be honest here I still don’t understand the whole linear workflow thing.

If I render everything in Maya and then maybe ask a friend to do some compositing, why do they need it? If the image looks correct as a final render, they can tweak it in post as needed. Why do I need LWF just to pass someone an image that is too dark?

I watched those tutorials on Vimeo by Zeth, and even though they explain a lot they are still not clear. He uses quadratic decay on his lights to get physically accurate falloff, which I tried, and it looks nice.

Now, most of the Digital Tutors DVDs about lighting that I have don't mention LWF at all, e.g. "MR - Rendering Techniques - Interiors", so I'm confused about how I can learn to light and produce my final image when my short movie is done.
Every tutorial talks about different things, and it only gets confusing.

For example, why do I need to put a gamma correction node on a mia_material_x when there is no texture? It's a shader I grab from the library in Maya that should be ready to go, so I just tweak the colour etc. and hit render, right?

Is there someone who would be able to do a tutorial on the process of how to actually shade, light and render, so people who want to focus on the creative side rather than the technical side can get their heads around it?

I think a lot of people would really appreciate it.
Thank you


Basically: renderers operate with linear math. 2+2=4. When you provide data (textures) that are not linear, a curve has been applied to them; generally the sRGB colorspace, which is designed for our eyes, not our computers. This means your materials will respond to lighting and material properties incorrectly, which makes the job of lighting and getting a physically plausible look more difficult.
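The "2+2=4" point can be shown numerically: doubling a light only doubles the pixel value if the math happens on linear data. A sketch with made-up values:

```python
gamma = 2.2
linear = 0.2                          # a pixel in linear light

doubled_linear = 2 * linear           # correct physics: 0.4

encoded = linear ** (1 / gamma)       # the same pixel, gamma-encoded (~0.48)
doubled_in_gamma = (2 * encoded) ** gamma   # "doubling" the encoded value

print(doubled_linear)                 # 0.4
print(round(doubled_in_gamma, 3))     # ~0.92: far too bright, washed out
```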

Linear images look dark because you are viewing them in linear space. Your eyes do not operate in linear space. In fact, since the dawn of television (the NTSC colorspace), displays have corrected the information being shown to work for your eyes.

When your friend needs to composite your work, they need linear data because, again, the compositing software operates in linear space. They should be viewing the image in perceptual space as they work.

It’s like putting the wrong gas in your car. Your car will still get you where you are going, but you’re not doing your car any favors by giving it the wrong kind of gas.


I'll try to tell the tale of LWF, so please correct me if I'm wrong, because this is a topic where people constantly get confused, including me. :argh:

Images shot with "early broadcast cameras" were not displayed right (linear)
on "early monitors" due to an input-output conflict with video cards etc…

So a smart guy called Gamma (joking) came up with the idea of color correcting the image so we see things right (linear) on those monitors. It was a simple color correction that readjusted the nonlinear input/output curve in the opposite direction to get things to look straight (linear) again. :wip:

The value that got things linear/straight/right was about 2; later everyone agreed on 2.2.
So practically every image that was digitally shot and destined to be displayed on some kind of monitor got this gamma-correction treatment. An early technical limitation evolved into an industry standard that confuses most digital artists on earth today. :cry:

Today, when we browse our texture library, we must know that our images have been color corrected for us, so that our eyes perceive them correctly on our monitor. :surprised

But renderers don't know anything about these human-perception problems and treat textures as they are. So we have to remove the color correction that is automatically applied by all digital cameras today if we want our renderer to calculate things correctly.

I wish somebody would replace all digital cameras and monitors (which inherit 50-year-old technical limitations) with ones that don't apply gamma correction / don't display gamma-corrected images. But until that happens, we have to adopt some kind of linear workflow.

First thing to know: "don't adopt any LWF if you are not rendering photoreal images."
Don't adopt LWF if you output every aspect of your image as 32-bit passes. :lightbulb

If your project requires photorealism and you don't output everything as 32-bit passes,
do the following: :wise:

  • Make sure your framebuffer display setting is set to sRGB 2.2 (default for most)
  • De-gamma all your textures (in Photoshop, apply the inverse of 2.2, i.e. a gamma of about 0.45)
  • De-gamma all your color swatches (decrease saturation values of colors swatches)
  • What you see in your framebuffer will be correct both from a human and renderer POV
  • output as 16 bit half or 32 bit float exr for further linear composition purposes
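To put numbers on the de-gamma step above: with the simple 2.2 approximation, a hypothetical mid-grey swatch of sRGB 128 corresponds to only about 22% linear light (a sketch):

```python
def degamma_8bit(value, gamma=2.2):
    """Convert an 8-bit sRGB-ish value to linear light in 0-1."""
    return (value / 255.0) ** gamma

print(round(degamma_8bit(128), 3))   # mid-grey: ~0.22 in linear light
print(degamma_8bit(255))             # white stays white: 1.0
```

This is why de-gamma'd swatches look darker and less saturated than the values you typed in.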


Your story needs a few corrections I’m afraid. I’ll try to be brief.

I disagree. I think you will benefit from understanding and using a LWF no matter what you are doing.

This is wrong. Bad advice.

I would reword this. Your framebuffer (we're talking the Render View or V-Ray frame buffer) should either be set to use a LUT for sRGB, or you need to use a lens shader on your camera to let you preview in sRGB. None of these things are default, I'm afraid.

De-gamma all your 8-bit textures. Either do it in Maya using a gamma correct node, or via color management, etc. Do not save 8-bit textures after de-gamma-ing them in Photoshop, as this will result in banding in the shadows. If you de-gamma in Photoshop, you must save as EXR (16-bit TIFF at the very least).
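The banding warning can be verified numerically: pushing every 8-bit code through a 2.2 de-gamma and re-quantizing to 8 bits collapses whole runs of shadow levels onto the same code (a sketch):

```python
gamma = 2.2

# Map every 8-bit input code through de-gamma and back to an 8-bit code.
out_codes = [round((v / 255.0) ** gamma * 255) for v in range(256)]

# In the shadows, many distinct inputs collapse onto a single output code:
zeros = sum(1 for c in out_codes if c == 0)
print(zeros, "input levels collapse to output code 0")
print(len(set(out_codes)), "distinct codes survive out of 256")
```

Float formats like EXR sidestep this because the quantization step never happens.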


Guys, thank you for your replies but here is my dilemma.

I would like to do something as easy as possible.

I’m creating a short movie.
All my shaders will be mia_material_x except a few.
Some will have a texture file attached to them (e.g. wood, walls, etc.).

Some shots are inside and some shots are outside, for which I will use Physical Sun and Sky.

Now, what is the easiest way for me to do shading, lighting and texturing?

My compositor friend might tweak some of the final images, and he says that all he needs are things like Z-depth, alpha, etc.

What would you advise me, given that I can't get my head around the linear thing because everyone says something different, please?

Also, is Maya 2011's colour management of any use, or are there good, easier tools in Maya 2012 that would make it worth upgrading?


Okay, here’s the most straightforward way to implement a linear workflow in Maya 2011. This post is long but the work required is minimal.

If you’re working with regular textures (downloaded from the internet, taken with a digital camera, etc.), they need to be gamma corrected. This is just a mathematical operation to convert the color space of your images from sRGB to linear. Why? Because Maya is trying to render a linear image and you want to composite in linear space for myriad reasons. If you’re feeding it sRGB textures, you need to compensate for that or you’re not using linear workflow. (Note that Bitter’s suggestion above involves using textures that are already linear, by saving them as .exr. If you do that, don’t gamma correct them. We’re talking .tiff and .tga and png here.)

All you have to do is insert a gamma node between the file texture and the material input (for every material). Then set all three values to .454. Annoying? Totally. Maya 2011 has built-in color management that’s supposed to handle this automatically (Color Management in Render Settings, Common tab) but the consensus on CGTalk is that it doesn’t function correctly. Avoid turning it on. Instead, use Redsand’s handy script that will automatically insert gamma nodes for your materials:


Keep in mind that the script above will also gamma correct color swatches because otherwise when you render, your color swatch values for lights, etc. will be lighter and less saturated. If this happens with a swatch or a file texture, check to make sure it has a gamma node – though with swatches, you can just eyeball the value if you want. But if you’re using Redsand’s script, you should be good.

Render to .exr, half (this will save you space over full float, but it’s your choice). If you render to 8-bit .jpg, I will throw Cheetos at you.

By now you should have Maya rendering a linear image with all the color swatches and textures in the image having been fed to your renderer in linear space. Great! Unfortunately, we use sRGB for a reason and you will want the image to end up in sRGB after compositing*. You need a way to preview what your linear image looks like in sRGB space. The easiest method is as follows:

In your Render View, go to Display > Color Management. Change Image Color Profile to Linear and keep Display Color Profile set to sRGB (gamma corrected). This tells the Render View that you’re feeding it a linear image but want to view it in sRGB space. It does not actually change the image file, which is good because you need it in linear for compositing.

I have no idea what you use for comp so I can’t provide a step-by-step here. I believe Nuke expects linear image sequences, for example, so you’ll be working in linear space by default. With After Effects, you need to specifically tell it to work in linear space. Here’s a quick run-down for AE CS4:

File > Project Settings > Color Settings section

Set Depth to 32 bits per channel (float). You can set this to 16 for speed, but be aware that glows and blooms will composite differently.

Set Working Space to sRGB.

Notice that the Linearize Working Space checkbox is now available? Turn it on. Now AE will do its compositing math in linear space but show you a preview of what your comp looks like in your Working Space (in this case, sRGB). Hooray!

The last step is making sure that your Output Profile is set correctly when you add your project to the Render Queue. Go to Output Module > Color Management and make sure the Output profile is set to your Working Space profile (in this case, sRGB).

Bam. Go make yourself some tea and wait for the ‘ding’.

*You don’t always want sRGB – depending on what you’re doing, you might aim for a variety of different color spaces including a custom film LUT, but this explanation is simplified for people making images or movies that will be shown on a computer screen. People write books about this stuff.


  1. Gamma correct your textures and color swatches, assuming your textures are sRGB (they probably are).
  2. Use this script to accomplish step 1.
  3. Use a good output file format for your renders, like .exr.
  4. Tell Maya’s Render View to preview your linear image in sRGB space.
  5. Set up your compositing package to do the same.
  6. Thank MasterZap and djx and everyone else on CGTalk for beating this into our brains in the nicest possible way.


Dilemma for sure, because once you understand and begin using LWF many things actually become easier, especially lighting a scene. Unfortunately many struggle with the concept.

If you’re in a hurry to get started and don’t want to bother with it, then don’t. I’m 20 years into my career and for the first 15 I’d never heard of LWF. Trust me when I say, at some point you are going to have to get it though.

I recently spoke with a seasoned Flame operator, whose work I'd admired, and he asked me why the images I was giving him were so dark. I told him they were linear and he said, after a pause, "Why can't everybody just stick to sRGB?" and then, "Don't worry, cos they're 16-bit float so I'll just add some gain". I said, "You mean gamma?" He said, "Yeah, whatever. As long as they look right". I was horrified. All my hard work getting my renders to look great was going to be for nothing.

It's your film, so you can do it the right way or the wrong way, whatever that is and whatever way works for you; "as long as they look right".



Thank you very much Jipe, it does sound easy especially with the script.

Now, do I have the correct settings here for frame buffer and render? (see images)


I would avoid changing the Framebuffer gamma (just set it back to 1 and ignore it). You are accomplishing the same effect by inserting gamma nodes for color textures and switching the Render View to display linear as sRGB.

One additional note: if you’re using Physical Sun & Sky, I believe that will automatically attach a lens shader to your camera with gamma correction (exposure 2.2, I think). This is redundant if you’re using the method I described, so either change the exposure value back to 1 or delete the lens shader.
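This kind of redundancy is easy to see numerically: applying the 2.2 display correction twice produces exactly the "washed out" look discussed throughout this thread (a sketch):

```python
def encode(v, gamma=2.2):
    """One display gamma correction (exponent 1/2.2)."""
    return v ** (1.0 / gamma)

linear = 0.18                  # linear-light middle grey
once = encode(linear)          # correct for display: ~0.46
twice = encode(once)           # redundant second correction: ~0.70, washed out

print(round(once, 3), round(twice, 3))
```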

I know, right? So many buttons to push. But it’s worth it, in my opinion. More realistic light falloff and reflections and GI propagation, improved motion blur… MasterZap has a long list of advantages in class 1 of his mental ray course over at FXPHD, and it sold me on the idea pretty quickly.


OMG, made me laugh.

It’s true honestly. When I was working on 2012 we worked very hard to operate in the correct colorspace, which for film can be dictated by a film LUT and not sRGB. After all our hard work it goes to the production company and their colourist.

It was like we rendered day-for-night.

But as long as you follow the LWF, you end up with a product that is most easily corrected/tuned/changed to suit your needs. No point in fighting yourself when a few simple steps can make your life in post a LOT easier. To say nothing of materials that behave more predictably: doubling your light intensity shouldn't blow out materials, it should simply double the brightness.


Hi guys,

  first of all many thanks to all of you for your help and your efforts.

  Ok, first a word or two about my original post. Let me put it like this: this should [b]NOT[/b] be a thread about "what is LWF in general and how can I set up Maya for LWF?"!

  [b]If you don't know how to set up Maya for LWF, please read the tutorials/blogs from Andrew Weidenhammer, Zeth Willie, Fredrik Averpil or the one from djx![/b]

  There are, in general, three different ways to do that. I prefer, and also recommend, working with gamma_nodes (following the blogs from Andrew, Zeth, Fredrik and djx). You can also do it as Bitter explained, by linearizing your textures in PS. I don't have any experience with this method, but I think it may cause trouble with a heavy amount of memory (for example, a huge number of Arroway textures etc.). You also have to manually correct your color swatches in Maya! Do I have that right?

  [b] If you don’t know the basics about LWF in theory or don’t know why gamma correction in general is needed, please read the gamma-FAQ from Charles Poynton [/b][[b]http://www.poynton.com/[/b]](http://www.poynton.com/)

  [b]Please bring to mind: [/b]

  ·         Physical linearity is not the same as what our eyes see!

  ·         Gamma correction is not only for correcting the non-uniformity of our eyes! That would be only half the story!!

  ·         There are, in general, three different power-law functions with different exponents (gamma): the perceptual response, the technical characteristics of monitors, and corrections (gamma corrections).

“ I disagree. I think you will benefit from understanding and using a LWF no matter what you are doing”

  Totally right!


  I totally agree with djx that LWF is very helpful for all CG artists. It is the route to better and, especially, more physically correct renderings; no matter whether you just output in 8-bit (with an exposure node at 2.2) or want to make linear 32-bit outputs for compositing!

  Please read the Tutorials from the above-mentioned guys to set up Maya for LWF! I recommend: DON’T use the frame-buffer method!


  I hope some of you can help me with my above-mentioned questions. LWF in general is not my problem; I mainly have problems understanding what's going on with my HDRIs. Please take a look at my first post and my attached files.

  Thanks to all of you! Nice community!

  Best regards



Ok guys,

the script and the advice here really helped. There is one more issue I don’t know how to solve.

See the picture below

1st - the image rendered in the Render View, displayed as linear (dark, as it should be, right?)
2nd - the image rendered as a 32-bit float .exr and opened in Photoshop (so, the correct final image, right?)
3rd - the same image as the 2nd, but opened in Preview, so it shows the image as it is. Why is it overblown with light etc. when the Render View shows the correct linear image?

Also notice (something I don't understand and don't know how to solve) that the 1st image has the correct Physical Sun and Sky background, whereas the 2nd and 3rd have nothing, so it's completely transparent. Why is that, and how can I fix it? No settings were changed between those images. Everything is the same frame (except the snow is off in the Render View), but the Render View result is completely different from the batch render.

Any ideas? Also, if the Render View is what it is, what is the best way to apply the sky as a background in Maya? Any simple solution?

Thanks in advance, it looks like I’m getting somewhere with this :slight_smile:


For OpenEXR in Photoshop, a plugin is required!



I don't have a problem opening the file. Photoshop opens it without any issues, and the same goes for the Preview app.
The issue is the inconsistency and the lack of a background :slight_smile:


PS does not work correctly with OpenEXR files (missing channels). So I recommend again: use the plugin! You will see :slight_smile:


How do I install it? I've downloaded the OpenEXR Photoshop plugin for OS X,
but it's not an installation file.
Also, would you know what is causing the missing-background issue?


Also, I'm missing the point here. I don't use Photoshop at all. I only opened the file in three programs (Render View, Preview and Photoshop) to show the differences. If I install the plugin, how is that going to help me with my LWF problem? Am I missing something important here?

Thank you in advance


Photoshop handles EXR ok, but it’s not the best tool.

You lose the background because Photoshop knocks out anything without an alpha.