I was debating which forum to post this in and opted for this one, since I’ll present the concept from a programming perspective.
I was thinking about the subject of overexposure in the context of a fully 3D-rendered scene (in particular, a scene rendered with a RenderMan-compliant renderer), and whether the traditional pipeline accounts for it. I’ve been reading about the Cineon file format (10-bit log) and how it provides extra headroom for overexposed pixels. However, I have yet to find anything that really addresses the concept of ‘bleeding’.
Visualize an interior shot exposed for the interior lighting, with a window showing an overexposed outdoor environment. Film will tend to bleed: the brightness of the overexposed exterior spills onto the window frame and the surrounding interior.
From an algorithmic standpoint, it seems fairly simple as a post-render step: scan the image for pixels that exceed the display capability of the final output device. For each pixel that meets that criterion, apply a filter (box, Gaussian, etc.) to spread the brightness outward. Keep iterating over the image until no pixels remain that need to spread (‘bleed’) their brightness.
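To make the idea concrete, here is a rough sketch of that post-render step in Python with NumPy/SciPy. It assumes a linear floating-point image where 1.0 is the white point of the output device; the function name and defaults are just illustrative, not anything standard:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bleed(image, white=1.0, sigma=2.0, max_iters=50):
    """Spread ('bleed') energy above the display white point into
    neighboring pixels, iterating until nothing exceeds `white`.

    `image` is a float array of shape (H, W) or (H, W, 3) in linear
    light; `white` is the brightest value the output device can show.
    """
    img = image.astype(np.float64)
    for _ in range(max_iters):
        # Energy the display cannot represent at each pixel.
        excess = np.maximum(img - white, 0.0)
        if not excess.any():
            break
        # Clamp the hot pixels, then redistribute the clipped
        # energy into their neighborhood with a Gaussian blur.
        img = np.minimum(img, white)
        if img.ndim == 3:
            # Blur spatially only, not across color channels.
            img += gaussian_filter(excess, sigma=(sigma, sigma, 0))
        else:
            img += gaussian_filter(excess, sigma=sigma)
    # Clamp whatever is left if the iteration cap was hit.
    return np.minimum(img, white)
```

Note the iteration cap: a very hot region can take many passes to diffuse below the white point, so in practice you’d probably bound the loop rather than insist on full convergence.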
Now, some questions: is the technique described above a standard step in compositing programs such as Shake or After Effects? If not, is it typically applied somewhere else in the pipeline, and if so, at what stage and with what software? Or is another technique used instead?