This is actually completely “normal”, although unexpected for most people.
As a matter of fact, this effect happens in real life with real (digital) cameras too.
Here’s the thing: anti-aliasing is done by blending several subpixel samples.
Imagine a window, whose edge covers one quarter (1/4) of some pixel.
If we are rendering in low dynamic range (i.e. not to floating point), mental ray clips each ray’s color to 1.0 (i.e. “white”) before filtering.
This means that no matter HOW bright the window was (1.0, 10.0, 100.0 or 1000000.0) the ray gets clipped to 1.0 (white) and THEN blended with its neighbours.
So for our example of a window that covers 1/4 of a pixel, the result is a 25% gray pixel, which is exactly what you would expect for an anti-aliased pixel that is 25% covered by something white.
However, in floating point rendering there is no clipping happening.
So if the intensity of the window was 10.0, and it covers 25% of the pixel, the filtered result still has a total intensity of 2.5 … whiter than white (but still rendered on your screen as a “white” pixel!). That is no less white than the “white” pixel that results from a pixel fully covered by the window. Even a pixel only 10% covered by the window still ends up “white”. Yet the wall pixel beside it (covered 0% by the window) is dark. I.e. any pixel that even “touches” the window comes out full white. You get aliasing!
And this is completely normal. As a matter of fact, if you change the exposure in your HDR viewer you will see that when you get to the point where the window stops blowing out, your anti-aliasing comes back!
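To make the arithmetic concrete, here is a tiny stand-alone C program (a toy box filter over four subpixel samples, not mental ray’s actual sampling code) that reproduces the numbers above: clipping each sample to 1.0 before blending gives the expected 0.25 gray, while blending the raw floating point samples gives 2.5, which still displays as “white” until you expose the image down far enough for the window to drop below 1.0.

```c
/* A toy box filter over 4 subpixel samples of a pixel that is 25%
 * covered by a bright window (intensity 10.0) and 75% dark wall (0.0).
 * Illustrative only -- this is not mental ray's actual sampling code. */
#include <stdio.h>

static double filter(const double *samples, int n, int clip_first)
{
    double sum = 0.0;
    int i;
    for (i = 0; i < n; i++) {
        double s = samples[i];
        if (clip_first && s > 1.0)  /* LDR: clip each ray to 1.0 ... */
            s = 1.0;
        sum += s;                   /* ... and THEN blend with its neighbours */
    }
    return sum / n;
}

int main(void)
{
    double samples[4] = { 10.0, 0.0, 0.0, 0.0 };  /* one sample hits the window */
    int i;

    printf("LDR, clip then filter:   %.3f\n", filter(samples, 4, 1)); /* 0.250 */
    printf("float, filter (no clip): %.3f\n", filter(samples, 4, 0)); /* 2.500 -> displays as "white" */

    /* expose the float image down by 4 stops (divide by 16): the window is
       now 0.625, no longer blown out, and the anti-aliasing comes back */
    for (i = 0; i < 4; i++)
        samples[i] /= 16.0;
    printf("float, exposed down:     %.3f\n", filter(samples, 4, 0)); /* 0.156 */

    return 0;
}
```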
And yes, this would also happen to a superbright pixel in a real digital camera. However, intra-pixel bleed and glare tend to cover it up, though it can sometimes be evident in a photographed image.
So… how do you get around it?
Well, there are two basic fixes:
a) Do what reality would do, and cover up the effect with glare.
In a real optical system, such superbrights would spill onto neighbouring pixels due to lateral scattering in the imaging surface (film, CCD, or the retina of the eye), scattering inside the optical path (lenses, or the optical goo inside the eye) and - to a much lesser extent, and much less than people think - scattering in the atmosphere.
There are numerous shaders and tools to do this, from “glow” filters in Photoshop, to the Lume “Glare” shader that ships with 3ds Max, to Maya Glow, etc. etc., and they all “do the job”… pick one. Use it. Enjoy.
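If you want to see what such a glare pass boils down to, here is a minimal 1-D sketch in C (my own toy illustration, not how any of the shaders above are actually implemented): it takes the energy above 1.0 of each pixel and scatters a small fraction of it onto the neighbours, which is exactly what softens the hard edge around a superbright window.

```c
/* A toy 1-D "glare" pass, purely for illustration: take the energy above
 * 1.0 of each pixel and scatter a small fraction of it onto the neighbours. */
#include <stdio.h>

#define W 9

int main(void)
{
    /* a superbright "window" pixel surrounded by dark "wall" pixels */
    double img[W] = { 0, 0, 0, 0, 100.0, 0, 0, 0, 0 };
    double out[W];
    double kernel[3] = { 0.25, 0.5, 0.25 };  /* tiny blur kernel */
    double spill = 0.02;                     /* fraction of the excess that scatters */
    int i, k;

    for (i = 0; i < W; i++) {
        out[i] = img[i] > 1.0 ? 1.0 : img[i];  /* base image, clipped for display */
        for (k = -1; k <= 1; k++) {
            int j = i + k;
            double excess;
            if (j < 0 || j >= W)
                continue;
            excess = img[j] > 1.0 ? img[j] - 1.0 : 0.0;  /* "whiter than white" energy */
            out[i] += spill * kernel[k + 1] * excess;    /* ...bleeds onto neighbours */
        }
        printf("%.2f ", out[i]);   /* prints a soft halo around the bright pixel */
    }
    printf("\n");
    return 0;
}
```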
b) Intentionally “pre-clamp” the rays before filtering with a lens shader.
This is the “best” way in the sense that it gives you your anti-aliasing back. However, be aware that this pretty much precludes you from doing any major exposure changes in post production, so you must already know that the exposure you have in your image is “ok” for how you plan to use it. I.e. the operation you do will pretty much keep blown-out things pegged as “white”, and you will kill any detail in the overexposed region by doing so.
The problem with this is that doing the exposure in post is one of the main reasons for rendering to float in the first place! So it’s a double-edged sword.
Anyway, there are multiple shaders to do this: One is the simple mib_lens_clamp shader which… well… just clamps. Duh.
A “nicer” and “more gentle” method is to use the mia_exposure_simple shader, which has a bit of control allowing you to do a “gentle clamp” (with the help of the “compression” feature), or the new mia_exposure_photographic (which does its “gentle clamping” with the help of the “highlights_burn” parameter). In either case, use them in a “gamma=1” mode to keep the result in scene-referred linear space (but “gently clamped”), since - I’m hoping - you are applying your display gamma in some later display step.
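For the curious, this is roughly what such a “gentle clamp” lens shader looks like in the mental ray C shader API. The soft_clamp() curve below and the “knee” parameter are my own illustrative choices, NOT the actual code of mib_lens_clamp, mia_exposure_simple or mia_exposure_photographic; the point is simply that a lens shader gets to massage each eye ray’s color before the pixel filter ever sees it.

```c
/* soft_clamp_lens.c -- illustrative sketch only; the "knee" parameter and
 * the compression curve are made up for this example. */
#include "shader.h"

struct soft_clamp_lens {
    miScalar knee;            /* values below this pass through untouched */
};

/* compress values above the knee so they approach (but never exceed) 1.0 */
static miScalar soft_clamp(miScalar x, miScalar knee)
{
    miScalar range;
    if (x <= knee)
        return x;
    range = 1.0f - knee;
    return knee + range * (1.0f - range / (x - knee + range));
}

DLLEXPORT int soft_clamp_lens_version(void) { return 1; }

DLLEXPORT miBoolean soft_clamp_lens(
    miColor                 *result,
    miState                 *state,
    struct soft_clamp_lens  *paras)
{
    miScalar knee = *mi_eval_scalar(&paras->knee);

    /* trace the eye ray exactly as the camera would */
    mi_trace_eye(result, state, &state->org, &state->dir);

    /* gently squash superbright values BEFORE the pixel filter blends this
       sample with its neighbours, so the anti-aliasing works again */
    result->r = soft_clamp(result->r, knee);
    result->g = soft_clamp(result->g, knee);
    result->b = soft_clamp(result->b, knee);

    return miTRUE;
}
```

Compile it into a shader library, declare it in a .mi file, and attach it as a lens shader on the camera; the pixel filter then blends the already-compressed samples and your edges anti-alias again.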
/Z