I am working on banks of fog, and I am currently using a static grid-based fluid container (no density emission) to achieve the look and spread of the moisture in the air. To get an idea of what I am doing, simply import a preset like the "afternoon clouds" from the Visor.
Now, because we work at large scales in our pipeline (1 Maya unit = 1 cm), I need to scale the container to spread across the environment/shot. The problem I am having is that when I scale the container up to our working scene scale, the RGB renders fine, but in the alpha channel any "empty areas" near the outer limits of the container are slightly filled in. This becomes more pronounced as the scale increases further.
If you scale the container back to the default scale from the Visor example (4,4,4 as per the "afternoon clouds" preset), the empty areas render as expected, and the RGB matches the alpha. It seems the more you scale up the fluid, the more it fills in the empty (black) values in the alpha. I also tried changing the container size instead of scaling the transform, and I can replicate the same result in the alpha.
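For what it's worth, my guess at why scale matters: if the renderer accumulates opacity roughly Beer-Lambert style along each ray, then even a tiny residual density in the "empty" voxels builds up over a longer path through a bigger container. A minimal Python sketch of that idea (the density and distance numbers are made up for illustration, and `accumulated_alpha` is my own toy function, not Maya's actual shader):

```python
import math

def accumulated_alpha(density, path_length):
    """Beer-Lambert-style opacity accumulated along a ray.

    density: residual density per unit length in the "empty" region
    path_length: distance the ray travels through that region
    """
    return 1.0 - math.exp(-density * path_length)

# A tiny residual density is effectively invisible at the preset scale...
small = accumulated_alpha(0.001, 4.0)     # container at scale 4,4,4
# ...but becomes noticeable once the same container spans hundreds of units.
large = accumulated_alpha(0.001, 400.0)

print(round(small, 4), round(large, 4))  # -> 0.004 0.3297
```

If that is what is going on, it would explain why both scaling the transform and enlarging the container size produce the same alpha fill-in.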
Is there any way around this issue at the current scale, perhaps a render setting I should change, or am I simply out of luck? Our pipeline is not really set up for scaling assets down, so a fix in the render settings would be the better solution for me in this case.
Also, while I am on the subject of fluids: do many of you scale down your scenes when working with dynamic Maya fluids? As with the issue above, I am required to work with large fluid container sizes (size, not transform scale), where the fluid size values are in the hundreds and the density emission rates are quite high as well. It seems to tax the machine more, which would make sense given the values being pumped through on any given frame. I just wanted to get another opinion on how other studios handle scene scale in their fluid workflows.
Thanks for the help everyone.