Usually, when I set up complex shaders, I'll do so in a clean, new scene, and I never have problems fine-tuning displacement that way. However, after I bring things back into the main scene and keep working, at some point (hours or days later) I'll go to change a displacement setting in the approximation editor and absolutely nothing changes on render, as if the settings were locked down.
Right now my issue is a scene that maxes out my RAM, and usage keeps rising; the render never even gets to final gather. I keep disconnecting displacement nodes from the shading groups and downsizing textures to the point where I know there should be hardly any memory footprint, and still the memory gets eaten up.
I was wondering if anyone has run into issues like these, or had any luck debugging this kind of thing.
It's really frustrating to go into the approx editor, turn subdivisions down to, say, 0 and 1, and still see the displacement presampling take exactly the same amount of time and memory as it did before with 5 and 7 subdivisions.
I'm using raytracing as the primary rendering algorithm with the BSP2 acceleration structure, in Maya 2012.
EDIT: I should also note that I usually use the "fine" displacement approximation method, since the help docs say it's flushable with mental ray. Are there any recommended settings or workflows?
I use a lot of procedurals for my displacement too.