Working in a team environment, I’m constantly updating a series of XRefs (object/material changes) and asking the larger team to reload those XRefs multiple times each day.
The project I’m on is Redshift-based and uses mostly custom node-based shaders, but also half a dozen Substance shaders across various shots.
As the weeks have progressed, the project files have become dramatically heavier to work with, animate, open, and save. We assumed they were just heavy files until I went digging for the cause yesterday.
Checking the Substance asset folder in a colleague’s file, it appears that each time an XRef was reloaded, a new copy of the Substance shader came with it. That resulted in over 100 duplicates of the same Substance, not to mention a Substance cache folder that had grown to around 80 GB when it should perhaps have been 100 MB.
Using ‘Remove Unused’ or ‘Remove Duplicates’ on the Substance asset list kills all Substances, including the ones actually in use.
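Since the built-in cleanup commands are destructive, one safer first step is just auditing the cache on disk. Here is a minimal sketch (plain Python, outside of C4D) that groups files in a cache folder by content hash so you can see how many true duplicates exist before deleting anything. The folder path is an assumption; point it at your project’s actual Substance cache.

```python
# Sketch: find byte-identical duplicate files in a cache folder.
# Assumes nothing about the Substance cache layout -- it just hashes
# every file and groups files whose contents are identical.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(cache_dir):
    """Return {sha256_hex: [paths]} for hashes shared by 2+ files."""
    groups = defaultdict(list)
    for path in Path(cache_dir).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Only hashes with more than one file are true duplicates.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    # Hypothetical path -- replace with your project's Substance cache.
    for digest, paths in find_duplicates("tex/substance_cache").items():
        print(f"{len(paths)} copies of {digest[:12]}:")
        for p in paths:
            print(f"  {p}")
```

From the duplicate groups you can total up the wasted space and decide which copies are safe to remove manually, rather than trusting the ‘Remove Duplicates’ command.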
So our workaround is to bake the Substances down individually and make sure we’re using texture-driven materials only.
What I question, though, is how and why on earth this happens, and whether anybody else has experienced something like this with a similar workflow.
It seems like a major flaw in C4D, or perhaps Redshift, that a Substance is reloaded (and duplicated) each time an XRef is refreshed.