whether arnold is usable at all on a single pc?.. maybe for a single gpu frame… maybe not for
gpu animation (as you cannot denoise it with noice)
GO TO THE BOTTOM POST IF YOUR GPU IS CRASHING: set TdrDelay=60 and TdrDdiDelay=60 as per the unreal doc.
the windows defaults are TdrDelay=2 and TdrDdiDelay=5 if you need to reset them, though substance
and some wacom/photoshop setups recommend different settings as well.
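in case it's useful, here is a minimal python sketch of setting those registry values (the key path and value names are the standard windows TDR ones; the 60-second values are just what the unreal doc suggests). run it from an elevated python and reboot afterwards:

# sketch: raise the windows TDR timeouts so long gpu kernels are not killed by the driver
# must be run as administrator; reboot for the change to take effect
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    # windows defaults are 2 and 5 seconds; 60 is the value from the unreal doc
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 60)
    winreg.SetValueEx(key, "TdrDdiDelay", 0, winreg.REG_DWORD, 60)

print("TDR values written, reboot to apply")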
outside a vfx company with a render farm, i doubt it. the noise issues, especially on
interiors or anything lit only by emission shaders, don't make arnold unusable exactly,
you just won't get a render out, because it would take weeks to sample an animation at a high
enough aa, or even to run noice denoise, which is very slow per frame.
and noice ignores the usual aovs (would it be quicker as standalone noice outside maya? or as a
standalone denoise in nuke or after effects?)… this was cpu only.
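if you want to try standalone noice outside maya, here is a rough python sketch of calling the noice executable that ships in the mtoa bin folder; the install path and file names below are placeholders, and only the basic input/output flags are shown, check noice -h on your install for the full option list:

# sketch: run the standalone noice denoiser on an already-rendered exr, outside maya
# NOICE path and the exr file names are placeholders for your own setup
import subprocess

NOICE = r"C:\Program Files\Autodesk\Arnold\maya2020\bin\noice.exe"  # assumed install path

subprocess.run([
    NOICE,
    "-i", "render.0001.exr",    # input exr, must contain the variance aovs noice needs
    "-o", "denoised.0001.exr",  # denoised output
], check=True)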
blender seems to denoise faster, and it has auto tile/bucket size scripts to maximise speed; granted,
blender's id/cryptomatte pass and glass shaders are shit… does arnold have an auto bucket
size? not sure.
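as far as i know arnold only exposes a manual bucket size in the render settings system tab, no auto mode; a hedged maya python sketch (defaultArnoldRenderOptions.bucketSize is the usual MtoA attribute, the value is just something to experiment with):

# sketch: set arnold's bucket size by hand from maya's script editor (python tab)
import maya.cmds as cmds

# default is 64; try 32 or 128 and time a region render to see what your machine likes
cmds.setAttr("defaultArnoldRenderOptions.bucketSize", 64)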
anyway, back to arnold.
to boot, any time you use gpu aa above 3 it will crash. blender seems to mix gpu (any old gpu,
unlike arnold) and cpu far better, and does not crash…
NOW SOLVED by setting TdrDelay=60 and TdrDdiDelay=60 as per the unreal doc.
update: a partial solve in maya is pre-populating the gpu cache, which seemed to improve performance…
arnold needs a graph to show its vram usage to avoid crashes… why do we have to pre-populate a gpu
cache to make it work?
arnold always tries to use the cpu at 100%, which will crash your computer.
update, partial solution: you can set the number of render threads to one less than your system has;
do not use auto set (see the script editor snippet below).
BUT you cannot do this for arnold render sequence or noice denoise, they use all threads regardless,
ie they seem to have designed arnold to crash on a single average computer.
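for the interactive / single-frame case, a small script editor sketch of capping the threads (threads_autodetect and threads are the usual MtoA render-settings attributes; as noted above, render sequence and noice ignore this):

# sketch: cap arnold's cpu render threads from maya's script editor (python tab)
import multiprocessing
import maya.cmds as cmds

threads = max(1, multiprocessing.cpu_count() - 1)  # leave one core free for the OS
cmds.setAttr("defaultArnoldRenderOptions.threads_autodetect", 0)  # turn off auto set
cmds.setAttr("defaultArnoldRenderOptions.threads", threads)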
i guess if you have $4k to spend on a gpu it's not a problem, but i think arnold needs a faster,
less accurate ray trace mode for a single pc, or else carry on using blender cycles…
after more research i found that arnold for maya
cannot do a noice denoise on a gpu render sequence or batch sequence at present.
pre-populating the gpu cache did stop some gpu crashes, but in the end only doing this,
setting TdrDelay=60 and TdrDdiDelay=60 as per the unreal doc, fixed the crashing.
the noice error on a gpu render sequence was:
“Could not find AOV with source RGBA and filter variance_filter
Could not find variance for AOV “RGBA”, skipping denoise.”
ie it does not create a variance layer on gpu rendered sequences, so noice denoise fails.
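a quick way to confirm whether a rendered exr actually contains the variance aovs noice needs is to list its channels; a sketch using the OpenImageIO python bindings (assuming they are installed, the file name is a placeholder):

# sketch: list the channels in a rendered exr and look for the variance aovs noice wants
import OpenImageIO as oiio

img = oiio.ImageInput.open("render.0001.exr")  # placeholder file name
if img:
    print(img.spec().channelnames)  # a cpu render with denoising aovs enabled should show variance channels
    img.close()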
why is noice denoise vital for arnold? because it's the slowest, noisiest renderer on the market;
without it, renders are way too slow and too grainy, especially on interiors.
and why does the gullible maya operator have to spend 2 days learning this crap, when it
obviously should have shipped working in version 1 of arnold for maya.
ditto setting TdrDelay=60 and TdrDdiDelay=60 as per the unreal doc.
so currently anyone rendering on a single machine is better off using blender cycles.
(this is not a criticism of the devs, i couldn't code a renderer; it's more a comment on a… desk
operators working in companies need a viable, fast, noise-free animation solution from a renderer…
they don't want to spend days with gpu crashes (oh, you haven't got the right gpu card, driver update, maya b…s)
… in this case getting TdrDelay=60 and TdrDdiDelay=60 right, as per the unreal doc; unlike blender, where any old gpu will do
and the default TdrDelay and TdrDdiDelay are fine.
and then they find arnold cannot denoise anyway on a gpu rendered sequence… haha, dumb operator…
well, one experience like that and the boss uses redshift, octane, blender cycles, whatever.
it just sums up maya's current approach to its market)
ok, i will just have to accept that autodesk and nvidia are only interested in rtx cards for their software and drivers.
it is what it is… end of story…