Arnold: interior noise issues, noice denoise failing on GPU render sequences, and a GPU crash fix


#1

whether arnold is usable at all on a single pc?.. maybe for a single gpu frame… but maybe not
a gpu animation (as you cannot denoise it in noice)

GO TO THE BOTTOM POST IF YOUR GPU IS CRASHING: set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.
the windows defaults are TdrDelay 2 and TdrDdiDelay 5 if you need to reset, though substance
and some wacom/photoshop setups recommend different settings as well.

outside a vfx company with a render farm, i doubt it. the noise issues, especially on
interiors or anything lit just with emission shaders, don't make arnold strictly unusable;
you just won't get a render out, because it would take weeks to sample an animation at a high
enough aa, even using noice denoise, which is very slow per frame.
and noice ignores the usual aovs (would it be quicker as noice standalone outside maya? or a
standalone denoise in nuke or after effects?)… this was cpu only

blender seems to denoise faster, and it has bucket auto-size scripts to maximise speed; granted,
for blender its id cryptomatte pass and glass shaders are shit… does arnold have auto bucket
size? not sure.

anyway, back to arnold

to boot, any time you use gpu aa above 3 it will crash. blender seems to mix gpu (any old gpu,
unlike arnold) and cpu far better, and does not crash…
NOW SOLVED by setting TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc

update: a partial solve in maya is pre-populate gpu, which seemed to improve performance… arnold needs a
graph to show its vram usage to avoid crashes… why do we have to pre-populate a gpu to make
it work?

arnold always tries to use the cpu at 100%, which will crash your computer.

update: a partial solution is to set the used threads to one less than your system has (do not use
auto set; a sketch below),
BUT you cannot do this for an arnold render sequence or noice denoise, which use all threads regardless,
ie they seem to have designed arnold to crash on a single average computer.
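a minimal sketch of that thread override from maya python, assuming the usual MtoA attribute names (threads_autodetect and threads on defaultArnoldRenderOptions; check the System tab of the render settings if yours differ):

```python
# a sketch: cap arnold's cpu threads at one less than the machine has,
# so the desktop stays responsive. attribute names assume a recent MtoA.
# note: arnold render sequence and noice ignore this and use all threads.
import multiprocessing
import maya.cmds as cmds

threads = max(1, multiprocessing.cpu_count() - 1)

cmds.setAttr("defaultArnoldRenderOptions.threads_autodetect", 0)  # turn off auto set
cmds.setAttr("defaultArnoldRenderOptions.threads", threads)
```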

i guess if you have $4K to spend on a gpu it's not a problem, but i think arnold needs a faster,
less accurate ray trace method for a one-pc setup, or carry on using blender cycles…

after more research, i found that arnold for maya
cannot do a noice denoise on a gpu render sequence or batch sequence at present.
pre-populate gpu did stop some gpu crashes, but in the end only doing this:
set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc, fixed the crashing.

the noice error on a gpu render sequence was:
“Could not find AOV with source RGBA and filter variance_filter
Could not find variance for AOV “RGBA”, skipping denoise.”

ie it does not create a variance layer on gpu-rendered sequences, so noice denoise fails.
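for cpu renders, noice needs the denoising/variance aovs written into a merged multichannel exr. a minimal sketch of enabling that from script, assuming MtoA exposes the "Output Denoising AOVs" checkbox as the attribute below (verify the name in your version):

```python
# a sketch; assumes the "Output Denoising AOVs" checkbox maps to
# defaultArnoldRenderOptions.outputVarianceAOVs (check your MtoA version)
import maya.cmds as cmds

cmds.setAttr("defaultArnoldRenderOptions.outputVarianceAOVs", 1)
# render to a merged multichannel exr so noice can find the variance layers
```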

why is noice denoise vital for arnold? because it's the slowest, noisiest renderer on the market;
without it, renders are way too slow and too grainy, especially on interiors.

and why does the gullible maya operator have to spend 2 days learning this crap, when obviously
it should have shipped working in version 1 of arnold for maya?
ditto the fix of setting TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.

so currently anyone rendering on a single machine is better off using blender cycles.

(this is not a criticism of the devs, i couldn't code a renderer; it's more a comment on autodesk.
operators working in companies need a viable, fast, noise-free animation solution from a renderer… they
don't want to spend days with gpu crashes (oh, you haven't got the right gpu card, driver update, maya b…s)…
in this case the fix was setting TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc; unlike with blender,
where any old gpu will do, and which runs fine on the default TdrDelay and TdrDdiDelay.
and then you find arnold cannot denoise anyway on a gpu-rendered sequence… haha, dumb operator…
well, one experience like that and the boss uses redshift, octane, blender cycles, whatever.
it just sums up maya's current approach to its market.)

ok, will just have to accept autodesk and nvidia are only interested in rtx cards for their software and drivers.
it is what it is… end of story…


#2

testing on a simple vfx office machine.
added: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.
the windows defaults are TdrDelay 2 and TdrDdiDelay 5 if you need to reset, though substance
and some wacom/photoshop setups recommend different settings as well.

arnold
cpu: i7, 7 threads, 16 gb ram

scene: one emission shader on top of the columns, no other lights;
instanced geo, 20 columns with a car paint shader, and a floor

samples, cpu only: aa 6 2 2 2 2 2 (the shorthand is decoded in the sketch below)
time per frame: 13min 50s
very noisy, unusable default arnold render

then:
noice denoise cleanup, an additional 5 mins per frame; now usable,
but all the additional aov passes are lost… total clean render is 18min 50s per frame
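for anyone decoding the sample shorthand: "aa 6 2 2 2 2 2" is camera (AA), diffuse, specular, transmission, sss and volume samples. a sketch of setting them from script, using the standard MtoA attribute names:

```python
# a sketch: set the "aa 6 2 2 2 2 2" sample values on the arnold
# render settings node (standard MtoA attribute names)
import maya.cmds as cmds

samples = {
    "AASamples": 6,              # camera (AA) samples
    "GIDiffuseSamples": 2,       # diffuse
    "GISpecularSamples": 2,      # specular
    "GITransmissionSamples": 2,  # transmission
    "GISssSamples": 2,           # sss
    "GIVolumeSamples": 2,        # volume
}
for attr, value in samples.items():
    cmds.setAttr("defaultArnoldRenderOptions." + attr, value)
```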

gpu (gtx 1660): it can render, but arnold cannot use noice denoise on a gpu render.
it renders faster, actually not much faster:
gpu aa 24 2 2 2 2 2
takes around 7.5 mins per frame; the gpu result is still too noisy

gpu aa 48 2 2 2 2 2
takes around 30.43 mins per frame; the gpu result is still too noisy,
and is still maybe 25% worse than a noice-denoised frame,
and you cannot use noice denoise on a gpu render as there is no variance layer

please see the attached, more accurate render settings and times:
https://www.flickr.com/photos/93465359@N03/52264292626

i do not know what the benchmark time for an animation should be, but say it's 4 nights of rendering, mon to thurs,
8 hours per night: that's a maximum of 32 hours a week. a 3 min animation at 24fps is 4320 frames; at 19 mins per frame
that's 1368 hours, /32 hours is 42.75 weeks… but maybe i am crap at maths, who knows (checked in the sketch below).
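the arithmetic does check out; a quick back-of-envelope sketch (assuming 24 fps and the frame time quoted above):

```python
# back-of-envelope render budget check (assumes 24 fps)
FPS = 24
ANIM_MINUTES = 3
HOURS_PER_WEEK = 32       # 4 nights x 8 hours
MINUTES_PER_FRAME = 19    # cpu render + noice, from the test above

frames = ANIM_MINUTES * 60 * FPS               # 4320 frames
total_hours = frames * MINUTES_PER_FRAME / 60  # 1368 hours
weeks = total_hours / HOURS_PER_WEEK           # 42.75 render-weeks

print(f"{frames} frames -> {total_hours:.0f} h -> {weeks:.2f} weeks")
```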

that's probably why people are desperate to use unreal;
to be honest, at this task unreal would look better.


#3

tried to avoid the arnold emission shader route to light the scene,
so as to reduce noise, by using 10 substitute area lights and geo masks.
this is obviously slower: making all the geo, and trying to fit a square light into curvy geo
without intersecting the geo while stopping bad reflections, good luck with that.
then, arnold cannot instance an area light? not sure; it said it can't.
anyway, a cpu render at sample aa 6 2 2 2 2 2 and 1920x1080 took 48 mins and was clean, no noice denoise needed,
but at 48 mins per frame it's unusable, WAY TOO SLOW for an animation.

then tried gpu aa 12 2 2 2 2 2 on the gtx 1660; it crashed maya, completely wiping out the program.
the fix now: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.

conclusions: pity the maya operator forced to use arnold on interior animations
in their local vfx house; too noisy, too slow.

please see the updated image:
https://www.flickr.com/photos/93465359@N03/52270685903

as i say, i'm not criticizing arnold's use in big vfx houses on large scenes with massive render farms;
i personally just could not stand using it in our local small vfx companies for the reasons outlined above.
it would bugger any operator up, and has done, big time, as regards its noise level.

actually, found this on polycount that may help others in the same boat;
it's a sci-fi dark corridor with the same issues:
https://polycount.com/discussion/210501/arnold-for-3ds-max-noise-and-general-rendering-questions

is it good news or not? er, no. the arnold renderer really needs that extra ray octane and vray have;
without it, noice denoise is a must-have for arnold, with all the games that entails: must use render sequence, cannot use gpu, and lose all your other render passes for a denoise pass… lucky it's just testing; i would not touch it
on a real job without a big render farm.
it will drive people mad


#4

looked at the polycount discussion.
apologies for the crap test scene, but i'm only really looking at noise…

added:
IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.
the windows defaults are TdrDelay 2 and TdrDdiDelay 5 if you need to reset, though substance
and some wacom/photoshop setups recommend different settings as well.

so i thought i would compare.
top: 4 area light gobos and the old emission set to 1, render sample aa 6 2 2 2 2 2, all aovs clamped.
i7 7-thread 16gb ram cpu, frame 1920x1080: noise free, but at 21 min per frame the render is too slow to be usable.

same settings on gpu (gtx 1660, 6gb vram): crashed, so not usable.
the fix now: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.


mid: 2 mesh lights at sample 1, render sample aa 6 2 2 2 2 2, all aovs clamped.
i7 7-thread 16gb ram cpu, frame 1920x1080: noise free, but at 23 min per frame the render is too slow to be usable.

same settings on gpu (gtx 1660, 6gb vram): crashed, so not usable.
the fix now: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.


this one redid the shaders with car paint, and the spec of the walls and floor was increased to make it more useful;
had to increase the arnold render aa spec to 3, which added 5 mins of render time per frame.

mid-under: 2 mesh lights at sample 1, render sample aa 3 2 3 2 2 2, all aovs clamped.
i7 7-thread 16gb ram cpu, frame 1920x1080: noise free, but at 10.13 min per frame the render is still slightly too slow to be usable.

same settings on gpu (gtx 1660, 6gb vram): crashed, so not usable.
the fix now: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.


bottom: lit with an arnold emission shader only, render sample aa 6 2 2 2 2 2, all aovs clamped.
i7 7-thread 16gb ram cpu, frame 1920x1080: BAD NOISE, and at 13.5 min per frame the render is too slow to be usable.

same settings on gpu (gtx 1660, 6gb vram): crashed, so not usable.
the fix now: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.


conclusion: still not looking at arnold for fast, noise-free interiors that you could get out in a week on 4 nights of renders on a low-end system… it can't do it with those render times… too slow, still way over 5 mins per frame.
and arnold gpu is hopeless compared to blender, which can mix gpu and cpu and won't crash and completely
disable itself the way arnold maya does… talking a low-end gpu, gtx 1660 6gb vram.


added 2 blender renders… to prove what we should expect out of the box on any gpu.

exported the maya scene to default, out-of-the-box cycles.
obviously the shaders are not exactly the same; used only the emission shader, no other lights (noisy in arnold),
and the same nvidia driver as arnold. vram usage in blender was 1.8 gb; maya arnold gpu crashed at 2.1 gb vram.
the blender render was gpu only at sample max 6000 and took 1m 10s per frame. HURRAH, A USABLE TIME.
i had all the blender viewports (layout and shading) realtime as well, could duplicate the emission geo,
AND could move and scale the bg box that caused arnold to crash… this is to show the issue is arnold's gpu
implementation alone, rather than nvidia drivers and the maya version being to blame, as claimed by people saying their rtx 2070 or rtx 3080 card doesn't work with the latest drivers and maya version; look at the reddit complaints…

please see
https://www.flickr.com/photos/93465359@N03/52274598907

WITH ARNOLD, NOT BLENDER:
the gpu crashes all relate to a default maya bg cube that enclosed the scene as bg walls; it uses the same shader as the floor. if i delete it, the gpu render works and says it's using 3096 mb of vram; if i turn it back on, arnold crashes. if i hide it and add another cube it's ok for a while, then i resize that new cube and it crashes again…
this is what i cannot stand about arnold: the instability of its gpu implementation compared with, say, blender, which is rock solid. if blender runs out of gpu it mixes back in the cpu; it doesn't crash and wipe out maya.

i tried upping the tdr delay in windows from 2 to 7 to 20; maya arnold gpu still crashed, while blender never crashed once using the gpu… well, boss man, if you ask me to do this test on a low-end machine: use blender, no competition.

finally solved using the fix: IF YOUR GPU IS CRASHING, set TdrDelay to 60 and TdrDdiDelay to 60 as per the unreal doc.


#5

finally found a fix for a gtx 1660 6gb vram crashing maya arnold gpu renders:
set TdrDelay to 60 and TdrDdiDelay to 60 as per this unreal doc.
the windows defaults are TdrDelay 2 and TdrDdiDelay 5 if you need to reset, though substance
and some wacom/photoshop setups recommend different settings as well.

https://docs.unrealengine.com/5.0/en-US/how-to-fix-a-gpu-driver-crash-when-using-unreal-engine/
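a minimal sketch of that registry change in python (TdrDelay and TdrDdiDelay are the documented TDR DWORDs under GraphicsDrivers; run as administrator, back up your registry first, and reboot afterwards; the windows defaults above are what to restore if you need to revert):

```python
# sets the windows TDR (timeout detection and recovery) delays to 60s,
# as per the unreal doc above. run as administrator; reboot to apply.
# windows defaults: TdrDelay = 2, TdrDdiDelay = 5.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 60)
    winreg.SetValueEx(key, "TdrDdiDelay", 0, winreg.REG_DWORD, 60)

print("TDR values set; reboot for them to take effect.")
```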

the gpu render is now similar to blender's, roughly 1m 10-20s on gpu at 6 aa with denoise (note: do not save pics in the arnold render view). however it uses 3gb of vram in maya arnold versus blender's 1.8gb, and blender has
better viewports running on gpu while rendering. also, the tdr fix causes weird corrupt viewports in photoshop, which blender would not cause, as it does not need the tdr fix.

so, an updated pic, with the maya arnold gpu render at the top, finally… still a dodgy test scene,
but it may help someone trying to learn gpu rendering in maya on low-end cards, older drivers and older versions of maya… and getting black-screen gpu crashes.

https://www.flickr.com/photos/93465359@N03/52278817332


#6

final thoughts, and bear in mind this is a test on a maya version a couple of years old, ditto arnold;
you get given whatever system the vfx company has available, and that's the bit they seem to forget.
in this case a low-end gtx 1660 6gb vram; as for nvidia driver support for the card, do they really care… no… it's not an rtx card.
all of this stands unless there have been dramatic improvements, ie using an rtx card, which i cannot test.

i found the main difference between blender, on its latest version (though earlier versions also work fine on gpu, unlike maya),
and a not-latest maya (whatever the vfx company wants you to work on)
to be the way the gpu uses your computer's resources. gpu in blender is a pleasant experience:
you are not scared to do anything you like to the scene; duplicating and moving objects, updating shaders, and switching to another program that is not blender all work seamlessly, and you can easily change the viewport gpu samples on the fly…

maya arnold, an older version admittedly, crashes unexpectedly if you update your normal or bump maps, say;
it is picky about ngons over 4 sides; and with a shaded viewport you may have to go to bbox mode (compare blender, which easily runs gpu in several viewports). if you try to model during a full gpu render, in blender it will be ok; in maya the noise is too bad and it's not realtime, which may be down to the tdr settings you had to change to make the arnold gpu work at all… maya arnold also uses another 1.3gb more vram than blender, not sure why.
trying to leave maya and work in another program is a slow process; it's using all the computer's resources.

for learning it might be ok; for vfx company use you will need the latest maya version, rtx cards and drivers
to get a similar experience to blender… at the bottom end.

what's really weird is that with blender you can have exactly the same scene open as an fbx as you had in maya,
put blender's layout and shading viewports on realtime gpu at max 32 samples with denoise, sculpt in realtime, and render gpu at max 4096 in 1min 10s at the same time, while you still have maya open in the bg; whereas maya and arnold alone could not give a
good realtime gpu modelling viewport even with maya running by itself. that's how good blender's viewports are on a gtx 1660 that nvidia barely supports now… the point is blender is an enjoyable experience; not a single crash happened, while maya crashed changing the normal map.

though it's clear maya is here to stay in vfx houses on big scenes… hoping rtx fixes its viewports.

anyway, back to that 3-minute animation with a gtx 1660 6gb vram gpu: rendering on gpu in arnold or blender cycles, say you get 2 mins per frame if you are lucky… you would need to be very lucky, but what the heck:
2×24×60×3 = 8640 mins, /60 = 144 hours, /32hr render week = 4.5 weeks. still not really feasible without a render farm.

to fit the 32-hour week of render time, under 26 secs per frame is needed (32h × 3600s / 4320 frames ≈ 26.7s); maybe that's a top-end rtx card.

still haven't looked at whether maya arnold or blender can denoise animations without flicker, ie the maya arnold noice issues.

with blender you could render on the machine while working; with maya arnold you have to buy a licence to batch render,
and using the machine while arnold renders is basically a no-go, it eats computer resources in a way blender does not.
plus both blender and maya have to hope it will fit in 6gb of vram, which is pretty unlikely.

so you are still looking at unreal or blender eevee, again with the issue of whether it will fit into vram.

and do nvidia leave the tdr delay at bad settings for older cards on their latest drivers to flog new rtx cards?
it seems odd otherwise.

CONCLUSION: maya is a long way from producing an animation as final rendered output in a week while you work on the same machine, unless you have, say, a 32gb rtx card and a separate arnold licence; and even then i'm not sure if you can denoise the arnold render at all… or whether you will get arnold denoised passes in any form… leave that to some vfx company that's got the free $ to test out such an expensive scenario…
till then blender and unreal rule… not that that helps the freelance maya operator going round vfx companies
being asked 'do you know redshift, octane, vray, clarisse'… why's that then?


#7

well, if you want an animation sample out of maya arnold with NO flickering noise
and a render time under 30 secs per frame, you need gpu render sequence at sample 3 aa
and optix denoise; but as optix denoise is not a temporal animation denoiser like noice, the flicker actually
gets slightly worse. setting the indirect clamp to 0.5 will reduce the contrast of indirect spec fireflies,
but reduces highlight range; increasing light samples will increase render times; making spec surfaces
rougher helps… BUT NONE OF THIS STOPS the maya arnold gpu render sequence temporal noise flicker.
all you can do in maya arnold is render on cpu only at aa 12, which even then might still flicker, and run noice
denoise at the end, which may need a pixel search of 18 at 5 mins per frame to reduce the noise flicker (a noice sketch after this paragraph).
ie you are looking at close to 20 mins per frame.
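a hedged sketch of driving the noice standalone (it ships alongside MtoA / the Arnold SDK) from python; the flag names (-i, -o, -ef, -pr, -sr) are as i remember them from the arnold docs, so run noice with no arguments to confirm on your install, and the file paths here are hypothetical:

```python
# a sketch of a temporal noice pass on one frame of a sequence; assumes
# noice is on PATH and the exrs are merged multichannel with variance aovs
# (cpu renders only, as above). flag names per the arnold docs as recalled
# here -- verify with your install. paths are hypothetical examples.
import subprocess

frame = 101
cmd = [
    "noice",
    "-ef", "2",    # extra frames each side, for temporal (anti-flicker) denoising
    "-pr", "3",    # patch radius: higher = stronger denoise, slower
    "-sr", "18",   # search radius, the "pixel search 18" mentioned above
    "-i", f"render/beauty.{frame:04d}.exr",   # hypothetical input frame
    "-o", f"denoised/beauty.{frame:04d}.exr", # hypothetical output frame
]
subprocess.run(cmd, check=True)
```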

THE UPSHOT IS THAT CURRENTLY MAYA ARNOLD is no good at all for rendering out student samples on
a single pc; without flickering it cannot be done, unless you are graduating the following year.

i think octane and redshift use the extra ray trace to avoid this maya arnold issue altogether,
but i cannot test… blender tutorials do cheats with other passes, offsetting them to stop flicker, and maybe blender 3.1 has a temporal optix denoise render that MAYBE is better, not tested yet.
there is also the Neat Video denoiser for nuke, after effects etc (expensive); not sure if it works, as it's not tested.

then you have the lower quality realtime OPTION: unreal and blender eevee.

ie for low render times there is a long way to go yet without a render farm or high-end rtx cards.

then we have the issue of render file sizes. testing on 3 secs of animation, single-pass exrs are 20mb per frame each… so 72 frames, 3 secs of animation, with an optix-denoised frame alongside each, is 2.88gb.
your 3 min animation is 172gb with no aovs, only denoise… you will be buying a lot of hard drives
using the exr format… i think the bottom end of vfx needs a compressed format that is NOT exr
to solve this (storage arithmetic in the sketch below).
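a quick storage sketch for those numbers (assuming 24 fps, 20 mb per exr, and two exrs per frame: the raw beauty plus the optix-denoised copy):

```python
# back-of-envelope exr storage check (assumes 24 fps, 20 MB per exr,
# and two exrs per frame: raw beauty + denoised copy)
FPS = 24
MB_PER_EXR = 20
EXRS_PER_FRAME = 2

def storage_gb(seconds):
    frames = seconds * FPS
    return frames * MB_PER_EXR * EXRS_PER_FRAME / 1000.0

print(storage_gb(3))       # 3 secs  -> ~2.88 GB
print(storage_gb(3 * 60))  # 3 mins  -> ~172.8 GB
```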

can you afford to go the maya arnold route at all as a student? for rendered animated samples
with no flicker, the short answer is no;
its use is really only as training software for large vfx companies… with massive render farms.

what a mess

actually it's way worse: arnold noice needs merged-aov exrs to work, ie each frame is 100mb when you include noice.
when you use a low arnold aa of say 1 on cpu, the minimum render time is 1 min, and noice can't give you a preview render; you just have to plug in a pixel search and depth, and boy is it slow, the same as an aa 6 cpu render, so kind of pointless.
ie to fix the noise on lower aas, maybe 15 mins per frame.

3 mins of renders = 432gb of storage (4320 frames × 100mb)


#8

While you're right about Maya being trash and Arnold being slowAF, rendering animations on a home computer has always, always been a challenge, and every time machines get faster, people feel the need to render bigger and with more special rendering features (glossy reflection / GI / SSS). Creating animations at full res, full quality, with GI, at home is like trying to be a roofer with no ladder.

Even at the studios where I work, which have farms, if it's Arnold I turn off as much raytracing as I can, and sometimes have nothing but raytrace shadows. No raytrace reflections and certainly no GI (and set diff and spec samples to 0); then it renders in a couple of minutes at 2k… about as fast as vray :smiley: It's a lot less realistic this way, depending on what you're doing, but if you know how to fake those you can still get a decent image.
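a sketch of that "samples to 0" setup in maya python, assuming the standard MtoA attribute names (this kills indirect diffuse and specular entirely, so you keep direct lighting and shadows only):

```python
# a sketch: turn off GI / glossy ray tracing in arnold by zeroing the
# indirect samples (and optionally the ray depths); standard MtoA names
import maya.cmds as cmds

cmds.setAttr("defaultArnoldRenderOptions.GIDiffuseSamples", 0)   # no indirect diffuse
cmds.setAttr("defaultArnoldRenderOptions.GISpecularSamples", 0)  # no glossy reflections
cmds.setAttr("defaultArnoldRenderOptions.GIDiffuseDepth", 0)
cmds.setAttr("defaultArnoldRenderOptions.GISpecularDepth", 0)
```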