Mental Ray - Millions of Polygons - Arghs!


pixelshaker
04-20-2006, 11:57 AM
Hello all,

Maybe somebody has an idea how to solve my - and apparently a general - mental ray problem.
I struggled with a scene of about 5 million polygons that mental ray wouldn't render (the Maya software renderer handled it). Mental ray aborted with messages like "can't free memory", "try -memory 618 on next run" and so on. To make sure the problem wasn't in the scene, I created a new scene with a sphere of 1 million polygons and duplicated it 4 times, so that I had 5 million quads overall, about 8 million triangles.

Same problem here: mental ray hangs with memory errors on a dual Opteron with 4 GB RAM, even with raytracing disabled. The Maya software renderer handled it. You might say: OK man, then use Maya software. It has to be mental ray.

I think anyone can run this quick test to check whether I am right.

Any ideas how to get mental ray to render huge scenes - proper BSP settings and so on?

Needless to say, it is, as always, really urgent.

Many thanks in advance, folks!

slipknot66
04-20-2006, 01:12 PM
Well, did you try adjusting your BSP settings?
Try changing the memory that mental ray uses to something around 2000 (as you said you have 4 GB RAM), try using Large BSP, and tweak the BSP size and depth. Change the task size to 16 or 8. Check this link:
http://www.jupiter-jazz.com/wordpress/wp-content/data/tr4kv2/html/chapter4-MEM.html
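
For reference, the same settings can be applied with a few lines of MEL, which is handy for batch setups. This is only a sketch: the node and attribute names below (miDefaultOptions.bspSize, miDefaultOptions.bspDepth, miDefaultOptions.taskSize, mentalrayGlobals.memory) are assumptions and may differ between Maya versions, so verify them with listAttr first.

// Hedged MEL sketch of the settings above; attribute names are assumptions,
// check them with: listAttr miDefaultOptions; listAttr mentalrayGlobals;
if (!`pluginInfo -query -loaded "Mayatomr"`)
    loadPlugin "Mayatomr";                // make sure mental ray is loaded
setAttr miDefaultOptions.bspSize 16;      // smaller leaves for dense meshes
setAttr miDefaultOptions.bspDepth 50;     // allow a deeper BSP tree
setAttr miDefaultOptions.taskSize 8;      // smaller tiles, smaller peak memory
setAttr mentalrayGlobals.memory 2000;     // ~2 GB cache limit on a 4 GB box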

pixelshaker
04-20-2006, 01:22 PM
I did - it doesn't change anything! If you have a few minutes, just try the 5 spheres I mentioned.

Another problem: mental ray renders fine until it reaches the maya_shaderglow:

JOB 0.15 progr: 99.9% rendered on svens.15
JOB 0.7 progr: 100.0% rendered on svens.7
RC 0.0 info : rendering statistics
RC 0.0 info : type number per eye ray
RC 0.0 info : eye rays 15897448 1.00
RC 0.0 info : reflection rays 4301930 0.27
RC 0.0 info : shadow rays 1079175260 67.88
RCI 0.0 info : main bsp tree statistics:
RCI 0.0 info : max depth : 40
RCI 0.0 info : max leaf size : 1156
RCI 0.0 info : average depth : 38
RCI 0.0 info : average leaf size : 12
RCI 0.0 info : leafnodes : 360284
RCI 0.0 info : bsp size (Kb) : 22121
PHEN 0.0 progr: calling output shaders
PHEN 0.0 progr: maya_shaderglow(): Computing glow...
MEM 0.0 info : allocation of 69686800 bytes in C:\engserv\rbuild\101\client\Maya\src\MentalRay\mayabase\src\mayaglow.c line 559 failed: flushing
MEM 0.0 info : try '-memory 845' for future runs
MEM 0.0 progr: scene cache flushed 6 MB in 0.00s, now: 839 MB
MEM 0.1 progr: scene cache flushed asynchronously 165 MB for module JOB, now: 674 MB
MEM 0.0 info : allocation of 69686800 bytes in C:\engserv\rbuild\101\client\Maya\src\MentalRay\mayabase\src\mayaglow.c line 559 failed: flushing
MEM 0.0 info : try '-memory 673' for future runs
MEM 0.0 info : allocation of 69686800 bytes in C:\engserv\rbuild\101\client\Maya\src\MentalRay\mayabase\src\mayaglow.c line 559 failed: flushing
MEM 0.0 info : try '-memory 673' for future runs
.
.
.
MEM 0.0 info : try '-memory 673' for future runs
MEM 0.0 info : ================ memory error post-mortem ================
MEM 0.0 info : -------- memory summary --------
MEM 0.0 info : module max KB maxnblks KB nblocks %bytes
MEM 0.0 info : MSG 2 1 2 1 0.00
MEM 0.0 info : MEM 6140 12 5628 11 0.82
MEM 0.0 info : DB 843135 2746 557191 3094 81.05
MEM 0.0 info : PHEN 89435 13 88922 6 12.94
MEM 0.0 info : RC 297934 4023 0 0 0.00
MEM 0.0 info : IMG 202078 6 35389 2 5.15
MEM 0.0 info : MI 37 3 0 0 0.00
MEM 0.0 info : LINK 22 11 22 11 0.00
MEM 0.0 info : JOB 297 260 287 259 0.04
MEM 0.0 info : GEOMI 25705 89 0 0 0.00
MEM 0.0 info : TRANS 12 3 12 3 0.00
MEM 0.0 info : LIB 56 14 56 14 0.01
MEM 0.0 info : API 90755 1681 0 0 0.00
MEM 0.0 info : RCI 20482 1729 0 0 0.00
MEM 0.0 info : GAPPO 9639 89 0 0 0.00
MEM 0.0 info : JPG 133 18 0 0 0.00
MEM 0.0 info : total 687458 3362 100.00
MEM 0.0 info : max heap memory approx. 846 MB, now approx. 674 MB
MEM 0.0 info : -------- database summary --------
DB 0.0 info : database elements by module:
DB 0.0 info : jobs size #jobs size #nonjobs module
DB 0.0 info : 0 0 36458 231 DB
DB 0.0 info : 972 1 32 2 PHEN
DB 0.0 info : 0 0 35925 120 SCENE
DB 0.0 info : 0 0 28012 4 RC
DB 0.0 info : 6780 226 0 0 GAPMI
DB 0.0 info : 36238720 3 173629018 2648 IMG
DB 0.0 info : 0 0 396518197 1153 API
DB 0.0 info : 0 0 40 1 RCFG
DB 0.0 info :
DB 0.0 info : database elements by type:
DB 0.0 info : jobs size #jobs size #nonjobs type
DB 0.0 info : 0 0 93604 261 1 funcdecl
DB 0.0 info : 0 0 17388 58 2 func
DB 0.0 info : 0 0 572 13 3 material
DB 0.0 info : 0 0 196 1 4 light
DB 0.0 info : 0 0 444 3 5 camera
DB 0.0 info : 0 0 28024 113 7 object
DB 0.0 info : 0 0 101415 345 10 instance
DB 0.0 info : 0 0 520 1 11 group
DB 0.0 info : 0 0 3232 4 12 options
DB 0.0 info : 36238720 3 173628848 2646 13 image
DB 0.0 info : 0 0 19903100 113 14 polygon
DB 0.0 info : 0 0 15923016 113 16 geoindex
DB 0.0 info : 0 0 60431912 113 18 geovertex
DB 0.0 info : 0 0 300083304 113 20 geovector
DB 0.0 info : 0 0 2736 241 28 tag
DB 0.0 info : 0 0 687 9 29 string
DB 0.0 info : 0 0 144 2 34 userdata
DB 0.0 info : 972 1 0 0 41 local
DB 0.0 info : 6780 113 0 0 42 splitobject
DB 0.0 info : 0 0 612 4 49 fb_info
DB 0.0 info : 0 0 608 1 67 lights_db
DB 0.0 info : 0 0 40 1 88 fg_session
DB 0.0 info : 0 0 32 2 90 func_ref
DB 0.0 info : 0 0 27248 2 105 rc_fb
DB 0.0 info :
.
.
.
mental ray: out of memory
MEM 0.0 fatal 031008: can't allocate 69686800 bytes.
MEM 0.0 fatal 031008: can't allocate 69686800 bytes.
MEM 0.0 fatal 031008: can't allocate 69686800 bytes.
MEM 0.0 fatal 031008: can't allocate 69686800 bytes.
MEM 0.0 info : cleaning up memory mapped frame buffers


:(

How can I disable the shader glow?

slipknot66
04-20-2006, 01:31 PM
OK, are you rendering from the Maya interface or the command line?
To disable the shader glow, go to Render Settings -> mental ray tab -> Translation and uncheck Export Post Effects. Under Performance, check Export Objects on Demand.
Under Memory and Performance, use Acceleration Method -> Large BSP and a Memory Limit of 1500.
See if it works..
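
Those same translation switches can also be flipped from MEL. A minimal sketch, assuming the attribute names - the checkboxes named above are the authoritative reference, and exportPostEffects / exportObjectsOnDemand are guesses at the underlying attributes:

// Hedged sketch; attribute names are assumed, verify with listAttr.
setAttr mentalrayGlobals.exportPostEffects 0;     // skip output shaders such as maya_shaderglow
setAttr mentalrayGlobals.exportObjectsOnDemand 1; // translate geometry lazily (assumed name)

With Export Post Effects off, the translator should never reach the maya_shaderglow output shader whose allocation was failing in the log above.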

pixelshaker
04-20-2006, 01:48 PM
I rendered the sphere tests from within Maya, my serious jobs from the command line. I did what you told me, and the sphere test from within Maya doesn't crash anymore, though half of the picture is left blank - so I will try from the command line!

You are the man, thanks a lot. I'll keep you and everyone else up to date; I think this is a very interesting thread!

pixelshaker
04-20-2006, 02:15 PM
Hm, OK, the problem is still there:

mental ray: out of memory
MEM 0.9 info : allocation of 119952168 bytes failed: flushing
MEM 0.9 info : try '-memory 223' for future runs

Any ideas?

Stucky
04-20-2006, 03:29 PM
What OS are you using? On Windows XP Pro 32-bit you can only use 3 GB of RAM, so if that's what you're running, you aren't fully using your 4 GB. And even on a 64-bit OS, an application can only use more than 3 GB if it is itself 64-bit. Maya is still 32-bit, so you can only use 3 GB with Maya either way. You could try that "-memory value" flag on the render command (since you are rendering from the command line). I think you are using it...

And did you change the values in the Sample Defaults, under the mental ray render globals?


EDIT: At the end of the day I will try to render a scene like that here at work. We have a dual Opteron 64-bit, Windows XP Pro 64-bit and 8 GB of RAM, and we have Linux seats too. Unix-based systems handle memory much better. I'll tell you something later...


-S
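
The -memory flag the log keeps suggesting is a command-line option of the mental ray standalone, not of mayabatch. A hedged sketch of invoking the standalone with it from MEL - the 'ray' binary name and the pre-exported scene.mi file are assumptions for illustration:

// Hedged sketch: pass the limit the log suggested to the standalone.
// 'ray' and scene.mi are assumptions; mayabatch does not accept this flag.
system("ray -memory 845 scene.mi");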

pixelshaker
04-20-2006, 03:50 PM
I set everything according to slipknot.

That memory flag is for the mental ray standalone, I think; mayabatch doesn't know it.

pixelshaker
04-20-2006, 03:50 PM
EDIT: At the end of the day I will try to render a scene like that here at work. We have a dual Opteron 64-bit, Windows XP Pro 64-bit and 8 GB of RAM, and we have Linux seats too. Unix-based systems handle memory much better. I'll tell you something later...


-S

Great, thanks!

Stucky
04-20-2006, 05:26 PM
Well! The render aborted. But I didn't try to optimize MR - I just created the spheres (5 million polys total) and rendered with MR default options... :(


-S

pixelshaker
04-20-2006, 06:02 PM
Sounds frustrating... :(

tfritzsche
04-20-2006, 07:05 PM
Hi PixelShaker,
Try setting the memory limit under mental ray > Memory & Performance > Memory Limit to zero.
This will force mental ray to use disk space instead of RAM; your renders will slow way, way down, but they will finish (provided you have the disk space).
hope this helps
thomas
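
The MEL equivalent is a one-liner. As before, the attribute name is an assumption; the Render Settings field named above is the authoritative switch:

// Hedged sketch; attribute name assumed. 0 removes the cache limit so
// mental ray swaps to disk instead of failing the allocation.
setAttr mentalrayGlobals.memory 0;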

Stucky
04-20-2006, 07:34 PM
You can eventualy use render layers, and render parts of the geometry separately, and in the end, mix it all together...


-S

francescaluce
04-20-2006, 07:59 PM
First of all, it is not mental ray: here I rendered 8 balls of 2 million triangles each, with a 600 MB memory allocation, using the standalone. So you should say maya2mentalray.

Second, I would look at the scene before looking at the renderer - and not eventually, but first. 5 million polys are a big effort for any renderer out there. I'd look at how to separate the scene into passes and how to optimize the geometry, and only then would I look at the renderer, starting with its memory and scene management.


ciao
francesca

Bonedaddy
04-20-2006, 08:22 PM
If you have MR standalone, you can export the object as a .mi file and bring it in as an include, which will save you the mayatomr translation time.
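
A hedged sketch of that workflow. The Mayatomr command's -mi/-file flags and all file and object names are assumptions (check `help Mayatomr`); the $include directive is part of the .mi scene description language:

// Hedged sketch of the export-once, include-later workflow.
select -r heavyMesh;                          // hypothetical object name
Mayatomr -mi -file "C:/renders/heavyMesh.mi"; // flags assumed, see `help Mayatomr`
// A small hand-written wrapper.mi can then reference it:
//     $include "C:/renders/heavyMesh.mi"
// and the standalone renders it without touching Maya at all:
//     ray -memory 2000 wrapper.mi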

MaNdRaK18
04-20-2006, 08:59 PM
First of all, it is not mental ray: here I rendered 8 balls of 2 million triangles each, with a 600 MB memory allocation, using the standalone. So you should say maya2mentalray.

But what could they possibly have done so wrong with maya2mentalray that it has so many problems translating scenes to the renderer?!
And how is it possible at all that maya2mr can't render something that the Maya software renderer can?!?

Bonedaddy
04-20-2006, 09:12 PM
But what could they possibly have done so wrong with maya2mentalray that it has so many problems translating scenes to the renderer?!
And how is it possible at all that maya2mr can't render something that the Maya software renderer can?!?


Mayatomr translates everything in the scene into a .mi file, in mental ray's language. This is a step that takes extra memory and processing, and it isn't needed in Maya software rendering. You have to realize that you're pushing your computer to the limit, and that extra step is what's killing it.

I'm not sure exactly what mayatomr is doing, but I'm guessing that the translation, when it's not going to MR standalone, loads a limited version of Maya, loads the Maya scene file into memory, and possibly builds the MR scene in memory as well, as opposed to writing it out to disk line by line. I'd love to understand more of what it's doing, especially straight MRfM vs. MR standalone, so if anyone can chime in with more info, please do.

slipknot66
04-20-2006, 09:30 PM
That's why it's always a good idea to use MR shaders only, so there's no translation. Also, you can uncheck the features you are not using, so the plugin won't waste time translating something you don't need.

Stucky
04-20-2006, 09:57 PM
Would it also be a good idea to use an OS that handles memory better, like a Unix-based system? Sometimes Windows just sucks, if not all the time... heheh.


-S

Bonedaddy
04-20-2006, 10:07 PM
That's why it's always a good idea to use MR shaders only, so there's no translation. Also, you can uncheck the features you are not using, so the plugin won't waste time translating something you don't need.

I'm pretty sure the geo conversion is what's killing him, not the shaders...

MasonDoran
04-21-2006, 09:02 AM
Sounds like you will need to do test renders in increments of 0.5 to 1 million polys to find the limit of your machine, and then on top of that start doing BSP tests to see how much more you can get out of it.
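
A minimal sketch of such a bracketing test in MEL. Names and the render call are illustrative only; kicking each pass off as a separate batch render would isolate crashes even better:

// Hedged sketch: add ~1M quads per pass and re-render until it falls over.
for ($i = 1; $i <= 10; $i++) {
    polySphere -name ("testSphere" + $i) -sx 1000 -sy 1000; // ~1M quads each
    move (2.5 * $i) 0 0;
    print ("Pass " + $i + ": ~" + $i + " million quads\n");
    renderWindowRender redoPreviousRender renderView;       // re-render the view
}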

MasonDoran
04-21-2006, 09:05 AM
On top of this question...


What else will cause MR to quit besides polygons? You mentioned shaders being translated, but if FG or GI is enabled, will it kill the renderer even if the geo/shader conversion is OK?

Also, is there a way to globally toggle all displacements, just in case the poly count is getting too high?

pixelshaker
04-21-2006, 09:32 AM
First of all, it is not mental ray: here I rendered 8 balls of 2 million triangles each, with a 600 MB memory allocation, using the standalone. So you should say maya2mentalray.

Second, I would look at the scene before looking at the renderer - and not eventually, but first. 5 million polys are a big effort for any renderer out there. I'd look at how to separate the scene into passes and how to optimize the geometry, and only then would I look at the renderer, starting with its memory and scene management.


ciao
francesca

Hello francesca,

always good to know that you are out there. You are right, 5 million polys is high, but not so high that a high-class renderer like mental ray shouldn't handle it. I guess, like you, that maya2mentalray is the problem; unfortunately we don't have mental ray standalone :(

Reducing the models is impractical in this case, because we get the models delivered and sometimes the customer is breathing down our neck a few hours later ;) But if the standalone is that much "better", we'll have to think about it!

floze
04-21-2006, 09:41 AM
Would it also be a good idea to use an OS that handles memory better, like a Unix-based system? Sometimes Windows just sucks, if not all the time... heheh.


-S
You don't have those problems on Linux, in my experience - at least not that severely. You'll get the full 4 GB for the whole application; note that this limit is at around 2 GB on Windows, even with the /3GB switch enabled. Once you reach that limit on Linux you might run into the same problem though ('can't allocate memory'...). Often you can avoid hitting that wall by properly using the cache flush.

I rendered 1,603,373,184 triangles that way:
http://forums.cgsociety.org/showpost.php?p=3465479&postcount=11

In your case, I guess the main problem is that mental ray is choking on heavy geometry packed into too few objects, so it can't be flushed during rendering. You might also get it to render if you instance the sphere instead of copying it, as in the sketch below.
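
A minimal MEL sketch of the instancing idea (object name and counts are illustrative). Unlike duplicate, instance shares the one shape node, so mental ray tessellates and caches the heavy mesh only once:

// Hedged sketch: four instances of one heavy mesh instead of four copies.
polySphere -name heavySphere -sx 1000 -sy 1000; // ~1M quads, the only real mesh
for ($i = 1; $i <= 4; $i++) {
    select -r heavySphere;
    instance;                 // shares the shape node, unlike `duplicate`
    move (2.5 * $i) 0 0;
}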

Stucky
04-21-2006, 10:04 AM
Yes, floze, you are right. But the spheres were just a test that pixelshaker did; they are not his real work. So for the sphere test we could indeed use instances, but for what he really wants he probably cannot use them... So, pixelshaker, perhaps it would be good for you to try some test renders on Linux or Mac...


-S

slipknot66
04-21-2006, 11:05 AM
Interesting thing: I could render a scene with 2 million polys on an Athlon XP 1.6 GHz with 512 MB RAM, so he should be able to render his scene on a machine like his. I think something is wrong with his Windows - maybe he's rendering with other programs running in the background...

kimaldis
04-21-2006, 11:17 AM
mental ray is capable of rendering pretty big scenes these days, but it is a renderer that needs optimizing. Rendering millions of polys out of the box doesn't always work, and it's rare to find a large scene that doesn't require some tweaking. The usual places to start are the BSP settings and the memory limits. I usually just mess with them on a small region until the render time drops to a minimum; others have more rigorous methods.

That said, it's possible there's other stuff going on. I've just checked a trial scene in XSI, and I can render close to 22 million polygons with default settings before things start to go pear-shaped, on a laptop with only 2 GB of memory. It takes around 90 seconds at 720x576 on a Pentium M 2 GHz, and it'll probably go higher with some tweaking. I'm not sure whether the mental ray versions differ between XSI 5.1 and Maya, but that may be a contributing factor.

wlidea
04-21-2006, 12:40 PM
Hi. Set Memory Limit -> Task Size to 8-16.
Turn Translation -> Performance -> Export Objects On Demand on, set the threshold to 0-30, and turn Export Triangulated Polygons on.

Hope this helps.

pixelshaker
04-21-2006, 04:18 PM
Hello,

I just want to say thank you all for your help!

Best regards,
Andi

bjoern
04-21-2006, 11:59 PM
@pixelshaker
It seems we want the same thing :)
http://forums.cgsociety.org/showthread.php?t=348782

Xcid
09-30-2006, 12:06 AM
Just want to say I have tried every possible thing to get my MR render to work. All three separate parts work fine on their own: the scene and the two characters, one with a displacement map, FG and animation.


The scale of my scene is really big and can't be scaled down, as the rig gets funky when scaled - if scale really does affect MR renders.

I mean everything: increasing memory usage 100%-300%, BSP settings, export triangulated, export objects on demand... everything.

Turns out it renders fine by just clicking on the file and rendering, not rendering from within Maya... so try that if all else fails and you really do need that MR support group.

Spacelord
09-30-2006, 01:52 AM
Just want to say I have tried every possible thing to get my MR render to work. All three separate parts work fine on their own: the scene and the two characters, one with a displacement map, FG and animation.


The scale of my scene is really big and can't be scaled down, as the rig gets funky when scaled - if scale really does affect MR renders.

I mean everything: increasing memory usage 100%-300%, BSP settings, export triangulated, export objects on demand... everything.

Turns out it renders fine by just clicking on the file and rendering, not rendering from within Maya... so try that if all else fails and you really do need that MR support group.

I find that most times it's the displacement map.
Try turning it off and see if that's it; if it is, adjust the displacement settings.

Xcid
09-30-2006, 12:19 PM
hey F.D,

yeah, just trying that this morning - really killing myself over this... FG is just so unstable with memory, I can't stand it anymore, but I can't afford not to use it! I really feel like I have read every single post and tried everything.

All the scenes work on their own with no errors; there are no geo errors or anything like that.

My scene geometry vert count is 445,742 with the smoothed character meshes.

My scene is really big and can't be scaled... from corner to corner it is 417 units with the distance tool.

Even with low FG settings it gives me that "mental ray: out of memory
MEM 0.2 info : allocation of 67241480 bytes failed: flushing
MEM 0.2 info : try '-memory 328' for future runs"

I have tried with just one spotlight in my scene... with GI photons at 150,000.
I don't get it. Tried BSP settings, tried Large BSP, tried Memory Limit at 0, tried triangulation and objects on demand; raytracing is very low... 1.

GI accuracy at 100 and radius at 12, 10,000 photons... it even crashes then.

Running XP, P4 3.0 GHz, 2 GB RAM.


My scene actually seems pretty stable without the character in it. She is a smoothed poly mesh; it only renders sometimes without the displacement and texture, but even then it's not stable. With the second character it won't render at all, yet there are no geo errors. The second character has no textures - just high poly, no displacement map.

NordicPolygons
10-08-2006, 10:53 AM
Maya 8 x64 seems to handle polygons somewhat better: 5 million polygons takes 1 minute and 20 seconds to render with mental ray on my Athlon 64 4200+ with 1 GB RAM. However, it crashed halfway when trying to render 12 million :/

Xcid
10-08-2006, 12:28 PM
Wish it handled SubDs better... all my mental ray issues were caused by bad SubD meshes. I have to convert all the SubDs in the scene to polys, which I'm not happy about.

tharrell
10-12-2006, 04:46 AM
I've run into the mental ray translation issue on several of my Win workstations, and I did some digging.

Apparently there's an additional mentalray disk swap you can activate that's hidden from your Maya globals until you invoke it with some MEL.

Anyhow, I compiled a bunch of MR memory tweaks for Maya 7/8 into a script that does a bit of error checking/garbage collection (script zipped and attached).

You might try running this and setting the swap file large-ish (4096). You'll take a performance hit, but it might do the trick for you.

Hope this is helpful for someone,

--T

Description from the script's header:


// Script to unlock hidden mentalray memory flags in Maya 7.0 and 8.0's render globals
//
// Credit for the techniques included go to Barton Gawboy and Coder on the lamrug forums
// http://www.lamrug.org
//
// I'm just wrapping these up, adding some basic state checking and making it a clickable function
// Trey Harrell -- www.treyharrell.com
//
// Source the script and execute:
//
// th_mrAdvancedMemGlobals;
//
// The script will add swap directory, swap limit and Maya memory zone overflow attributes to
// mentalray's render globals node, then select the mentalray globals in your attribute editor.
//
// The memory zone setting already exists under the mental ray globals in Maya 8, and the script
// will tell you that the attribute already exists.
//
// If these attributes already exist in the scene, the script will simply open the globals in
// your attribute editor.
//
// Note that these attributes are not visible in the normal render globals window, only in the
// Attribute editor when you're viewing the mentalrayGlobals node, and you must have edited the
// mentalray render globals/settings at least once for the globals to exist.
//
// The swap limit is set in megabytes, and I've set the default for 2 gb maximum.
//
// Also note that modifying the memoryZone values is rather dangerous, and the default of 110%
// is the safest value to use!
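
For readers without the attachment, the pattern the script header describes boils down to addAttr/setAttr on the mentalrayGlobals node. A hedged sketch with a placeholder attribute name - the real names come from the script itself and the lamrug threads it credits:

// Hedged sketch of the script's core pattern; "miSwapLimit" is a
// placeholder name, not the attribute the actual script creates.
if (!`attributeExists "miSwapLimit" "mentalrayGlobals"`)
    addAttr -longName "miSwapLimit" -attributeType long mentalrayGlobals;
setAttr mentalrayGlobals.miSwapLimit 4096; // swap size in MB, per the post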

acidream
10-13-2006, 07:09 AM
This page about mental ray memory optimization helped me understand how to tweak the settings for reduced memory usage:

http://www.jupiter-jazz.com/wordpress/wp-content/data/tr4kv2/html/chapter4-MEM.html

I've been working on a 3D map with displacement that's right at 8 million polys, and it renders in 1:40 with around 425 MB of memory usage.


scott

CGTalk Moderation
10-13-2006, 07:09 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.