
Memory issue on very large scene...


Marcos_Aurilius
08-19-2004, 10:22 PM
First of all, hello to everyone.

This is my first post on CGTalk. I registered just because I came across this problem, and you seem to be one of the largest communities I am aware of.

I have been working with Maya since v4.0 (currently on v6.0), and a few days ago I decided to build a very large scene for the first time. So I started building a stadium. After 7-8 days of modelling, I have a 154MB .mb file which contains the stadium's field and line markings, all the surrounding tiers/stands (not sure about the right word) and currently ~24,000 seats (out of ~30,000, at 68 faces each). The scene has ~640,000 faces in total and measures ~600x500 Maya units. Without the seats, the .mb file is ~10MB.

My machine is an Athlon XP 2000+ (overclocked to 2300+ speeds) with 1GB of PC2700 DDR RAM and a 1.5GB page file. Up to the point where the stadium had ~15,000 seats, I was able to render at 1024x768 with acceptable quality in mental ray:
- min samples 0, max samples 2, jitter on
- raytracing 5, 5, 10, 2
- fg rays 5000, min radius 0.05, max radius 2.5 (I know the "rule" of 1% and 10% of scene size)
- BSP size 20, depth 25 (pretty good based on the average values reported by mr)
- task size 128.
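
(For reference, the same globals can also be pushed from the Script Editor with MEL; a minimal sketch, assuming the standard miDefaultOptions node that mental ray for Maya creates - verify the attribute names with listAttr miDefaultOptions:)

// MEL sketch: the sampling / final gather globals listed above, set on the
// mental ray options node (assumes it is named miDefaultOptions;
// attribute names may differ slightly between Maya versions)
setAttr miDefaultOptions.minSamples 0;
setAttr miDefaultOptions.maxSamples 2;
setAttr miDefaultOptions.finalGatherRays 5000;
setAttr miDefaultOptions.finalGatherMinRadius 0.05;
setAttr miDefaultOptions.finalGatherMaxRadius 2.5;
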
Since the day I placed the extra 9,000 seats (bringing the total to 24,000), mental ray hasn't been able to complete the fg point calculation even once, even with much lower quality settings. I always run out of memory and page file (obviously: they get filled during the frame translation, even before mr starts the fg process), and mayabatch.exe crashes (Visual C++ Runtime Library - Runtime Error! - "The application has requested the Runtime to terminate it in an unusual way...", etc.) somewhere between 20% and 50% of the fg process.

I have tried (in this order):
- using a finalgMap I had from my previous successful renders
- batch rendering from within Maya (instead of from the Render View) - that seemed to make things a little better
- command line rendering
- creating render layers and rendering just one of them (~8,000 seats and nothing else).
All of the above failed, and the reason seems to be the memory filling up during the frame translation. If you've made it through all of this, I would really appreciate any idea that would save me from replacing the 24,000 seats with even "lighter" versions (the first seat I modelled was 450 faces, but I gave that up soon enough :) )... Sorry for the huge post.

mrgoodbyte
08-20-2004, 02:50 PM
If all of your seats are separate objects, you can try combining some of them into one object. The conversion from the Maya format to the mental ray format includes converting render settings, such as motion blur, for each and every object. By combining the seats into one object (or several objects), fewer of these values are converted and stored in memory. It might save you a couple of megabytes.
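
As a rough illustration of that idea, a MEL sketch like the one below could do the combining in batches; the seat_* wildcard and the batch size are placeholders, not names taken from the actual scene:

// MEL sketch: merge the seats into blocks so mental ray has far fewer
// objects to translate (assumes the seat transforms are named seat_*)
string $seats[] = `ls -transforms "seat_*"`;
int $batchSize = 1000;
string $chunk[];
for ($i = 0; $i < size($seats); $i += $batchSize)
{
    clear $chunk;
    for ($j = $i; $j < $i + $batchSize && $j < size($seats); $j++)
        $chunk[size($chunk)] = $seats[$j];
    select -r $chunk;
    // polyUnite merges the current selection into a single mesh
    polyUnite -ch 0 -n ("seatBlock_" + ($i / $batchSize + 1));
}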

Ronald

Marcos_Aurilius
08-20-2004, 03:48 PM
Thanks for the idea. I'll try it and see what happens.

Zac256
08-20-2004, 04:39 PM
First of all, be sure to use instances and not copies of the seats, since they are all the same; that way they won't take up as much memory per seat. Then use a LOD (Level of Detail) node so that you don't have to render them at full poly density at large distances. Then get more memory.


Or, even better, combine what mrgoodbyte said and what I said, and create instanced sections of seats that are combined and have LOD. That should work quite nicely.
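
In case a concrete example helps, a quick MEL sketch of the instancing idea (seat_master, the count and the spacing are placeholder values, not taken from the actual scene):

// MEL sketch: lay out a row of a single seat as instances, so the geometry
// data is shared instead of copied (assumes a transform named seat_master)
int $count = 50;
float $spacing = 0.6;
for ($i = 1; $i <= $count; $i++)
{
    string $inst[] = `instance seat_master`;
    move -r ($spacing * $i) 0 0 $inst[0];
}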

-Zac

Marcos_Aurilius
08-20-2004, 09:20 PM
Thank you both. I tried combining some tiers and seats into larger objects and the scene size does get smaller. But since it's a process that will take me a while, I can't say anything about rendering yet.

Zac256, the LOD node you mentioned sounds pretty nice (hadn't heard of it until now, because I don't usually model very large scenes). I'll see what I can find about it. Thanks again.

Marcos_Aurilius
08-20-2004, 09:48 PM
One more thing (I would edit my post but I don't see it yet - queued for approval :D): I have also tried instancing the seats (instead of duplicating them - yes, they are all identical), but that seemed to make interacting with the scene (i.e. tumbling and moving) in the viewport panels much slower. That probably doesn't sound right, but...

Marcos_Aurilius
08-23-2004, 11:29 PM
The grouping trick did it! After combining the tiers into larger objects, and the seats into objects of 800-1,000 seats each, the scene rendered again. The scene size after this procedure went down to 135MB (from 158MB), the initial loading time dropped to 8 seconds (from 45-50), and the free RAM during the render went up to 330MB (from 5-10). That was very nice :thumbsup: .

On the other hand, feeling that there is room for further improvement, I searched both the Maya and mr documentation (and of course Google) for the LOD node Zac256 mentioned, without any good results. Could anyone give me some directions - like a link or something - to this node? Thank you...

Edit: I found the "Level of Detail -> Group/Ungroup" command. Is this the one mentioned above? It seems to require me to produce several versions of each mesh and define the distances at which each one will be rendered. Though that would be time-saving for someone working against a timeline, I need to keep the scene size as small as possible and have no time restrictions (I batch-render overnight). So I will have to stick to just combining. Of course, anything else that would further improve the situation above is welcome. Thanks again.
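
For reference, this is roughly what that setup amounts to in MEL; a sketch only, where seatHigh/seatMed/seatLow are placeholder names for the detail versions and the distances are arbitrary (the threshold attribute name may differ by version - check with listAttr seat_LOD):

// MEL sketch: build a Level of Detail group by hand from three
// pre-made versions of a mesh (placeholder names)
string $lod = `createNode lodGroup -n "seat_LOD"`;
parent seatHigh seatMed seatLow $lod;
// camera distances (in scene units) at which the group switches children
setAttr ($lod + ".threshold[0]") 50;
setAttr ($lod + ".threshold[1]") 150;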

Zac256
08-25-2004, 02:59 AM
Yes, that is the Level of Detail node I meant. If you don't want to use it, that's up to you; it really only helps during the rendering process.

andybiz_2004
08-25-2004, 11:38 AM
I read your query & the replies. Looks like you're making good progress on optimizing your scene for rendering.

Try Render=>Render Diagnostics. This will check your scene file for any offending or redundant curves, planes, materials, etc. that will prevent your scene from rendering properly. You can view the full list of error messages in the Script Editor.

If there are error messages, just use the mouse to highlight the name of the offending curve, plane, etc. and press CTRL+C to copy it. Minimise the Script Editor and paste (CTRL+V) this text into the text box to the right of the word "sel". Then press Enter.

Now move your mouse over the scene window and press "f" to frame the offending object. From there, based on what the error message says, you can rectify the problem.

Once most of the error messages are resolved, optimize the scene by selecting File=>Optimize Scene Size. Then you're ready to render. Good luck with your project.
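
For what it's worth, the select-and-frame part can also be done in one go from the Script Editor; a tiny MEL sketch, where offendingCurve1 stands in for whatever node name the diagnostics message gives you:

// MEL sketch: select the node named in a diagnostics message and frame it
select -r offendingCurve1;   // placeholder name from the error message
fitPanel -selected;          // equivalent to pressing "f" over the viewport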

Andrew

rollmops
08-25-2004, 01:49 PM
- BSP size 20, depth 25 (pretty good based on the average values reported by mr)

Hi Marcos_Aurilius,

Those values should be good if the average values reported by mr are ~25% lower; otherwise you'd have to raise them.
Did you try the BSP depth diagnostic? Can you see the proportions of red, green and blue change depending on the complexity of the geometry?
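
If it helps, the BSP settings can be checked and changed from the Script Editor as well; a minimal MEL sketch, assuming the usual miDefaultOptions node (confirm the attribute names with listAttr miDefaultOptions):

// MEL sketch: print the current BSP acceleration settings, then raise them
print ("bspSize:  " + `getAttr miDefaultOptions.bspSize` + "\n");
print ("bspDepth: " + `getAttr miDefaultOptions.bspDepth` + "\n");
setAttr miDefaultOptions.bspSize 25;
setAttr miDefaultOptions.bspDepth 30;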

Marcos_Aurilius
08-25-2004, 06:33 PM
Thanks for the reply. First of all, my current values for BSP size and depth are 25 and 30 (while the latest average values I get from mr are 22 and 27, if I remember right).
As for the diagnostic, I ran it once in the past for another scene, but I can't say I understood much (maybe I didn't study the docs well enough, but I couldn't interpret those coloured lines). I'll look through it again.
andybiz_2004: thanks for the tip. I have already run my scene through Render Diagnostics and no problems were reported.
Update: the stadium now has ~29,000 seats, and until this afternoon it again refused to render. So after doing some tests to find what still looks good enough, I applied Polygons->Reduce to all of my seat meshes. I reduced by 25%, with a tolerance of 0.025, keeping the hard lines. That dropped my scene size from 163MB (this afternoon's size) down to 111MB, without any decrease in seat quality that is obvious in the render (which means I modelled them badly in the beginning :rolleyes: ). Later tonight, I'll see if this procedure helped the rendering process...
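
In case anyone wants to script that step, the reduction can be run over a batch of meshes with something like this (the seatBlock_* wildcard is a placeholder for however the combined seat meshes are named):

// MEL sketch: apply a 25% polyReduce to every combined seat mesh
// (UI options such as "keep hard edges" map to extra polyReduce flags)
string $blocks[] = `ls -transforms "seatBlock_*"`;
for ($b in $blocks)
{
    select -r $b;
    polyReduce -percentage 25;
}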

Soyseitan
08-27-2004, 12:17 PM
:applause: :applause:

Another tip for all of you who are experiencing out-of-memory errors or just fatal error crashes while rendering large, complex scenes: decrease your memory limit! I tried different settings for a particular scene: the memory limit was first at -1 and Maya crashed with an out-of-memory or fatal error; then I set it to 800 MB (the system has 1 GB of RAM) and Maya still crashed. After a while I decreased the limit to 550 MB, and that solved the problem. During rendering, Windows still has 50-100 MB of free RAM, and that really seems to help Maya complete the mental ray render. The only downside is that the render takes a bit longer to complete, since the memory limit forces mental ray to flush itself to stay under the limit. Cheerz!

(I'm so happy - after a week of frustration with a crashing Maya, I finally seem to have solved the problem.)

:D :D

Marcos_Aurilius
08-27-2004, 12:59 PM
Yes, you're right. With Task Manager open next to the command prompt window during the render, I've seen that I get crashes whenever my available physical memory can't climb back up from the 4-5MB it sits at right after the scene translation. With my current settings (and the limit still set to -1, which, by the way, means 800MB for mr), my free RAM starts rising after the scene translation and reaches about 300MB by the time the actual rendering process starts. Changing this limit will be my next try if I once again get to a point where I can't render...

BillSpradlin
08-27-2004, 04:28 PM
A value of -1 means use all available RAM, not 800MB. If you set a limit on the RAM usage, mental ray will flush the unneeded cache when it approaches that limit, ensuring that you don't run out of RAM and that the scene continues to render.

Marcos_Aurilius
08-27-2004, 11:13 PM
From the Maya docs:
Memory Limits

Physical Memory

Physical Memory defaults to 800 in mental ray for Maya. A guideline is to set it to 80% of physical memory. (In this case, we assume 1 GB. If you have more, increase this value accordingly.)


I may have got this wrong, but since the default value is -1, doesn't -1 mean 800MB? It's not important enough to argue about with anyone; I just want to get this clear...

BillSpradlin
08-28-2004, 01:26 AM
Back in 4.5 and 5.0, mental ray defaulted to -1, which was the setting to use all available memory. In the 5.0.1 patch they changed the default to 800. So since then it defaults to the correct setting, whereas before it was set to use all available RAM and it wasn't flushing the cache appropriately unless the user changed it. Hope that helps.

Soyseitan
08-28-2004, 05:24 PM
An addition to the running-out-of-memory issue: setting the memory limit works great... if your scene contents don't exceed that limit. If they do exceed it, mental ray will try to flush itself down to the limit, which it obviously can't do with such a large scene. Then Maya will either crash, or the flushing will go on and never reach an end.
I'm trying to find the right balance between the memory limit and putting all the different elements into different render layers, to see if that pays off.

Soyseitan
08-28-2004, 06:32 PM
OK, a scene containing 8 very high-poly trees (including modelled leaves), large complex grass fields (Paint Effects converted to polygons so they render in mental ray), a garden in the centre with around 20 bushes (with leaves, but not with the detail of the trees) and a castle with smaller buildings around it rendered... fine! It took half an hour with very minimal MR settings (100 fg rays, minimal raytraced shadows, etc.), but that doesn't matter... I finally got Maya to render the complete scene with everything in it. Hurray!

Marcos_Aurilius
08-28-2004, 09:57 PM
If it does exceed this limit, Mental Ray will try to flush itself to reach this limit, which it obviously can't with such a large scene. Then Maya will either crash or the flushing will go on and never reach an end.
You're right. I noticed the exact same thing last night. In my case, the flushing kept going on and on, preventing the fg point computation from continuing... Leaving the limit at -1 renders my scene just fine (10,000 rays - maybe too many; min radius 0.3, max radius 5).
BUT: since I started rendering from the command line, MayaRenderLog.txt doesn't get created, so I can't check render times and re-adjust my other settings (like BSP). Is there any way to have this report saved, like when batch-rendering through Maya? Thanks.

BillSpradlin
08-28-2004, 11:08 PM
You should also consider rendering in layers. I comp everything together and it saves a lot of rendering time. In addition to that, you'll have complete and absolute control over every element in the scene. What if you wanted to adjust the color on some of the bushes and not the entire scene? Without layers, you would have to re-render the whole thing.

Soyseitan
08-29-2004, 09:05 AM
Yep, in theory... After one attempt where the ground plane and final gather sphere (with a sky texture) only partially rendered, and a second attempt where everything rendered fine, now the whole scene renders completely black. Weird stuff.

Soyseitan
08-29-2004, 09:33 PM
Next update: I noticed that when I rendered a frame from within Maya, Maya would either crash or use almost all available memory. When I batch-rendered the image, Maya's used memory was completely flushed and all of it became available to the batch render.
Maya uses roughly 200MB to start up and another 200-300MB for the big scene. That makes a total of 400-500MB; combined with the 200MB that Windows XP uses, an in-GUI render only has 300-400MB of free memory to begin its task. When I batch-render the image, a whopping 800MB is available to the batch render, of which it only uses 500-600MB.

Now I'm trying to render a batch sequence: the first frame renders great, but then it stops with the message that it has completed. That's weird - it starts on the 2nd frame (of 150), but stops with that completion message after a while. Weird stuff, again.

BillSpradlin
08-30-2004, 04:31 AM
I never batch render from the GUI; I always render using the command line. Don't be afraid to use it - just get in there, read the help on it, and render away. Batch rendering from the GUI always seems to have quirky issues, plus rendering from the command line is much faster than rendering from the GUI.

Soyseitan
08-30-2004, 09:32 AM
Bill, you're the man! Batch rendering from the command line seems to do the trick so far... not only does it render all the images, it even seems to do it quite a lot faster. Render time per frame went from over an hour to about 25 minutes - well, not only because of this, I also tweaked the scene some more. Never mind; the fact remains that the scene still renders. Hmm, I wonder if the memory limit/BSP settings really matter with command-line rendering... it seems so stable so far and renders perfectly!

Marcos_Aurilius
08-30-2004, 04:40 PM
Hm, I wonder if the memory limit/BSP settings really matter with the commandline rendering...
I would say they matter just as much as with any other way of rendering. And rather than opening a new thread, I will repeat my last question:
Since I started rendering from the command line, MayaRenderLog.txt doesn't get created, so I can't check render times and re-adjust my other settings (like BSP). Is there any way to have this report saved, like when batch-rendering through Maya? Thanks.

mrgoodbyte
08-30-2004, 09:49 PM
Just look in your terminal/command line - the output is printed right there. The bad thing is that the command prompt only remembers a fixed number of lines, so if your verbosity level is quite high it's very possible you'll miss the information.
Although you can increase that number, it is better to append the following to your command: " >> C:\mayarenderlog.txt" or similar, without the quotes of course. Basically, it redirects the output of the command to that particular file. This, however, will not print anything to the command line, just to that file.

In fact, you can do more fun stuff with this principle if you're running from a *nix-type command line which includes a version of egrep. You could add "| egrep -i BSP" to the command to get just the lines from the batch renderer concerning the BSP settings.
So you could end up appending something similar to this to your commands: "| egrep -i "BSP|time|writing" >> ~/maya/MayaRenderLog.txt". This will output just the BSP settings, timing data and the written output files to the file ~/maya/MayaRenderLog.txt.
If you have the "tee" command available, you can use "| egrep -i "BSP|time|writing" | tee ~/maya/MayaRenderLog.txt". This will output to the file AND to the terminal.

There are lots of options available!

-Ronald

Marcos_Aurilius
08-31-2004, 09:44 AM
First of all, thanks for the reply.

Now, I run WinXP on my machine, so the info in those last paragraphs isn't useful to me (to others it could be, of course, so well done).

Of course, when I type the command to start the renderer, I see the output in the command prompt window, and that's useful for seeing whether the renderer ran into any start-up problems. But so far I only use command-line rendering for this very large stadium scene I am working on. The scene renders overnight, and when it's done the window closes (maybe that's because I type the command into Start->Run and not directly into a new command prompt window - I'll check it out), so I can't read anything. Does this " >> C:\mayarenderlog.txt" redirection work in the Windows command line? I will try that one too.

By the way, the stadium is almost finished. I placed all 37,515 seats and created the roof and the metal pylons; what's left is the ad boards and a few other things. But trying to render an image from the roof looking down (which reveals almost every mesh in the scene), I ran out of memory again. So today I'll be playing with LOD and see what I get...

mrgoodbyte
08-31-2004, 02:07 PM
The best thing you can do is use the command line itself, where you can use the >> C:\renderlog.txt redirection.
So use Start->Run, type in "cmd", and then type your command in there.
<geek>Because cmd is loaded as the parent process, any sub-command is loaded as a child process, which in turn writes its output to the parent process' console. Therefore, whenever one or more child processes finish, the output is still available in the parent process. The same thing happens when you launch a batch render from the Maya GUI; in that instance Maya is the parent process instead of the Windows command line.
Having said that, the easiest thing to do is to create a regular text file and type your batch-render command right there. Save the file as <something>.bat. The power of this is that you can include as many render jobs as you want and they will be rendered sequentially; just make sure every job is on a separate line.
Now, with this .bat file created in a convenient location (e.g. C:\renders.bat), you can run it from within the command line by typing C:\renders.bat
</geek>

Soyseitan mentioned that XP uses 200MB of RAM from scratch. You may want to shut down several of your background applications; I've had 2000 running on a mere 60MB and XP on 80MB. That will free up some more RAM. Also, restarting your computer before rendering helps a lot.
If all else fails, you might want to consider breaking the seats up into several layers (e.g. first ring, second ring, sky boxes). This is probably the way to go - aside from getting more RAM, of course.

-Ronald

Soyseitan
08-31-2004, 03:15 PM
WinXP used around 120MB when I had 512MB of RAM. Since I upgraded to 1024MB I usually have around 800MB of free memory; only after I start, use and close Maya does the free RAM reach the 870MB mark. Going into the WinXP services to get rid of all the unwanted background services, and stopping things like Norton Antivirus from starting up (it comes with 4 background services as standard), only helped a little bit. Now the total number of services is down to 17 (+2: sound volume and Daemon Tools) when I start up, and I can't see how you could get WinXP to use only 80MB of RAM, since here the bare minimum looks to be around 130MB.

BillSpradlin
08-31-2004, 04:21 PM
http://www.tweakxp.com/display.aspx?id=114

Soyseitan
09-01-2004, 01:58 PM
Thanx, found an extra 10 services which could be disabled by default.

Marcos_Aurilius
09-01-2004, 11:26 PM
I have tried rendering the (almost finished) stadium with just fg, with just a directional light and gi, and with neither fg nor gi. Nothing works (DoF wouldn't do any good either - all of the above attempts were carried out using the lowest-poly versions of the seat meshes I created). So I'm stopping for now (as I am joining the Greek Army on Monday), and hopefully I will render the scene when I come back and buy another 2GB of RAM :).
I learned some new tricks though, and I would like to thank you all by posting at least some wireframes of the model that has kept me busy for the last 2 weeks:

http://briefcase.pathfinder.gr/download/Marcos_Aurilius/15827/308477/0/Maya-Stadium-wireframe-4.jpg

These are the links to the other 3 images:
- http://briefcase.pathfinder.gr/download/Marcos_Aurilius/15827/308474/0/Maya-Stadium-wireframe-1.jpg
- http://briefcase.pathfinder.gr/download/Marcos_Aurilius/15827/308475/0/Maya-Stadium-wireframe-2.jpg
- http://briefcase.pathfinder.gr/download/Marcos_Aurilius/15827/308476/0/Maya-Stadium-wireframe-3.jpg

Thank you all for your cooperation.

mthemelis
09-01-2004, 11:57 PM
Compatriot, have a good time in the army!

Do you happen to have mental ray in a standalone version?
I've been looking for it for a while,
preferably the latest one, heh heh.

I'm writing this in Greek so they don't catch us!

Michalis, Thessaloniki

CGTalk Moderation
01-19-2006, 12:00 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.