How Rendering Uses RAM


malaclypse
12-26-2004, 04:59 PM
Hey people, after looking around I've often seen people buying up to 4 GB of RAM for their rendering systems or workstations. I was wondering why so much is needed, so I made a little calculation which helped me and might help others understand:

When rendering, you usually use full-quality, 32-bit Targa maps.

Let's make a few assumptions: you are rendering a huge, highly detailed scene with one huge spacecraft, three medium ones, six small ones, a space station, a lot of props, and a planet in the background.

We also assume that you are going full quality with diffuse, specular and bump maps.

Here are the textures needed; I'll give an example of how I calculated the first one:

- Huge spacecraft: 4096x4096 texture, 32 bit diffuse color, 8 bit specular, 8 bit bump.
-- diffuse map: 4096x4096*32/8/(1024^2) = 64 MB
-- bump map: 4096x4096*8/8/(1024^2) = 16 MB
-- specular map: 4096x4096*8/8/(1024^2) = 16 MB
Total = 64 + 16 + 16 = 96 MB

- Space station is the same: 96 MB
- Planet is the same: 96 MB

- 3 Medium spacecraft:
-- diffuse map: 2048x2048*32/8/(1024^2) = 16 MB
-- bump map: 2048x2048*8/8/(1024^2) = 4 MB
-- specular map: 2048x2048*8/8/(1024^2) = 4 MB
Total = (16 + 4 + 4) * 3 spacecraft = 72 MB

- 6 Smaller spacecraft (1024x1024): 6 MB each
Total = 6*6 = 36 MB

- 32 props (512x512): 1.5 MB each
Total = 32*1.5 = 48 MB

So, all these textures would sum up to 444 MB

Then there's the final image; let's say you are rendering at 5120 x 3840 (4:3), which at 32 bits per pixel comes to 75 MB.

You're also using an HDR image of 2048x2048 - I don't know how big those are usually (stored as RGBE it would be 16 MB, as full floating point about 48 MB), so let's say 32 MB.

You are using global illumination or something similar, and therefore you have a photon map: 64MB?

You have a shadow map for each object (would you use shadow maps and GI at the same time?) of 4096x4096. I suppose shadow maps are something like greyscale images - I don't actually know. That would be 16 MB * 10 craft, so 160 MB.

I add all this to the previous total and get 775 MB
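For anyone who wants to play with these numbers, here is a minimal sketch of the arithmetic above in Python; the bit depths, map counts and the HDR/photon-map guesses are the assumptions from this post, not measurements from any particular renderer.

# Rough, uncompressed in-memory sizes for the example scene above.
# Real renderers may tile, compress or mip-map textures and use far less.

MB = 1024 ** 2

def map_mb(width, height, bits_per_pixel):
    # Uncompressed size of a single map, in megabytes.
    return width * height * bits_per_pixel / 8 / MB

def object_mb(res, count=1):
    # Diffuse (32-bit) + bump (8-bit) + specular (8-bit) per object.
    return (map_mb(res, res, 32) + 2 * map_mb(res, res, 8)) * count

textures = (object_mb(4096)            # huge spacecraft   -> 96 MB
            + object_mb(4096)          # space station     -> 96 MB
            + object_mb(4096)          # planet            -> 96 MB
            + object_mb(2048, 3)       # medium spacecraft -> 72 MB
            + object_mb(1024, 6)       # small spacecraft  -> 36 MB
            + object_mb(512, 32))      # props             -> 48 MB

frame_buffer = map_mb(5120, 3840, 32)       # final image   -> 75 MB
hdr_env = 32                                # guess from the post
photon_map = 64                             # guess from the post
shadow_maps = 10 * map_mb(4096, 4096, 8)    # 10 craft      -> 160 MB

total = textures + frame_buffer + hdr_env + photon_map + shadow_maps
print(f"textures: {textures:.0f} MB, total: {total:.0f} MB")  # 444 MB, 775 MB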

Phew. So that's it for the textures and maps. There is probably much more to it, though, like light maps, and you may also use translucency and transparency maps... so this could get over 1 GB.

Then you have to store the scene itself in memory, which can be quite big, and the renderer will need memory to run, as will the OS.

That should put you at around 1.5 GB...

I don't know what else might take up RAM, but I'm sure there is more. Please give me feedback on this and on how accurate it is - it's really just an estimate on my part and I'm not claiming it's an accurate result.

cheers

Alex.

simoncheng
12-27-2004, 02:53 AM
This is one of my major problems: not enough RAM.
You won't believe it, but I tried to render a scene with a model file of only 20 MB,
and my PC (XP Pro) told me it was out of RAM.

My specs: P4 2.8 GHz with HT
2 GB RAM
GeForce FX 5700 LE
200 GB hard disk

I'm not a pro user of 3ds Max or Maya; I'm a CAD user (IronCAD).
How could this happen? Could someone help me out?
Is there anything I can do - tweak my system or software?

I'm really happy you brought up this topic.

simoncheng
12-27-2004, 02:57 AM
Would increasing my paging file help at all, or changing the memory usage setting between programs and system cache?

I hope someone replies. Has this happened to 3ds Max users before?

thanks.

simoncheng
12-27-2004, 03:19 AM
Sorry about earlier - the 20 MB I mentioned is only the data file.
The total is about 400 MB including textures/maps.
What I've heard is that Windows can't do any better beyond 2 GB - is that true?
You can also change the setting in Task Manager, but when I try to change it, a warning appears.
I can't believe that with 2 GB of RAM I still have problems rendering 400 MB.

malaclypse
12-27-2004, 01:17 PM
Maybe that 400MB file is in a compressed form, and are you sure the textures are IN the file?

alex

simoncheng
12-27-2004, 01:21 PM
Yeah, I'm sure about that.
This really troubles me. Has this happened to you before?
Is there any setting I can change?

imashination
12-27-2004, 02:28 PM
The scene file size will not equal the RAM needed. For example, the file could store an object as a simple text description:

Sphere, 1000 divisions, coordinates 2333, 5666, 4222

This takes a few bytes of storage. But at render time it needs to be turned into a huge matrix of points and polygons.

Without much effort you can make a 1 meg file which needs 1 gig to render.
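To put a number on that (a back-of-the-envelope sketch with assumed per-vertex and per-face sizes, not how any particular package stores its data): a parametric sphere description is a handful of bytes, but once tessellated the renderer has to hold every vertex and polygon.

# Hypothetical memory footprint of tessellating the sphere described above.
divisions = 1000                   # "1000 divisions" from the scene file

vertices = (divisions + 1) ** 2    # simple lat/long tessellation
faces = divisions ** 2

bytes_per_vertex = 12 + 12 + 8     # position + normal + UV (floats)
bytes_per_face = 4 * 4             # four 32-bit vertex indices per quad

total_bytes = vertices * bytes_per_vertex + faces * bytes_per_face
print(f"{total_bytes / 1024**2:.0f} MB")   # ~46 MB for one sphere

The text line describing that sphere is around 40 bytes, so the in-memory mesh is roughly a million times larger - which is exactly the gap being described here.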

SOPLAND
12-27-2004, 03:12 PM
Someone who would know better than anyone once told me if you needed more than 2 gigs to render anything you were doing something wrong and needed to optimize your scene. It's very easy to go over 2 Gigs of memory but a decent renderer has features that will allow you to render scenes that require many many gigs of memory.

Use a renderer that renders in buckets. You can set the bucket size really small and the renderer will manage the image in small chunks. You can also split the final image into small strips as it's rendered and stitch them together later.

Use a renderer that supports deferred file loading. Some renderers have a setting that will not load objects into memory until the renderer hits their bounding box.

Use a renderer that supports some sort of block-ordered, mip-mapped texture format. This will create LOD textures (which also improve image quality), and the renderer will not load entire textures, just the chunk of the texture required at any given time.
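Some rough arithmetic on why buckets and tiled, mip-mapped textures help so much; the bucket and tile sizes here are just illustrative example values, not any renderer's defaults.

# Illustrative comparison: peak image/texture memory with and without
# bucket rendering and tiled mip-mapped textures (8-bit RGBA assumed).

MB = 1024 ** 2
BPP = 4  # bytes per pixel

def image_mb(w, h):
    return w * h * BPP / MB

# 1) Buckets: shade one 64x64 bucket at a time and flush finished buckets
#    instead of holding the whole 5120x3840 frame's shading state at once.
print(f"full frame: {image_mb(5120, 3840):.1f} MB, one bucket: {image_mb(64, 64):.3f} MB")

# 2) Mip-mapped, tiled textures: the full chain costs about 1/3 extra on disk,
#    but the renderer only pages in the level that matches the object's size
#    on screen, one small tile at a time.
base = image_mb(4096, 4096)          # full-resolution map -> 64 MB
mip_chain = base * 4 / 3             # ~85 MB on disk
needed_level = image_mb(256, 256)    # object covers ~256 pixels on screen -> 0.25 MB
print(f"full texture: {base:.0f} MB, mip level actually needed: {needed_level:.2f} MB")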

imashination
12-27-2004, 03:31 PM
Someone who would know better than anyone once told me if you needed more than 2 gigs to render anything you were doing something wrong and needed to optimize your scene.

Generally, yes. Of course there are times when what you want to do isn't technically possible in under 2 gigs, but in almost all cases when a new guy says he doesn't have enough memory, he is doing something very, very wrong.

Gigantic shadow maps which aren't needed
Using multi-layer TIFF files as textures
Using silly resolution images for tiny scene elements (see the sketch below)
Leaving in many models which are hidden anyway
Way over the top subdivisions on models

I come across these a lot.
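As a rough illustration of the third point above (my own back-of-the-envelope example, not a rule from any renderer's documentation): the useful texture resolution for an element is bounded by how many pixels it actually covers in the final frame.

# Hypothetical example: a prop that never covers more than ~200 pixels
# on screen gains nothing from a 4096x4096 texture.
import math

def max_useful_texture(pixels_on_screen):
    # Smallest power-of-two texture that still gives ~1 texel per screen pixel.
    return 2 ** math.ceil(math.log2(max(pixels_on_screen, 1)))

MB = 1024 ** 2
screen_coverage = 200                              # prop's largest on-screen size, in pixels
needed = max_useful_texture(screen_coverage)       # -> 256
wasted = (4096 * 4096 - needed * needed) * 4 / MB  # 32-bit texture

print(f"needed: {needed}x{needed}, memory wasted by a 4096 map: {wasted:.0f} MB")
# needed: 256x256, memory wasted by a 4096 map: 64 MB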

greyface
12-28-2004, 04:35 AM
Hmmm, nice stuff. I think that before getting more RAM, you should really try to fix your scenes. After much camera repositioning you may find that a lot of the objects are much further away than you anticipated, and therefore you can reduce their polycount if they are SubDs or NURBS.

I suppose graphics card video memory is used only while viewing your scene in the viewports?

WhItE CoFfEe
12-28-2004, 08:51 AM
"I suppose graphics card video memory is used only while viewing your scene in the viewports?"
That's what I've heard from somewhere else.

Texlon
03-03-2005, 03:20 PM
"I suppose graphics card video memory is used only while viewing your scene in the viewports?"
Of course! The graphics card has nothing to do with the rendering at all; it only displays the objects in the viewport - unless of course you've got a special rendering card that's particularly built for that purpose.

twizler191
03-03-2005, 06:10 PM
Raytracing in light setups also eats up a lot of memory. Virtual memory can be increased to help with rendering problems due to low RAM; this is space on the hard drive that the computer uses as if it were RAM.

colintheys
03-04-2005, 04:49 AM
I just upgraded from 512 MB to 1 GB and I must say I'm disappointed that I notice almost no speed difference... at all. The only noticeable change on my system is that Photoshop doesn't get so mad when dealing with filmstrips. I'm using XSI and rendering some pretty complex models with lots of soft lights and hi-res maps. Eh.

Post #200!

JDex
03-04-2005, 05:00 AM
Well, you'll only get a speed improvement if the scenes you are rendering are already eating up all of your RAM and taxing the paging file. Think of it like this:

You have a car with a 10 gallon gas tank... it gets you to work each day.

If you drive 500 miles to work each day, a 30 Gallon tank would be much better for you, as you wouldn't have to stop to fill up on the way.

Of course, if you only have to drive 10 miles to work each day, the 30 Gallon tank isn't going to get you to work any faster.

Silly analogy, but it may help.

colintheys
03-04-2005, 05:09 AM
No, I think that's a good analogy. :) The annoying thing, though, is that I got more RAM because my system utility was telling me I was using something like 800 MB of RAM. I upgraded and now it says I'm using 1.3 GB for the same things. lol. Stupid Windows. Either that or Aida32 is just reading it out wrong. :/ But even if that's the case, Windows is still stupid. :)

JDex
03-04-2005, 05:10 AM
You won't get any argument from me on that.

erlik
03-04-2005, 06:08 AM
You might get an argument from me, though.

I assume you're talking about when you start programs? What is eating the memory then? There must be something. Is your virtual memory too big? Various unneeded processes?

stephen2002
03-04-2005, 12:27 PM
What system utility was this?

Windows (and other OSes such as Linux and Mac OS X) keeps a "disk cache" in RAM, where it stores basically everything that gets accessed, in the hope that if something was accessed once it might be accessed again - and if it is, it's suddenly a lot faster. This will essentially use up all of your RAM, all of the time. Of course, if an app actually wants memory, then Windows (or whatever your OS is) will give that memory up to the app, so you should not even notice that the cache exists.
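If you want to watch this yourself, here is a small sketch (my addition, using the third-party psutil package, which stephen2002 did not mention): on a busy system "free" memory looks tiny because of the cache, while "available" is what an application could actually claim.

# Requires: pip install psutil
# Shows why "free" RAM is misleading: the OS keeps a large disk cache,
# but that memory is still handed back to applications on demand.
import psutil

vm = psutil.virtual_memory()
MB = 1024 ** 2

print(f"total     : {vm.total / MB:8.0f} MB")
print(f"available : {vm.available / MB:8.0f} MB")  # what an app can really get
print(f"free      : {vm.free / MB:8.0f} MB")       # often much smaller than 'available'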

JamesBPM
03-04-2005, 05:34 PM
Thank you all for the info -very interesting :)

G_love
03-04-2005, 07:03 PM
Hello,

I'd appreciate any help with this; it has to do with not-enough-memory problems. I have a scene with around 2,300,000 polygons, some raytraced objects (windows and so on), some trees (RPC and SpeedTree 2), some cars (RPC and mesh), plus houses, lamps and other stuff - a classic group of houses with staffage. I used a sun light with area shadows and some other "environmental" lights just simulating indirect sunlight.

While building the scene everything worked fine, but in the last stage (when the polygon and RPC object count increased a lot) the rendering became very unstable. I kept getting the message "not enough memory for raytracer..." or Max simply crashed. This always happened when memory usage reached the physical maximum (I watched it in Task Manager), and Max didn't try to use any virtual memory.

The machine is a dual Xeon 3.2 GHz with 2 GB RAM and SATA striped RAID drives for system and swap - no obvious reason for problems, but there they are. The only way I could get the renderings done was to skip raytracing on the glass, use a shadow map on the sun, and retouch afterwards. But I think there must be some way to make Max use virtual memory...

thank you

ktamiola
03-04-2005, 07:16 PM
Hey laddie ... colintheys

"...he annoying thing is, tho, that i got more ram cause my system utility was telling me I was using like 800 meg of ram. I upgraded and now it says I'm using 1.3 gigs for the same things. lol. Stupid windows. Either that or Aida32 is just reading out wrong...."

With whole respect :

The way You have configured Your sys is stupid, not widows... I know people who can "compress" OS to 100 or less... And it kicks ass... If You run 200 services - like UPS, Secure Digital and even more sophisticated You should not be supprised Your sys is like old car... Run administrative tools or msconfig and see what's up...

Kamil

JDex
03-04-2005, 07:29 PM
Nah... I have my XP Pro install only using 102 MB of memory and it still sucks. I am hoping for Longhorn to be a better platform, but I'm not expecting much. If only Linux were worth using as a primary OS, or if OS X could run on faster hardware... Meh.

CGTalk Moderation
03-04-2005, 07:29 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.