Polygons Per Gig?


Is there a guide to how many polygons can be handled on a per gig basis?

Texture/Material free, raw poly scene.


As a basis, each polygon (quad or triangle - not considering n-gons) has 4 LONG indices into the vertex array - a triangle simply duplicates its last index. Each vertex is listed only once in the vertex array (ignoring split surfaces and other things) and contains three FLOAT values (x, y, z).

Each polygon is 4x4 bytes = 16 bytes
Each vertex is 3x4 bytes = 12 bytes

So, if you have a polygon object with 24576 (x16) polygons and 24578 (x12) points (a cube with 64x64x64 segments), it would occupy (generally speaking for the polygons and points):

393216+294936 = 688152 (688 KB)

Of course, in C4D, the Object Properties shows the memory as about 1440 KB (1.44 MB). That is because of Phong and UVW. Remove them and it shows as ca 672 KB.
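The arithmetic above can be sketched in a few lines of Python. The 6n²+2 point count for the segmented cube is inferred from the numbers quoted, and the function only counts raw geometry, not the Phong/UVW tags mentioned:

```python
# Back-of-envelope estimate of raw geometry memory, following the
# byte counts above (4 x LONG indices per polygon, 3 x FLOAT per point).
POLY_BYTES = 4 * 4   # 16 bytes per polygon
VERT_BYTES = 3 * 4   # 12 bytes per point

def object_bytes(polygons, points):
    """Raw index + coordinate storage only - no Phong, UVW, tags, caches."""
    return polygons * POLY_BYTES + points * VERT_BYTES

# Cube with 64 segments per edge: 6*n^2 polygons and 6*n^2 + 2 shared points.
n = 64
polys = 6 * n * n        # 24576
points = 6 * n * n + 2   # 24578
print(object_bytes(polys, points))  # 688152 bytes, ca. 688 KB
```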

To continue on to the actual problem of determining how many polygons can fit into 1 GB of memory. The problem, as can be seen, is that the numbers of points and polygons are somewhat independent. You can have N polygons each split off with their own points, or those polygons sharing points to some extent (think of a planar grid, or even a circle with a common center point). Generally, in a quad grid, almost all points are shared by 4 polygons (except the edge points). So, if we say naively that each polygon contributes one point, we get (where 1 GB = 2^30 bytes):

1073741824/(16+12) = 38347922 polygons (ca 38 million)

This would only work if nothing else occupied the 1GB of memory - which is unlikely. And C4D typically has several copies of the object floating about (caches and such).
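That naive per-gigabyte bound in code form (one shared point per polygon is, as noted above, an idealization):

```python
# Naive upper bound: each polygon "costs" its 16 bytes of indices
# plus one 12-byte shared point.
GIB = 2**30
BYTES_PER_POLY = 16 + 12
print(GIB // BYTES_PER_POLY)  # 38347922 -> ca. 38 million polygons per GB
```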


Curious then.

I’ve felt the amount was around 2 million per gig of RAM.

With 12 gigs I’m comfortably able to render 20 million polys with textures,
but I’m wondering how much further I can push it before running into trouble.


As you probably already suspect, textures, UVW mapping, and everything else occupy memory as well, including applications, services, libraries, the OS and so on. Even the hardware grabs areas of memory for work or swapping. The memory actually available to hold polygons is limited to what is free, and then you may need to cut that to a quarter or less to account for the undo buffer and various caches.

Let’s assume that out of the 12 GB, 2 GB is being used (overall by the computer) when C4D is running with nothing in a document yet. That leaves 10 GB of addressable space in a 64-bit environment. So, we calculate that we can get 10 x 38 million = 380 million polygons. But let’s be less naive and figure that the relationship between points and polygons isn’t 1:1. That might cut the value to, say, half = 190 million. Then we need to consider the undo buffer and caches, so we divide that by 4 = 47.5 million. Add in texturing and UVW coordinates (which occupy A LOT of space, since there are unique UVW points per polygon) and this drops again by maybe half = 23.75 million. Sounds close to what you estimate.
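That chain of discounts can be written out explicitly. Every factor here is a rough guess carried over from the paragraph above, not a measured value:

```python
# Successive discounts on the naive 38M-polys-per-GB figure.
free_gb = 12 - 2                # ~2 GB taken by the OS and other apps
polys = free_gb * 38_000_000    # naive: 380 million
polys /= 2                      # points aren't really 1:1 with polygons
polys /= 4                      # undo buffer and internal caches
polys /= 2                      # texturing and per-polygon UVW coordinates
print(polys)  # 23750000.0 -> ca. 23.75 million polys
```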


To sum up here, if you want to really push the number of polygons in a scene:

  • Remove as many running applications and processes (services) as possible. Turn off internet connectivity (and thereby virus protection - which uses a lot of memory!).
  • Optimise polygon objects to avoid point and polygon duplication and degeneracies.
  • Make procedural objects editable wherever possible! *
  • Lower or forego undos.
  • Don’t use UVW mapping if possible. This is probably not something easily attainable unless you can replicate textures using procedural planar or volume shaders with procedural mappings.
  • Smallest possible image textures (when UVW mapping is used).
  • Other similar optimizations.

You may be able to push your numbers up to 3 or 4 million polygons per 1GB memory.

  • * Added this because deformer caches, for instance, are big memory eaters. The less caching you can incur by going to editable polygon objects, the better.


Thanks, I’m an 8-year Cinema veteran. Salutes
I get the need for optimization and pulling unwanted tags etc, just wasn’t sure how all that poly data was handled by RAM. At the closing stages of this project, I may run through and just see what maps could be reduced, so I can throw a little more greenery in there.



Based on experience, with a textured and lit scene, you’ll probably get about 15 million polys in a scene with 3 gigs of RAM available for C4D (i.e. Windows 32-bit with the /3GB switch on a machine with 4 gigs installed) before it keels over and dies.

Texture heavy and it’ll be about 12 million polys.

So, to the original question, realistically about 5 million per gig, I’d say.
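For what it’s worth, the per-gig figures implied by those observations are just:

```python
# Empirical polys-per-GB from the scene sizes quoted above.
print(15_000_000 / 3)  # textured + lit: 5 million per GB
print(12_000_000 / 3)  # texture heavy:  4 million per GB
```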


Prior to switching to a 64-bit OS, I had many experiences where, even with the /3GB switch on, Cinema was long dead approaching the 5-6 million mark.

It had been a massive creative restriction, knowing my architectural work couldn’t have 100 medium-res trees or high detail in every area. I was very disgruntled, and felt stupid for having stayed within those restrictive bounds for so long, yet I had grown accustomed to spending many hours refining and reducing unnecessary calculation.

I definitely recall that figure in my head - “I’m approaching the dreaded 5 million polys again” was the common theme back when.


Keep in mind that C4D prefers polygons to objects, so if you had 100 trees but connected them into a single object, it would be more efficient than keeping them as separate objects.


I appreciate the tip, why is it more efficient this way?

I’m not one to piss and whine about missing features, but if Modo (relatively new and not nearly as flexible as Cinema) can have a true instancing system, I really do think it’s high time we got one of our own.


I’m at about 7 years myself. Mainly, I mentioned these ‘optimizations’ for the ‘general public’. It was basically understood that you already had the experience to understand and use these ideas. There was no intention to challenge your experience. :love:

Nonetheless, I hope that the overall process demonstrated why the ‘theoretical’ number of polygons per 1GB memory probably isn’t attainable without foregoing many other necessary features which must be considered.

The idea of grouping a bunch of objects into one object for better performance is an interesting C4D phenomenon. I don’t know why myself, but I have tested its veracity - and it does indeed increase performance (especially in rendering). :shrug:

Aside: There will come an interesting time when someone creates a polygon object whose vertex count reaches the maximum of a signed 32-bit index (2^31). It is a large number (over 2 billion), but the most advanced professional users pushing the envelope on expensive computers will probably approach it sooner than expected. One hopes that Maxon has the future in mind and changes the polygon indexing to 64-bit sometime before that happens. Ten years ago, how many people would not have cringed at the thought of million-polygon objects? Yet here we are pushing the number into the tens of millions!
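To put a number on that aside (treating 2^31 as the index ceiling, as above, and using the 12-byte-per-point figure from earlier):

```python
# Vertex count at the signed 32-bit index ceiling, and the raw
# coordinate storage that many points alone would need.
max_points = 2**31                  # 2147483648, over 2 billion
coord_gb = max_points * 12 / 2**30  # 3 FLOATs = 12 bytes per point
print(max_points, coord_gb)  # 24.0 GB of coordinates before any polygons
```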


I tested the following…

Having 4 gigs of RAM, using the /3GB switch.
Created a Landscape Object with 1000x1000 polygons = 1 million polys.
Copied it 9x so I have 10 million polys.
Set 1 light with soft shadows and added a 1024x1024 texture on every landscape.
For some reason, it doesn’t seem to matter whether there is just one light or an array of 15 lights?

It renders, but close to the “out of memory” message.
Having more than 10 million in the scene is not renderable! The file size for 10 million is already 1.38 GB, and even in batch rendering it won’t render.
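As a sanity check on that file size: the raw geometry is only a fraction of the 1.38 GB, assuming each Landscape is a 1000x1000 quad grid with 1001x1001 points (the rest would be UVW coordinates, Phong data, textures and scene overhead):

```python
# Raw geometry estimate for the 10-landscape test scene,
# using the 16-byte polygon / 12-byte point figures from above.
polys_per_obj = 1000 * 1000
points_per_obj = 1001 * 1001
raw = 10 * (polys_per_obj * 16 + points_per_obj * 12)
print(round(raw / 2**20))  # ca. 267 MB of raw geometry out of 1.38 GB
```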

Cheers djart73


If I do that here, 10x 1 million poly landscapes, soft shadow light, a single 1024 texture on each plane:

200 megs used loading c4d
1.2 gigs used when all items are in the scene
render the image and it jumps to 1.7 gigs in use

This still leaves another gig to work with on a 32-bit Windows system, or another 2 gigs on 32-bit Macs.


Only with /3GB set on Windows 32-bit systems, though - maybe that is the source of the discrepancy with the other user, djart73.



I didn’t mean to come across as sarcastic either, sorry for the confusion man.


He said he has it turned on, so I’m not sure where he’s losing his RAM.


Just a shot in the dark - do you think he may have one of those onboard video cards that takes video RAM from the system RAM? Something odd like that?


Ok, guys, I am here, you can talk to me in person :slight_smile:

So, I made some screenshots for you... maybe my RAM gets lost because I pressed "C" to convert the landscapes to real polygon objects.

So, my /3GB-switched system renders up to 2.93 GB; beyond this, the "out of memory" message appears.
With everything closed, my PC runs with a pretty low amount of MB in use, I think.

Image 2 is with C4D started.
Image 6 is the saved filesize.

But please take a look for yourself, maybe I’ve seen something totally wrong?! :-)

So I am not sure if it is possible to render scenes with more than 10 million polys on a /3GB-switched Windows XP based PC?

I think working with a /3GB-switched PC lets you work stably with up to 2-3 million polys. But that’s just my opinion.

Cheers djart73



There are variables beyond just the 10 million polys… so it could be other factors…

Sorry, I thought maybe mash was talking with you directly about it, which is why I replied to his post.

Will try to take a peek later if I can.



Is something like SPD on and causing additional RAM eating?

What is the texture? Just a flat color? (i.e. nothing hidden in that shader, right?)

Kinda hard to tell much more from the screenshots, really - I know I had issues with anything over 2.7 GB on my /3GB Windows 32-bit machine when I had that…