Need advice about Xeon render blade please

Old 03 March 2013   #1
Need advice about Xeon render blade please

Hi everyone.

My studio is considering buying our very first render blades. My boss really wants to buy a Dell 6-blade rack, which I haven't gotten the specs for yet, so I've been asking a couple of other vendors. I've gotten some recommendations, most of them Xeon E3-1200 and E5-2600. Are those two good CPUs for rendering? Are there any other Xeon models I should be aware of?

Here are some of the links the vendor sent me
http://www.supermicro.com.tw/produc.../MicroCloud.cfm
http://www.supermicro.com.tw/products/nfo/FatTwin.cfm

I feel stupid, but I can't even tell whether each node uses a single or dual Xeon... any advice would be greatly appreciated.

thanks!

Last edited by Panupat : 03 March 2013 at 03:51 AM.
 
Old 03 March 2013   #2
Hey man,

Is there a reason you need things in such high density? You really pay a premium for those 1U blades. The 2U cases will hold regular ATX-spec parts and are much cheaper. The E5-2600 series are the dual-socket chips, and the E3-1200 series are the single-socket ones.
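
If you ever get shell access to a demo box and want to check for yourself, here's a quick sketch (Python, Linux-only; it just counts the distinct "physical id" entries in /proc/cpuinfo):

```python
# Count CPU sockets and logical processors on a Linux box.
# A rough sketch -- relies on /proc/cpuinfo, so Linux-only.
def cpu_topology(path="/proc/cpuinfo"):
    sockets, logical = set(), 0
    with open(path) as f:
        for line in f:
            if line.startswith("physical id"):
                sockets.add(line.split(":")[1].strip())
            elif line.startswith("processor"):
                logical += 1
    return len(sockets), logical

sockets, logical = cpu_topology()
print(f"{sockets} socket(s), {logical} logical CPU(s)")
# A dual E5-2600 node reports 2 sockets; an E3-1200 node reports 1.
```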

-AJ
__________________
 
Old 03 March 2013   #3
Ah, I wasn't aware of that. Thanks for the tip! Is the E5-2600 also available on ATX-sized boards?

Looking at this page, how can I tell if it's 1U or 2U?
http://www.supermicro.com.tw/produc...-F617R2-RT_.cfm

Last edited by Panupat : 03 March 2013 at 10:45 AM.
 
Old 03 March 2013   #4
Those are actually 8 two-socket systems in a 4U chassis. Currently that should be the highest packing density you can get.
Keep in mind that this is not actually a blade system, but 8 independent computers that only share a redundant PSU. There is no common backplane etc.
This kind of system is useful if you need many plain, independent systems in a tight package. For a render farm it should be quite good, provided you have the rack space to mount it. This system will be very loud and will need a climate-controlled environment.

Cheers
Björn
__________________
- www.bonkers.de -
The views expressed on this post are my personal opinions and do not represent the views of my employer.
 
Old 03 March 2013   #5
Following what Srek said: if you're at the point where you have to ask about such things, I wouldn't recommend going for such high density per square foot.

Buying components for a farm involves a lot more than just picking the right units.
What you can or can't afford in power, climate control, and floor space are important determining factors.

Don't you have a reseller around who's set up other farms before and whom you can rely on? A preferred provider, maybe.

You will also need the right network and storage bandwidth, so that a dense farm doesn't become more of a bottleneck than a help.
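
For a sense of scale, here's a rough back-of-envelope sketch; every number in it is a made-up example, not a measurement from any real farm:

```python
# Can a single GbE link feed the whole farm? All figures are assumptions.
nodes = 18
mb_per_frame_assets = 500      # MB of textures/caches each node pulls per frame (assumed)
frames_per_hour_per_node = 12  # ~5 minutes per frame (assumed)

farm_mb_per_sec = nodes * mb_per_frame_assets * frames_per_hour_per_node / 3600.0
gbe_mb_per_sec = 125           # 1 Gb/s is ~125 MB/s theoretical
print(f"Farm averages ~{farm_mb_per_sec:.0f} MB/s vs ~{gbe_mb_per_sec} MB/s on one GbE link")
# 18 * 500 * 12 / 3600 = 30 MB/s on average -- fine in theory, but frame-start
# bursts arrive together, which is why multiple NICs or link aggregation help.
```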
__________________
Come, Join the Cult http://www.cultofrig.com - Rigging from First Principles
 
Old 03 March 2013   #6
Originally Posted by Panupat: Ah, I wasn't aware of that. Thanks for the tip! Is the E5-2600 also available on ATX-sized boards?

Looking at this page, how can I tell if it's 1U or 2U?
http://www.supermicro.com.tw/produc...-F617R2-RT_.cfm


Hey man,

The ATX-sized parts are what you'll find in a normal desktop or a large, cheap server. They're bigger and cheaper than what you might find in a compact server. They make ATX motherboards that will hold a pair of E5-2600 chips.

-AJ
__________________
 
Old 03 March 2013   #7
Originally Posted by ThE_JacO: Don't you have a reseller around who's set up other farms before and you can rely on? A preferred provider maybe.

Thanks for your input, Jaco. That's the thing - there are only a few studios in Thailand with render blades (mostly Dell) and rarely anyone with a decent level of knowledge about them. The 3 other vendors I talked to knew nothing about 3D rendering, and I seriously doubt Dell's sales would know any better. I think my studio has already appointed a sales rep to come talk to us this week; I'll find out.

I guess I have some vague idea of what I should be going for... 6 nodes, 2U, dual-socket Xeon E5-2600, 24 GB of RAM each. On top of that, a file server with decent write speed and at least 3-4 NICs.

We'll probably need a climate-controlled environment no matter how dense the blades are. I almost melted on my 5-minute walk to lunch today.
 
Old 03 March 2013   #8
If you need only six render nodes, I'd look at what gives you the best bang for the buck. High-density server hardware isn't the best bang for the buck, but it offers benefits for large deployments. Forget the 1U enclosures and 6U rack unless you need the farm to be portable or you plan to scale it out to many nodes (dozens or hundreds).
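
A simple way to sanity-check bang-for-the-buck is cost per core. The prices below are placeholders purely for illustration; plug in real quotes:

```python
# Compare cost per core between a tower build and a dense 1U node.
# Prices are hypothetical -- get actual vendor quotes.
options = {
    "mid-tower, single E5-2650": {"price": 2500, "cores": 8},
    "1U dense, dual E5-2650":    {"price": 7000, "cores": 16},
}
for name, o in options.items():
    print(f"{name}: ${o['price'] / o['cores']:.0f} per core")
# If the dense node costs more than 2x the tower for 2x the cores, the towers
# win on pure price/performance -- density only pays off at scale.
```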
__________________
http://www.whenpicsfly.com
 
Old 03 March 2013   #9
Thanks Olsen. The most we'll ever have is probably 3 boxes, 18 nodes total.
 
Old 03 March 2013   #10
Originally Posted by Panupat: Thanks Olsen. The most we'll ever have is probably 3 boxes, 18 nodes total.


What software will you be using on the farm? Cinema 4D, Maya, After Effects, Nuke, etc.?
__________________
http://www.whenpicsfly.com
 
Old 03 March 2013   #11
Yeah, if you're only getting 6 blades (which are going to be expensive), I wouldn't bother with blades. Surely you have enough room in your studio for 6 mid-tower cases. With the money saved, you could probably get 8-10 non-blade machines of equal performance. Xeons are expensive either way, but you pay a premium for having them shrunk down into compact blade rackmounts.
 
Old 03 March 2013   #12
Originally Posted by Panupat: Thanks for your input, Jaco. That's the thing - there are only a few studios in Thailand with render blades (mostly Dell) and rarely anyone with a decent level of knowledge about them. The 3 other vendors I talked to knew nothing about 3D rendering, and I seriously doubt Dell's sales would know any better. I think my studio has already appointed a sales rep to come talk to us this week; I'll find out.

I guess I have some vague idea of what I should be going for... 6 nodes, 2U, dual-socket Xeon E5-2600, 24 GB of RAM each. On top of that, a file server with decent write speed and at least 3-4 NICs.

We'll probably need a climate-controlled environment no matter how dense the blades are. I almost melted on my 5-minute walk to lunch today.

Do you have space problems?
Or do you already have a climate-controlled area for things such as servers, with the racks already set up?

High density is rarely the way to go unless you tick both of the above boxes.

A taller, easier-to-cool setup with more straightforward networking might see you better off.

Also, what engines and requirements?
Once you get a wider array (many CPUs and cores), and a lot of it is dual or quad processors, you have to bear in mind that the cores in a node share its memory, and, guaranteed, you will at some point want to split jobs more granularly so you can have multiple running per CPU.

If your average job now can cap 24 GB on a workstation, then for a farm with dual-socket nodes seriously consider at the very least 32 GB, if not more.
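
As a rough sketch of the arithmetic (all figures hypothetical):

```python
# RAM per render slot on a dual-socket node running two jobs at once.
# Every number here is an example, not a recommendation.
ram_per_node_gb = 32   # proposed per-node RAM
slots_per_node = 2     # e.g. one render job per CPU socket
os_overhead_gb = 4     # OS, render client, file cache

per_slot_gb = (ram_per_node_gb - os_overhead_gb) / slots_per_node
print(f"Each render slot gets about {per_slot_gb:.0f} GB")
# (32 - 4) / 2 = 14 GB per slot -- already tighter than the 24 GB a single
# job can use on a workstation, hence "32 GB at the very least".
```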

Wattage is also important.
How expensive is power there?
High density means lower power consumption per cycle, but a higher cooling cost per watt of heat produced (residual heat means the cooling can never be lazy, so it hardly ever cycles off).
If you have expensive power bills but an already under-capacity climate-controlled area, dense systems are great. If it's the opposite, you might want to go lower density with a taller, easier-to-cool rack.
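
Here's a rough running-cost sketch; every figure is a placeholder, so plug in your own node wattage, cooling overhead, and local rate:

```python
# Back-of-envelope running cost: IT power plus cooling overhead (PUE).
nodes = 6
watts_per_node = 400   # dual-socket node under full render load (assumed)
pue = 1.6              # power usage effectiveness: 1.0 would mean free cooling
hours_per_day = 24
price_per_kwh = 0.12   # hypothetical local rate, USD

kwh_per_day = nodes * watts_per_node * pue * hours_per_day / 1000.0
print(f"~{kwh_per_day:.0f} kWh/day, ~${kwh_per_day * price_per_kwh:.2f}/day")
# 6 * 400 W * 1.6 * 24 h = ~92 kWh/day, roughly $11/day at $0.12/kWh.
```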

Just keep these things in mind.

Other things, like the cost of licenses and licensing schemes, also contribute.

A farm doesn't finish costing you money once it's set up; that's where it starts. People often underestimate how much a poorly chosen one can add to your power and software bills.
__________________
Come, Join the Cult http://www.cultofrig.com - Rigging from First Principles
 
Old 03 March 2013   #13
Oh, so the RAM is shared? Do they share power supplies and NICs too?

@sentry66, @Jaco thanks for your replies, you really got me thinking. Apart from saving space, are there any major pros to going with blades? We have no space problem. Actually, we have lots of space. And since our country is really hot, our room is already air-conditioned.

Would an air-conditioned room be considered climate controlled? Is there more to it than that?

I'm thinking the farm would be used primarily for V-Ray; I believe V-Ray for Maya comes with 10 distributed/standalone licenses. Some machines may also run 3Delight and PRMan; it all depends on our investor.

Last edited by Panupat : 03 March 2013 at 02:01 AM.
 
Old 03 March 2013   #14
What they share depends on the build, the type, and so on.

The PSU is usually shared, in the sense that you only need one plug per chassis and internally it will draw and distribute what each node needs; but there are many offerings where you get two, and good blade systems often have redundant power supplies in case one fails (which is something you can put in any server-tailored case too, if you need to).

NICs depend: some systems have multiple regardless of the number of CPUs and motherboards hosted, some have a single plug and handle splitting it into multiple managed IDs, and some offer Fibre Channel too.

Climate control is about keeping the inside of the chassis at reasonable temperatures. If you have an air-conditioned room that hosts forty workstations comfortably, adding 10 more won't be a problem; whether you use them only for distributed rendering or seat someone in front of them matters absolutely nothing. If they don't overheat, they will keep churning out frames. That's the whole extent of climate control.
With racks you have to be more specific and careful, because it's a lot of heat in a small space, but the principles don't change.
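
If you want to keep an eye on node temperatures, even something this simple does the job; a sketch assuming the psutil library on a Linux box that exposes its sensors:

```python
# Print every temperature sensor on this node and flag any over its threshold.
# Assumes psutil is installed and the OS exposes sensors (Linux, typically).
import psutil

for chip, readings in psutil.sensors_temperatures().items():
    for r in readings:
        flag = "  <-- HOT" if r.high and r.current >= r.high else ""
        print(f"{chip} {r.label or 'temp'}: {r.current:.0f} C{flag}")
```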

From a power and running-costs point of view, workstations are very rarely advantageous over more compact solutions.
They will be cheaper in terms of casing and management, but they usually aren't as optimized for heat and power draw as blades can be. That doesn't mean they aren't an option, though. Again, space and power are what separate a computational centre from a bunch of workstations.

There is nothing magic about the hardware inside blades. If it can fit in one, the equivalent can fit in a tower case if you prefer that.

That's why I was stressing those points.
People think of a render farm as if it's some sort of magical, abstract entity... It's not; it's just a bunch of computers, end of story.

Power, space, and computational needs and constraints dictate whether you need one in racks or whether you can pile up some cases on a desk.
It's all about logistics.
__________________
Come, Join the Cult http://www.cultofrig.com - Rigging from First Principles

Last edited by ThE_JacO : 03 March 2013 at 03:09 AM.
 
Old 03 March 2013   #15
Thanks Jaco. Yes, you're right - at first everyone at my studio (me included) thought a render farm was something specialized. I'm relieved to know that each blade is just like a regular PC.

How much faster would a Xeon E5-2690 be than an E5-2650? The CPU alone costs almost twice as much.
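
I tried a rough back-of-envelope myself using the published list specs (please correct me if these numbers are off):

```python
# Throughput-per-dollar guess from list specs: cores x base clock as a crude
# proxy for render throughput. Prices are approximate launch list prices.
chips = {"E5-2650": (8, 2.0, 1100), "E5-2690": (8, 2.9, 2060)}
for name, (cores, ghz, usd) in chips.items():
    print(f"{name}: {cores * ghz:.1f} core-GHz for ${usd} "
          f"(${usd / (cores * ghz):.0f} per core-GHz)")
# ~45% more throughput for ~87% more money -- on a farm, where you can simply
# add nodes, the cheaper chip usually wins.
```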

Last edited by Panupat : 03 March 2013 at 11:43 PM.
 