4 Quad Xeon MoBo for Render Farm??

Old 03 March 2013   #16
You could easily get away with onboard video... or even with no video at all. I've built images for headless systems before; it's not too difficult. Look into DISM for Windows to build a custom unattended install for each machine. That way you have one base image for distribution and can configure per-machine details with a simple XML file. Once you get the hang of it, it's a breeze.
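In case it helps anyone, here's a rough sketch (Python, untested, and not the exact setup I described) of the "one base image, one small XML per machine" idea - it just stamps a unique computer name into an unattend-style answer file for each node. The element names follow the usual Microsoft-Windows-Shell-Setup / ComputerName pattern, but double-check them against the Windows ADK docs before trusting them:

```python
# Sketch only: generate one per-node answer file from a single template, so the
# same DISM-built base image can be stamped with a unique computer name.
# The XML fragment below is illustrative -- verify the schema (settings pass,
# component names, attributes) against the Windows ADK documentation.

UNATTEND_TEMPLATE = """<?xml version="1.0" encoding="utf-8"?>
<unattend xmlns="urn:schemas-microsoft-com:unattend">
  <settings pass="specialize">
    <component name="Microsoft-Windows-Shell-Setup"
               processorArchitecture="amd64"
               publicKeyToken="31bf3856ad364e35"
               language="neutral" versionScope="nonSxS">
      <ComputerName>{hostname}</ComputerName>
    </component>
  </settings>
</unattend>
"""

def write_node_unattends(prefix: str = "RENDER", count: int = 12) -> None:
    """Write one answer file per render node: RENDER01.xml, RENDER02.xml, ..."""
    for i in range(1, count + 1):
        hostname = f"{prefix}{i:02d}"
        with open(f"{hostname}.xml", "w", encoding="utf-8") as f:
            f.write(UNATTEND_TEMPLATE.format(hostname=hostname))

if __name__ == "__main__":
    write_node_unattends()
```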
__________________
-- LinkedIn Profile --
-- Blog --
-- Portfolio --
 
Old 03 March 2013   #17
Our render closet has its own air conditioner and dedicated power line, so the ambient temperature is always 70°F.

Our render nodes are:
Asus Sabertooth X79 motherboard - one of the better purpose-built overclocking boards
Obviously a 3930K
Noctua NH-D14 SE2011 cooler
Corsair Vengeance 4x4GB 1600MHz low-profile RAM (low profile is important to clear the Noctua cooler)
Cheap $30 GeForce 210. I wish the motherboard had onboard video :(
Cheap hard drive (or a USB 3.0 stick would do) for the OS
Corsair 200R case - fantastic case for render nodes BTW, really quick to build in
Corsair 750W Gold-certified power supply

A 550-650W unit would be fine. I prefer Seasonic power supplies, but the Corsair was $10 cheaper for the 750W and has a good reputation. IMO Gold certification is worth it since it lowers energy costs, produces less heat, and makes your air conditioning work less.
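To put a rough number on the efficiency claim - the load, efficiency, and electricity-price figures below are just assumptions for illustration, not measurements from these nodes:

```python
# Back-of-the-envelope sketch of why the Gold rating matters for a 24/7 render
# node. All numbers here are assumed placeholders, not measured values.

def yearly_wall_power_cost(dc_load_w, efficiency, price_per_kwh=0.12):
    """Cost per year of the power drawn at the wall for a constant 24/7 load."""
    wall_watts = dc_load_w / efficiency          # the PSU draws more than it delivers
    kwh_per_year = wall_watts * 24 * 365 / 1000
    return wall_watts, kwh_per_year * price_per_kwh

if __name__ == "__main__":
    load = 350  # assumed DC load of an overclocked node while rendering
    for label, eff in [("80 Plus Gold (~90% at mid load)", 0.90),
                       ("basic 80 Plus (~82%)", 0.82)]:
        watts, cost = yearly_wall_power_cost(load, eff)
        print(f"{label}: ~{watts:.0f} W at the wall, ~${cost:.0f}/year")
    # The difference leaves the PSU as extra heat the air conditioner must also remove.
```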

When everything is added up, it comes in at around $1400 per node without an OS.


I have a high-speed 110 CFM 120mm exhaust fan at the rear.
I have a normal 120mm exhaust fan out the top of the case in the rear position.
I have a high-speed 100 CFM 140mm intake fan on the top of the case in the front position - normally people might use this position as an exhaust, but I don't see the point since the CPU cooler is right there.
I place the graphics card in the 2nd PCI-E slot, away from the CPU.

Then I place a 120mm fan on top of the graphics card, angled at 45 degrees, blowing air down at the motherboard and the base of the CPU cooler. IMO this is a fairly critical fan placement. I zip-tie one of its corners to one of the Noctua's fan clips so it stays put.

Don't go with a liquid cooler for 24/7 overclocked rendering! Water pumps can fail; heatsinks can't. You won't be around to know the pump failed... and the pump software usually isn't Linux or Mac compatible. IMO liquid coolers should only be used on systems you're actively sitting in front of, or on high-end servers with 24/7 IT staff on duty. Even if a pump has good reliability, on an overclocked system the failure rate can be something like 20 or even 30% because the pump has to work hard all the time.

The other thing I do: I've had power supplies fail because their internal fan failed. I don't like the entire fate of the power supply, and thus the computer, hinging on that single fan, so I take an 80mm fan and zip-tie it to the rear of the power supply to act as a secondary backup exhaust fan. I put a fan guard on it if I have one and route the power cable in through an open PCI slot. I've had low-priority computers run for years on these tiny backup fans after the main fan in the power supply died. This is my form of power supply redundancy - a $2.50 fan.

No one here will probably be willing to do this, but I also take a little time to lap the CPUs and heatsinks, since they're not even close to flat. Lapping shaves a few degrees off, gives you more headroom for overclocking, and helps the chip last longer by letting it transfer heat to the heatsink quicker. If you have a large enough sheet of glass you can lap two CPUs at the same time, one in each hand, in about 40 minutes if you start with 320-grit sandpaper. It voids your CPU warranty and risks damaging the CPU if you're not careful. I always check that the CPU works before doing it.

I don't know if anyone else has noticed this, but CPU temperatures are a few degrees C lower in Linux than in Windows when rendering mental ray scenes, or even just sitting at idle.
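If you want to compare for yourself, here's a minimal Python sketch for logging temps on the Linux side via the kernel's hwmon sysfs files. The exact sensor names depend on your driver (coretemp on these Intel chips), so treat the paths as something to verify on your own machines:

```python
# Minimal sketch: poll CPU/board temperatures from /sys/class/hwmon and print
# them once a minute, so idle vs. render temps can be compared over time.
import glob
import time

def read_temps():
    temps = {}
    for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            name = open(f"{hwmon}/name").read().strip()
        except OSError:
            continue
        for path in glob.glob(f"{hwmon}/temp*_input"):
            try:
                # values are reported in millidegrees Celsius
                temps[f"{name}:{path.split('/')[-1]}"] = int(open(path).read()) / 1000.0
            except (OSError, ValueError):
                pass
    return temps

if __name__ == "__main__":
    while True:
        print(time.strftime("%H:%M:%S"), read_temps())
        time.sleep(60)
```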



If we didn't have to pay for all these expensive Maya MR batch and Smedge licenses, I would rather have gone with 2600K i7 machines in a mini ATX form factor, with smaller cases and less extreme overclocks. We'd have 3x as many machines and roughly 50% more CPU power for the same money, and the machines would be semi-disposable, coming in under $600 each. In the end the 3930K/socket 2011 platform was better for us since it lowers machine maintenance, physical footprint, and software licensing costs. It's also nice that our workstations are 3930Ks, so it makes render times more predictable.
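For anyone weighing the same tradeoff, here's a hedged little sketch of that math. Every number in it is a placeholder (the license figure especially, and the speed ratio just echoes the rough figures above), so plug in your own hardware quotes and license prices:

```python
# Sketch of the node-count vs. per-node-license tradeoff: cheap nodes win on raw
# hardware cost per unit of render power, but per-node licenses shift the math
# toward fewer, faster nodes. All figures below are hypothetical placeholders.

def farm_cost(node_hw_cost, node_speed, license_per_node, nodes):
    total_cost = nodes * (node_hw_cost + license_per_node)
    total_speed = nodes * node_speed
    return total_cost, total_speed, total_cost / total_speed

if __name__ == "__main__":
    # (hardware $, relative render speed, per-node license $) -- all hypothetical
    options = {
        "3930K node": (1400, 1.0, 800),
        "2600K node": (600, 0.5, 800),
    }
    for name, (hw, speed, lic) in options.items():
        cost, total, per_unit = farm_cost(hw, speed, lic, nodes=10)
        print(f"10x {name}: ${cost} total, {total:.1f} units of render speed, "
              f"${per_unit:.0f} per unit of speed")
```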

Last edited by sentry66 : 03 March 2013 at 01:32 AM.
 
Old 03 March 2013   #18
Great writeup sentry66, thank you.
I am not nearly as technically capable or informed as you are, so I'll have to re-read that post a few more times to take it all in.

The difference between my setup and yours, besides all those custom-made things you mentioned, is that I already have a 46U AV rack with 20U of room for all the necessary parts. (Unless Google is mistaken, Corsair 200R cases are tower style.)
4x 2U rackmount cases, 2U for a NAS, 1U for an Ethernet hub, and some spare room (whatever I may need, probably another UPS for starters).

But the main problem I see/will face is cooling the components in the rack.
The rack has only 4 fans on the top, 3 in, 1 out, and that's it.
With all the AV gear, it's already barely holding at 80°F with the fans running about 30% of the time.
The defining factor here is that, even though the rack is in my basement, the ambient temperature is never lower than 75°F.

I could get another rack (I'd prefer not to), let's say 22U, but the cooling would be the same: at the top, with a maximum of 4 fans, unless I build something custom.
Maybe I can put 4 fans on the back door, but then every time I want to unplug something I have to deal with the fans on the door (I might keep the power cables long, which would solve that problem).
But hot air rises, so fans on the back door will be less effective.

If you think I can solve the cooling problem (to a degree) with such a setup, I would rather use the rack I already have instead of getting a new one, but I'm afraid that since it's taller the fans will make less of a difference.

Btw, the only solution the rack company recommended is to get another 4 fans (3 in, 1 out) and put them directly on top of the highest heat source (which will be the PC cases).
But wouldn't that just mean I'll blow hot air at the component above the fans (say, the receiver) and blow the already-hot air in the rack (ambient 80°F) at the cases?! That seems pretty lame to me.

What do you recommend?
 
Old 03 March 2013   #19
Yeah, the Corsair 200R cases are regular aluminum tower cases, so they won't work in a rack and don't have the strength in the front for mounting.

Here's a link to the carts we stack the machines on:
http://www.homedepot.com/p/t/202361...tryId=202361002
I think their price is quite reasonable, though you have to assemble them. We can fit 12 machines on each cart, 3 rows of 4. I know it's not an "enterprise solution" and the IT guys will sometimes tease me, but the hell if I care. It works great for a small setup.

What I like about the carts is that I can move them around. Even in a small space, I can slowly rotate one to get to the back of a machine, or easily relocate it. Racks are generally stationary and heavy, with all the heavy-duty steel construction the cases have - and steel is also worse than aluminum at letting heat out.

I'm really the wrong person to ask for rack advice because I avoid racks at all costs unless space has become an issue and smaller-footprint machines are required. What I do know is that cooling in rack cases absolutely sucks most of the time unless you have those custom A/C enclosures the big render farms or supercomputing centers use. I'd have to see your setup to really understand it, but it sounds like you're probably right about recycling hot air into the other rack components.

With rack cases, usually the only intake and exhaust points are the front and rear of the case. Very few rack cases seem to have side intake or exhaust locations. Typically the majority of the fans are up front, creating a positive-pressure scenario, which pulls a lot more dust into the case and doesn't do well at physically getting the heat out.

Negative-pressure cases are better for 24/7 heavy computing, while positive-pressure setups are better for short-term heavy computing or gaming. Positive pressure will get the heat off a component quickly in the short term, but it usually creates more turbulence, which slowly bakes the ambient case temps.
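The pressure question really just comes down to total intake CFM vs. total exhaust CFM. Here's a toy sketch with made-up fan numbers - rated CFM ignores filters and grille restriction, so it's only a rough sanity check, not a real airflow model:

```python
# Toy sketch: sum the rated intake and exhaust CFM to see which way the case
# is biased. Fan numbers below are hypothetical examples.

def pressure_bias(intake_cfm, exhaust_cfm):
    diff = sum(intake_cfm) - sum(exhaust_cfm)
    if diff > 0:
        return f"positive pressure (+{diff} CFM): more dust, heat lingers in the case"
    if diff < 0:
        return f"negative pressure ({diff} CFM): heat pulled out, better for 24/7 loads"
    return "roughly balanced"

if __name__ == "__main__":
    # hypothetical node: one 100 CFM intake vs. 110 + 60 CFM of exhaust
    print(pressure_bias(intake_cfm=[100], exhaust_cfm=[110, 60]))
```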

The smaller rackmount machines have those crazy high-speed little fans that sound like loud dentist drills. When I first started working where I am, we had a bunch of that type of small rackmount machine and the sound would drive you nuts. I figured since we had a whole room to use, why not go with larger machines that can use larger fans that won't drive you insane?

So anyway, given those issues combined with the low profile of the units, I really don't think I'd recommend overclocking in rackmount cases, except maybe 4U ones with some good high-powered exhaust fans. Even 4U cases aren't tall enough for a Noctua NH-D14 SE2011 cooler, so you'd have to go with a weaker cooler. Given that normal i7s run slightly hotter than Xeons, IMO it could just be asking for trouble.

Last edited by sentry66 : 03 March 2013 at 01:43 AM.
 
Old 03 March 2013   #20
I know what you mean about the noise, especially in 1U cases - that's why I was gunning for 2U; at least they have slightly bigger fans (8cm vs 6cm, I think), so they spin slower and quieter!

And no, I don't think I'll overclock them, unless I invent some stable, constant nitrogen cooling technology.
Still, for best bang for the buck the 2600K is in the lead, but if I get 3x 3930K instead of 4x 2600K I'll be able to cut some expenses on all the other components and invest more in cooling. Meh, we'll see - I've got to learn about the heat differences between the 2600K and 3930K too.

Thank you for your help, mate. I'll study cooling solutions for taller racks a bit more and check back with my findings, if I can find any..

cheers..
 
Old 03 March 2013   #21
no problem, glad to help
 
Old 03 March 2013   #22
Thread automatically closed

This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.
 