Nice article. This is a good intro, and I'm sure many people will have critiques about what they think are better practices.
It's a complicated subject that's more in the IT realm than CG creation, but many of us have begun to dip our toes into Linux as it's become more viable for smaller studios.
My only comment about the Linux setup is that while disabling SELinux is probably the simplest way to permanently avoid network-sharing issues, you can alternatively enter (as root):

setsebool -P samba_enable_home_dirs 1

to enable home directories to be shared, and

chcon -R -t samba_share_t /your-directory

These commands tell SELinux to allow those directories to be served by Samba while still letting SELinux do its job.
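One caveat with chcon: the label is lost if the filesystem is ever relabeled. A more permanent variant (assuming the package that provides semanage, e.g. policycoreutils-python, is installed) is something like:

```shell
# Record the labeling rule in SELinux policy so it survives a relabel,
# then apply it to the existing files.
semanage fcontext -a -t samba_share_t "/your-directory(/.*)?"
restorecon -Rv /your-directory
```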
Maybe I missed it, but was there any mention of how to configure a mounted directory so all your render nodes can grab textures, etc. from a server when rendering? That becomes fundamental once you have more than just a handful of machines.
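For what it's worth, a common approach (not from the article; the server name and paths here are made up) is an NFS export mounted at the same path on every node, so scene files resolve texture paths identically everywhere:

```shell
# On the server, export the project share read-only to the farm subnet
# by adding a line like this to /etc/exports:
#   /srv/projects  192.168.1.0/24(ro,no_subtree_check)
#
# On each render node, add a matching line to /etc/fstab:
#   fileserver:/srv/projects  /mnt/projects  nfs  ro,defaults,_netdev  0 0
#
# Then create the mount point and mount it:
sudo mkdir -p /mnt/projects
sudo mount /mnt/projects
```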
Unless I didn't see it, there's also no mention of tools that control multiple machines at once, like Cluster-SSH (CSSH), or Salt, which that one VFX IT guy mentions. IMO this is important for after you've deployed your machines and need to, for instance, uninstall an old version of Maya and install the latest service pack. No one wants to go through that process machine by machine while maintaining their render farm.
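As a sketch of what that looks like in practice (the hostnames, user, and the 'render*' minion pattern are hypothetical), here's the same query run farm-wide with parallel-ssh and with Salt:

```shell
# With parallel-ssh: run one command on every host listed in nodes.txt
# (one hostname per line), logging in as the "render" user and showing
# each host's output inline.
pssh -h nodes.txt -l render -i "rpm -qa | grep -i maya"

# With Salt: run the same command on every minion whose id matches
# the 'render*' glob.
salt 'render*' cmd.run 'rpm -qa | grep -i maya'
```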
Also, I just want to point out that there is a definite difference in renders between Linux and Windows, on mental ray at least. It's not always present, but if, for instance, you make use of really intensive AO shading, the sampling noise patterns differ between platforms. If the sampling is cranked beyond normal sane usage, though, I'm sure it's not very noticeable.
About a sixth of the article was dedicated to configuring Linux to put machines to sleep, but most people run their render farm 24/7 with multiple projects, test renders, etc. Anyway, I'm sure some people really want that feature, but I'd think most people set up a render farm because they don't have enough rendering performance, and the times when the farm isn't floored are rare.
Power and air conditioning weren't mentioned anywhere. Most CG artists have no idea how much power a rendering machine draws, how many machines each outlet circuit can handle, or how much A/C they need to cool them. It might even be worth mentioning a few of the companies that make USB temperature monitors that can send an email or phone call if the temperature rises above a given threshold in case the A/C fails.
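As a rough illustration with assumed numbers (a 15 A / 120 V circuit and a 450 W node; substitute your own figures), the back-of-envelope math looks like:

```shell
# A 15 A breaker at 120 V gives 1800 W, but continuous loads should
# stay at roughly 80% of the breaker rating.
breaker_amps=15
volts=120
node_watts=450                                      # assumed full-load draw of one node
circuit_watts=$(( breaker_amps * volts ))           # 1800 W
safe_watts=$(( circuit_watts * 80 / 100 ))          # 1440 W continuous budget
nodes_per_circuit=$(( safe_watts / node_watts ))    # 3 nodes
echo "$nodes_per_circuit nodes per circuit"
```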
Anyway, good article. I myself sure wouldn't have attempted to write it because of how deep the subject can go. I feel like I barely know enough about it to handle my own work with my own people with our relatively small 20-computer farm.