ThE_JacO, I’d be curious which microblades would even allow overclocking ;). Most blade setups I’ve ever seen run server motherboards with server CPUs that don’t support it, not to mention the warranty issues at stake with the vendor.
Dual Xeons are still at best 20-30% more power efficient than an overclocked single-CPU system once you factor in rendering times. Twenty dual-Xeon systems floored for a solid month would translate into roughly $100 a month in power savings versus a similar-performing overclocked render farm. Considering blades and rackmount servers usually cost 3-5x as much as an overclocked consumer-grade PC, the power savings won’t offset the hardware costs unless you plan to run those machines for 15+ years with no future upgrades.
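To get a feel for that break-even math, here’s a minimal sketch. Every number in it (wattages, electricity rate, box prices) is an illustrative assumption, not a measurement, so swap in your own wall-meter readings and quotes before drawing conclusions.

```python
# Rough break-even estimate: power savings of a dual-Xeon farm vs the hardware
# premium over an overclocked consumer-grade farm of similar throughput.
# All numbers below are illustrative assumptions -- substitute your own.

NODES = 20                      # render nodes in each hypothetical farm
OC_WATTS = 300                  # assumed wall draw per overclocked consumer box
XEON_WATTS = 240                # assumed wall draw per dual-Xeon box (~25% less)
KWH_RATE = 0.12                 # assumed electricity cost in $/kWh
OC_BOX_COST = 2_000             # assumed price of one overclocked consumer node
XEON_BOX_COST = 7_000           # assumed price of one dual-Xeon node (3-5x more)

HOURS_PER_MONTH = 24 * 30       # farm floored around the clock

def monthly_power_cost(watts_per_node: float) -> float:
    """Electricity cost for the whole farm running flat out for a month."""
    kwh = watts_per_node * NODES * HOURS_PER_MONTH / 1000
    return kwh * KWH_RATE

savings_per_month = monthly_power_cost(OC_WATTS) - monthly_power_cost(XEON_WATTS)
hardware_premium = (XEON_BOX_COST - OC_BOX_COST) * NODES

print(f"Monthly power savings: ${savings_per_month:,.0f}")
print(f"Hardware premium:      ${hardware_premium:,.0f}")
print(f"Break-even:            {hardware_premium / savings_per_month / 12:.1f} years")
```

With those placeholder figures the savings land near the $100/month mentioned above, and the break-even point sits decades out, which is the whole point: the power bill alone doesn’t justify the server-grade price tag.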
The physical space, warranty, and tech-support staffing issues are definitely a real concern, and they become more important as a farm gets larger.
Overclocking a small render farm isn’t recommended unless you know what you’re doing; it takes knowledge, testing, and refining, and you’re giving up thermal headroom. Any machine can lock up or restart if it runs too hot for too long. If you overclock high, you’ll have at most 10-13 degrees C of headroom before the CPU starts throttling, whereas Xeons at stock speeds give you 30-40 degrees C of headroom, so they can go a lot longer if your A/C unit breaks. It’s wise to invest in a USB temperature monitor with email/phone alerts so team members are notified if the room temperature goes above a threshold.
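As a sketch of what that alerting can look like in software, here’s a minimal hypothetical watchdog that polls a temperature reading and emails the team when it crosses a threshold. The read_room_temp_c() helper, the SMTP host, and the addresses are placeholders for whatever your USB sensor and mail setup actually provide; dedicated monitor hardware usually does all of this for you out of the box.

```python
# Minimal room-temperature watchdog sketch. The sensor read is a placeholder:
# most USB temperature dongles ship with a CLI or serial interface you would
# call there instead. The SMTP host and addresses are made-up examples.
import smtplib
import time
from email.message import EmailMessage

THRESHOLD_C = 30.0                   # alert if the server room goes above this
POLL_SECONDS = 60                    # how often to check
SMTP_HOST = "smtp.example.com"       # placeholder mail relay
ALERT_FROM = "renderfarm@example.com"
ALERT_TO = ["oncall@example.com"]    # team members to notify

def read_room_temp_c() -> float:
    """Placeholder: replace with your USB sensor's CLI/serial/API call."""
    raise NotImplementedError("hook up your temperature sensor here")

def send_alert(temp_c: float) -> None:
    """Email the team that the room has crossed the temperature threshold."""
    msg = EmailMessage()
    msg["Subject"] = f"Render farm room at {temp_c:.1f} C -- check the A/C"
    msg["From"] = ALERT_FROM
    msg["To"] = ", ".join(ALERT_TO)
    msg.set_content(f"Room temperature reached {temp_c:.1f} C "
                    f"(threshold {THRESHOLD_C:.1f} C).")
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    alerted = False
    while True:
        temp = read_room_temp_c()
        if temp > THRESHOLD_C and not alerted:
            send_alert(temp)         # alert once per excursion, not every poll
            alerted = True
        elif temp <= THRESHOLD_C:
            alerted = False          # reset once temps come back down
        time.sleep(POLL_SECONDS)
```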
I’ve been overclocking our small render farm for almost 10 years. I’ve had a few fans, motherboards, power supplies, hard drives, and sticks of RAM die over the years, but no thermonuclear meltdowns or melted components (contrary to a lot of the misconceptions). Meanwhile, I’ve seen professional servers suffer the exact same failures, just less often. Everything still hinges on the A/C and those little system fans, no matter what hardware it is.