Data Centers, 110VAC or 220VAC

Is there any benefit for servers in a data center to use 220 (or 208, whatever it is) volts instead of just regular 110VAC? Obviously lower amperage, but doesn’t the wattage balance out in the end, so the cost wouldn’t make much of a difference anyway?

It’s more efficient, and you can fit more power into a rack (with fewer circuits). However, you may need a step-down transformer for equipment that won’t run at 208V. If you want to run blades, you would probably need to step up to 3-phase power (which is an adventure in itself).

Keep in mind that you don’t want to load the circuit to more than 80% of its rated load so that you can accommodate spikes. Therefore you can fit a lot more onto a 30A 208V circuit than you can a 30A 110V circuit, with more headroom.
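To put rough numbers on that, here’s a quick back-of-the-envelope sketch (the 80% continuous-load derating and the 30A breaker size come from this thread; the rest is just watts = volts × amps, ignoring power factor and PDU losses):

```python
# Usable power on a 30 A circuit, derated to 80% for continuous load.
# Illustrative only; real provisioning also depends on power factor,
# PDU overhead, and local code.
breaker_amps = 30
derate = 0.80

for volts in (110, 208):
    usable_watts = volts * breaker_amps * derate
    print(f"{volts} V x {breaker_amps} A x {derate:.0%} = {usable_watts:,.0f} W usable")

# 110 V: 2,640 W   208 V: 4,992 W  -> nearly twice the power per circuit
```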

Do you have a project coming up that you’re planning for? Are you being given different prices (install and monthly run-rate costs) for 110V vs 208V?

They use 220, 221 - whatever it takes.

There are always things that get added here and there that need 208VAC (just the random IBM hardware), but for the last couple of years we’ve had the electricians run L21-20P connectors for those, since we can load up a rack with about 20 servers in some cases. But recently they’re asking if we can switch over to 208 for “cost savings” reasons.

The three HP blade chassis we have now even run on 110VAC; they just need special plugs for the power supplies, but I know I can get different power supplies for 208V or 3-phase. I’m just wondering how switching voltage would make that much of a difference?

It’s cheaper to wire an apartment building for 120/208 3-Ø because there is more power available for the same size wires. The service entrance will cost more, but you save on the conductors that are run out to the equipment; that would be the biggest reason for the electricians to want to switch. 120V will still be available in that type of installation.
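Here’s roughly what “more power for the same size wires” means, as a sketch. The 100A ampacity is a made-up number purely for comparison, and the loads are assumed balanced:

```python
# Compare total power per conductor for three common service types,
# assuming every conductor is good for the same current. Illustrative only.
import math

I = 100  # amps per conductor (hypothetical ampacity)

# Single-phase 120 V, 2-wire circuit: one hot + one neutral
p_1ph_120 = 120 * I                   # 12,000 W over 2 conductors

# Split-phase 120/240 V, 3-wire service: two hots + one neutral
p_split_240 = 240 * I                 # 24,000 W over 3 conductors

# Three-phase 120/208 V, 4-wire service: three hots + one neutral
p_3ph_208 = math.sqrt(3) * 208 * I    # ~36,000 W over 4 conductors

for label, p, wires in [("120 V 1-phase", p_1ph_120, 2),
                        ("120/240 V split-phase", p_split_240, 3),
                        ("120/208 V 3-phase", p_3ph_208, 4)]:
    print(f"{label}: {p:,.0f} W total, {p / wires:,.0f} W per conductor")
```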

So in terms of power consumption it is still the same, right? Watts = volts × amps. If a server uses 500W, that’s 110V × 4.5A, which is the same as 220V × 2.25A, right?

Mild hijack: why isn’t 3-phase 330VAC? Why the 208?

The power consumption of the server will be the same. The formula for single-phase power is volts × amps. The formula for three-phase power is volts × amps × 1.732 (the square root of three; I don’t have a square root symbol on my phone), but you would only feed one phase at a time to the equipment if it’s rated for 120 or 240. That factor of 1.732 is also why three-phase is 208V: 120V × 1.732 ≈ 208V. With single-phase you have a sine wave that reaches a peak, crosses zero, reaches the negative peak, and crosses zero again. With three phases the delivery is smoother because, while one phase is crossing zero, the other phases aren’t. In a 120/208 system you get 208V between any two hot wires and 120V between any hot and neutral, assuming you have a transformer with the neutral brought out, which you undoubtedly will in your case.
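If you want to see where the 1.732 comes from numerically, here’s a small sketch that just subtracts two 120V sine waves that are 120° apart (illustrative only):

```python
# Quick numeric check of why two 120 V legs that are 120 degrees apart
# give ~208 V between them (not 240 V).
import numpy as np

t = np.linspace(0, 1 / 60, 10_000, endpoint=False)   # one 60 Hz cycle
v_peak = 120 * np.sqrt(2)                             # RMS -> peak

phase_a = v_peak * np.sin(2 * np.pi * 60 * t)
phase_b = v_peak * np.sin(2 * np.pi * 60 * t - 2 * np.pi / 3)  # 120 deg behind

line_to_line = phase_a - phase_b      # voltage between the two hots

rms = np.sqrt(np.mean(line_to_line ** 2))
print(f"RMS line-to-line: {rms:.1f} V")     # ~207.8 V, i.e. 120 * sqrt(3)
print(f"120 * sqrt(3):    {120 * np.sqrt(3):.1f} V")
```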

Three-phase power is always mind-bending to me. One thing we’re seeing in our data center is that our phases are imbalanced at the racks and at the panel. What’s the impact if the phases are imbalanced?

The biggest problem I’m aware of is overloaded neutrals. What comes in must go out. In a balanced system the three hot conductors carry equal current and the neutral carries almost nothing; when the system is unbalanced, the neutral carries the difference and can get hot. When you put an ammeter on a neutral wire you can measure surprisingly large currents if there’s a problem. In theory the three ‘hot’ wires in a three-phase system cancel out and the neutral won’t carry anything, but if there are a lot of ‘noisy’ loads (ballasts for fluorescent lights, motors, etc.) then the neutral carries more and more current. In practice they often run a neutral for each ‘hot’ because that’s easier and cheaper than trying to balance the loads.
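A quick way to see how the leftover current ends up on the neutral: treat the three phase currents as phasors 120° apart and add them. The numbers below are made up, and it assumes simple resistive line-to-neutral loads (no harmonics, which is exactly the ‘noisy load’ caveat above):

```python
# Hypothetical per-phase RMS currents on a 120/208 V panel; whatever doesn't
# cancel between the phases returns on the neutral. Sketch only.
import cmath, math

i_a, i_b, i_c = 16.0, 12.0, 7.0   # amps (made-up numbers for illustration)

neutral = (i_a * cmath.exp(0j)
           + i_b * cmath.exp(-2j * math.pi / 3)
           + i_c * cmath.exp(+2j * math.pi / 3))

print(f"Neutral current: {abs(neutral):.1f} A")
# Perfectly balanced (16/16/16) would give ~0 A; the imbalance above gives ~7.8 A.
```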

So that would work if I’ve got 110V loads, but if my loads are all 208V, and I’m running more on A and B than C for every rack in my data center, is there anything I should be worried about back at the panel?

Not at the breaker panel (not to my knowledge at least). The hot wires are protected by the circuit breakers. If there were going to be a problem it would show up at the transformer; you would measure a large voltage drop on the heavily loaded phase or you might see heat issues.

A lot depends, though, on the types and configuration of the transformers used: wye (star) or delta or a combination of both, and the K-factor of the transformer. Also, there are single-phase 208V loads as well as three-phase 208V loads. If your phases aren’t imbalanced too much (some imbalance is very common) and you aren’t having any issues from it, then it’s good to be aware of but nothing to worry about.

Yes, but the loss due to the resistance of the wire is proportional to the square of the amperage (I²R). With a lower voltage you have to push more amps, which means more power is wasted as heat. Conversely, with fewer amps you can use smaller wires without fear of them burning up, which saves money. Both of those reasons are why electric companies use very high voltages until they get close to your house.
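To make the “more amps means more heat in the wire” point concrete, here’s the same load fed at the two voltages over the same run of wire. The load and wire resistance are made-up numbers; only the ratio matters, and it works out to (208/110)² ≈ 3.6 times less loss at the higher voltage:

```python
# Toy I^2 * R comparison for one load fed at two different voltages over
# identical wiring. Values are illustrative only.
load_w = 5000      # watts drawn by the rack (assumed)
r_wire = 0.05      # ohms round-trip for the branch circuit (assumed)

for volts in (110, 208):
    amps = load_w / volts
    loss = amps ** 2 * r_wire
    print(f"{volts} V: {amps:.1f} A, ~{loss:.0f} W lost in the wiring")

# 110 V: ~45.5 A, ~103 W lost; 208 V: ~24.0 A, ~29 W lost
```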