AC vs. DC in power transmission

>I was talking to some EEs about running low-voltage DC power through a rack of computers to avoid having power supplies in each one. They told me that DC would lose too much power. Is that because of the low-voltage?

Yes, exactly. The current would be high, because a low-voltage system needs more current to deliver the same power. Therefore the voltage lost in any given wire would be greater than it would be at a lower current. And since you are starting out with low voltage, a given voltage loss is also a bigger fraction of the total. Computers can only use DC power within a narrow voltage range, so it would be hard to hit that window correctly (especially for the older 5 V main power, and even more so for the newer 3.3 V; the other voltages, like the ±12 V used for a few minor purposes, would be easier to make work).
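To put rough numbers on that argument, here is a small sketch (my own illustrative figures, not from the thread) showing how the same delivered power behaves over the same wire at different supply voltages:

```python
# Illustrative sketch: same power, same wire, different supply voltages.
# The wire resistance and loads below are assumed values for the example.

def line_loss(power_w, supply_v, wire_ohms):
    """Return (current, voltage drop in the wire, fraction of supply lost)."""
    current = power_w / supply_v      # P = V * I
    v_drop = current * wire_ohms      # Ohm's law applied to the wire
    return current, v_drop, v_drop / supply_v

wire = 0.05  # ohms -- a few metres of moderate-gauge cable (assumed)
for volts in (5.0, 24.0, 230.0):
    i, drop, frac = line_loss(300.0, volts, wire)
    print(f"{volts:6.1f} V: {i:6.1f} A, drop {drop:6.3f} V ({frac:.1%} of supply)")
```

At 5 V the wire eats a huge slice of the supply; at mains-like voltages the same wire loss is negligible, which is the whole point of transmitting at high voltage.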

OK, but why only “sort of”? Am I missing something?

It’s not because the voltage is low; it’s because the current is high. I know, two sides of the same power coin, but I’m pedantic like that.

This is a very common technique inside pieces of equipment. Each card is equipped with its own regulator, which steps a higher “bus” voltage down to the voltage required. Your bank of PCs could probably be powered off a 24 V DC bus, with each PC equipped with a regulator to generate the ±12 V, ±5 V, and +3.3 V rails they generally require. Whether this would be more efficient than using AC is debatable.
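“Debatable” can be made concrete with a back-of-the-envelope comparison. All the figures below are assumptions for illustration (a 300 W load, 0.05 Ω of bus wiring, a 90% efficient DC-DC regulator, and a typical ~85% efficient AC supply with negligible AC wiring loss):

```python
# Rough efficiency comparison (assumed numbers, not measurements):
# a shared 24 V DC bus feeding a per-PC DC-DC regulator, vs. a
# conventional AC supply in each PC.

def dc_bus_efficiency(load_w, bus_v, wire_ohms, regulator_eff):
    """Fraction of input power that reaches the load via the DC bus."""
    input_w = load_w / regulator_eff    # regulator draws more than it delivers
    current = input_w / bus_v
    wire_loss = current**2 * wire_ohms  # I^2 R loss in the bus wiring
    return load_w / (input_w + wire_loss)

print(f"24 V DC bus: {dc_bus_efficiency(300, 24, 0.05, 0.90):.1%}")
print(f"AC supply:   {0.85:.1%}")
```

With these numbers the two come out within a few percent of each other, and the answer flips depending on wiring length, bus voltage, and converter quality, which is exactly why it is debatable.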