Why the Sam Hill don't I have DC in my house?

Would you like to review your math?

I will remind you of this phrase once you have reviewed your math.

Just to put some numbers out so that our readers can judge. (And, BTW, the formulas are V = I*R and W = V*I.)

#14 copper wire has a resistance of 8419 µΩ/m (and remember you have two wires, so you have to double the length). I have calculated some examples using 100’ and 200’ runs of wire (×2).

A 15 A circuit at 120 V will supply 1800 W, of which between 6 and 12% will be lost in the wire (columns A and A’):


#14            A       A’       B       B’       C       C’
L           100'     200'    100'     200'    100'     200'
Ohm          0.5     1.0      0.5     1.0      0.5     1.0
A             15      15       15      15      1.5     1.5
V            120     120       12      12       12      12
W           1800    1800      180     180       18      18
V drop       7.5    15.0      7.5    15.0      0.8     1.5
V drop %    6.3%   12.5%    62.5%  125.0%     6.3%   12.5%
Power lost  6.3%   12.5%    62.5%  125.0%     6.3%   12.5%
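The table's arithmetic is easy to reproduce. A minimal sketch, using the 8419 µΩ/m figure quoted above (the table rounds the loop resistance to 0.5 Ω, so the printed figures differ slightly in the last digit):

```python
# Voltage drop and % loss in a round-trip run of #14 copper wire.
R_PER_M = 8419e-6  # ohm per metre of #14 copper (figure from the post above)
FT_TO_M = 0.3048

def line_loss(run_ft, volts, amps):
    """Return (loop resistance, voltage drop, % of supply voltage lost)."""
    loop_m = 2 * run_ft * FT_TO_M        # out and back
    r = R_PER_M * loop_m
    v_drop = amps * r                    # V = I*R
    return r, v_drop, 100 * v_drop / volts

# Columns A, B, C of the table (100' runs):
for label, (ft, v, a) in {"A": (100, 120, 15),
                          "B": (100, 12, 15),
                          "C": (100, 12, 1.5)}.items():
    r, vd, pct = line_loss(ft, v, a)
    print(f"{label}: R={r:.2f} ohm, drop={vd:.1f} V ({pct:.1f}%)")
```

Note that the "Power lost" row necessarily equals the "V drop %" row: since the same current flows through wire and load, P_wire/P_total = I*R/V.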

At 12 V and 15 A, the power delivered is 180 W, of which between 62% and 100% is lost in the wire (a computed drop above 100% just means the wire cannot even carry 15 A from a 12 V source), and the voltage delivered to the load will be anywhere in the range between 0 and 12 V, depending on the load. Both efficiency and voltage regulation are dismally bad, to the point of being useless. (B, B’)

If we want to maintain efficiency and voltage regulation at the original levels, then we must cut the current back to 1.5 A, which means the maximum power delivered to the load would be 18 W. (C, C’)

So, in parallel with the 120 V circuit, we have added another circuit with the same cost in copper wire, effectively doubling the cost of the wires, plus the cost of the DC power supply. We have more than doubled the cost of the wiring, and all this to add another 18 W, which is 1% of the 1800 W the 120 V circuit can carry.

Does someone still think this is a good idea? Really? More than double the cost of the electric installation to add 1% capacity? With a DC system which can only deliver 18 W? Really? When a small brick type unit plugged into the AC can deliver anything you need up to several hundred watts? Really?

This is like shooting fish in a barrel.

This confuses me. I would have thought that a certain voltage was needed to produce an electron-hole pair, and that current would simply be related to the number of electron-hole pairs. I would have thought that without a certain voltage, you couldn’t make an electron-hole pair at all.

You’re correct as far as I know. You need to exceed the forward voltage of the diode for it to turn on (1.7–2.2 V in my limited experience with them). I think DrCube must have been referring to the current-limiting resistor you add to stay below the maximum forward current (IF max).

It seems Alex is still not convinced, but note that the trend in other countries has been to go UP in voltage, not down, in order to make power distribution cheaper and more efficient. Spain changed from 120 V to 230 V.

As they say: the good thing about standards is there are so many of them to choose from.

With DC switching circuits becoming more viable and efficient, I can see how DC could become an attractive alternative to AC. But the main constraint, power losses in the line, remains, which makes higher voltages more attractive. So my prediction is that if DC is ever used to distribute power in the home, it will be at a higher voltage than 120 V, possibly 230 Vdc or even 300 Vdc.

But there is no way to distribute electrical power efficiently inside the home at 12 V. Any boat or RV owner who deals with 12 V installations knows that. The mast of my boat is 50’ tall. Add the distance from the foot of the mast and multiply by two and you have 150’ of wire for the masthead light, in a situation where even a small percentage drop in voltage produces a dramatic drop in luminosity.
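A rough check of the masthead-light numbers. The 150' of wire and the #14 resistance figure come from this thread; the 10 W bulb rating and the approximate V^3.4 brightness law for incandescent lamps are my own illustrative assumptions:

```python
# Voltage drop over a 150' masthead-light run in #14 wire, and its
# effect on an (assumed) 10 W incandescent bulb's light output.
R_PER_M = 8419e-6                 # ohm/m, #14 copper (figure from above)
wire_m = 150 * 0.3048             # total out-and-back wire length
r_wire = R_PER_M * wire_m

bulb_w, v_supply = 10.0, 12.0     # assumed bulb rating, nominal supply
i = bulb_w / v_supply             # ~0.83 A at nominal voltage
v_drop = i * r_wire
v_bulb = v_supply - v_drop
# Incandescent light output scales roughly as V^3.4 (approximation).
brightness = (v_bulb / v_supply) ** 3.4

print(f"drop {v_drop:.2f} V -> bulb sees {v_bulb:.2f} V, "
      f"~{100*brightness:.0f}% of rated light output")
```

Even a drop of under 3% of the supply voltage costs nearly 10% of the light output, which is why 12 V boat wiring is so sensitive to run length.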

And that would cause another set of safety problems. :frowning: Arcing is a significant problem in home wiring, and it would be a worse problem with DC vs. AC.

I am not saying it will happen. I am saying that 230 Vdc is more likely to happen than 12 Vdc.
And note several things: in AC the peak voltage is 1.4 times the RMS, so for the same RMS voltage AC is more prone to arcing. With DC, peak = RMS.
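The peak-vs-RMS point is just arithmetic: a sine wave's peak is sqrt(2) (about 1.414) times its RMS value, while a steady DC voltage peaks at its own value. A one-liner to illustrate:

```python
import math

# Peak voltage of a sinusoidal AC supply for a given RMS value.
for v_rms in (120, 230):
    print(f"{v_rms} V RMS AC peaks at {v_rms * math.sqrt(2):.0f} V")
```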

Also take into account that 110 V systems are mostly unchanged from 80 years ago, whereas a new DC system would have everything designed ad hoc, so I cannot see this being a problem.

In any case, if we ever see central dc distributed around the house I predict it will not be 12 V. I cannot imagine it would be less than 110 V and more probably closer to 230.

Right. But with AC the arc is extinguished during the zero crossing, and the arc often does not return in the next cycle. It usually takes thousands of arcs with AC to build a thick-enough copper oxide bridge to produce a filament & glowing contact. So while it is still a problem with AC, the fact that there is a zero-crossing every half cycle helps to mitigate the problem. Not so with DC. There are no zero-crossings with DC, hence (all else being equal) it is much easier to sustain a DC arc than an AC arc.

That’s a good point too. My experience with DC breakers has been that it’s a good idea to keep some fuses in the circuit; the breakers fail often. DC switches, too, tend to break down a lot. There’s a reason DC drives and switches tend to be larger than AC devices for a given horsepower. I like the idea of power over Ethernet, though, and hope it becomes more popular, but I don’t envision DC power supplies in our walls the way the OP has described. EMI, tougher to interrupt safely, voltage drops: there are a lot of obstacles, and the electrical code is based on a lot of worst-case scenarios.

Well, I am no expert in big amps and big volts, so I would have to study the subject more in order to have a better-formed opinion. I think DC at 12 V is totally out of the question, while at voltages between 120 and 300 V it would be feasible, in that the issues could be overcome with relative ease by special designs. I think there would be certain advantages in using DC for certain applications, but whether these advantages outweigh the disadvantages is very doubtful.

I agree that the rat’s nest of power bricks and thick AC cords and strips under my PCs make no sense. What would make sense would be putting the AC-DC power supply inside the UPS and feeding DC from it to all the different components and peripherals.

When looking at the overall cost vs. overall benefit, I see no advantage to using DC in a home’s electrical power distribution system. I think it’s a dead issue.

This has drawbacks you may not realise. For instance, try using the 12 Vdc from the computer for the external speakers and you will get a ground loop. You need a separate power supply which is floating with respect to the computer.

Also, suppose you have a short in the speakers. Do you want that to shut down your computer?

Separate, floating, power supplies have definite advantages over a unified system.

Yes. You are most probably right.

Why is that? Honest question, I don’t follow. I see a simple parallel impedance loop but I’m probably missing something.

When the spec for my fantasy UPS gets designed, the PC voltage rails will not be shared to avoid such problems.

In that case you are pretty much just including a separate power supply in the computer case, which means it is inside and out of the way but takes away flexibility. No real gain.

Thank you for illustrating in clear terms why 12V distributed centrally across the house could not work.

I think the best proposal would be to distribute a fairly high voltage (240–480 V), as you say, across a house (and to it!), but downconverting (to 48 V or 12 V) at the level of the room or socket. High voltage at the socket, I’m afraid, is a no-go because it is too much of a pain to work with and design around.
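The case for distributing at high voltage and downconverting locally falls out of the earlier formulas: for the same delivered power over the same wire, current scales as 1/V, so the I²R wire loss scales as 1/V². A sketch using the 0.5 Ω loop resistance from the 100' #14 example and the 180 W load as illustrative numbers:

```python
# Wire loss vs distribution voltage for a fixed delivered power.
R = 0.5      # ohm, round-trip wire resistance (100' of #14, from above)
P = 180.0    # watts delivered to the load

for v in (12, 48, 240, 480):
    i = P / v                    # current needed at this voltage
    loss = i * i * R             # I^2 * R dissipated in the wire
    print(f"{v:>3} V: I = {i:5.2f} A, wire loss = {loss:7.2f} W "
          f"({100 * loss / P:.2f}% of load power)")
```

Going from 12 V to 48 V alone cuts the loss by a factor of 16, which is why the conversion step belongs at the socket rather than at the panel.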

Why couldn’t there be any sort of solid-state fuse, current-limiter, etc. in the loop?

But really, to get anywhere with the analysis of this proposal, we need to know the costs of the voltage converters and chips and fancy-pants mechanisms. On the one hand we have the contractors and electrician union members who shop at Home Depot and claim the simplest outlet will cost hundreds apiece. On the other hand we have Chinese manufacturers who can apparently build anything for a quarter, because it’s all built around a commodity 5-cent integrated circuit (that can do anything) plus a couple of caps and coils.

So while we know DC-DC converters are immensely compact,* reliable, and efficient, we are stuck at the question of, well, can we put them in every outlet or not?

(*sailor, you called me on the tiny module that supposedly could do 200W but in fact did 60W, but here’s another converter that can truly convert 300W in a cubic inch at 95% efficiency with 3.5 million hours MTBF. Btw, these modules also come in a version that turns 384VDC to 12VDC.)