Why the Sam Hill don't I have DC in my house?

12 VDC, 24 VDC, and 120 VDC

Your proposal requires adding such an outlet to ALL locations In The Whole Wide World to make that practical. Considering we have 120 VAC, 240 VAC, 50/60 Hz and maybe a few others, and over 100 years of electricity has gotten us only this far towards standardization, your proposal just ain’t gonna happen.

Except that then you require one more voltage conversion inside the device, which means one more electronic circuit and more energy-conversion losses. Plus, what amount of power would the outlet supply? Unless it is huge, you cannot count on powering several devices. If it is huge, it will be expensive. The whole thing makes NO economic sense. An outlet box with a plug costs a couple of dollars, and a house can have several dozen. The added cost would be quite significant for something for which there is no felt need except by you.

You cannot be serious. I have had quite a few power supplies fail and I can guarantee a few would fail before the walls fail. Every failed power supply would mean opening the wall up. Not to mention that they would have to be rated for extreme fire safety, etc.

Again. How much power would each one handle? This is just beyond silly.

And, as I said, some devices use AC.

You are just showing your extremely limited knowledge of the topic. Many, if not most, external modems use AC. I have, right now on my desk, a voice modem and a router which work with AC.

You seem to think you have a simple solution to a serious problem. The rest of the world seems to think it is a non-problem for which you propose an expensive and inconvenient solution. But you can try selling your idea. I just do not think anyone will buy it.

There’s an illuminating (aargh) discussion at the New York Times blogsite about Con Edison cutting off the last DC service in NYC. They supplied DC power from 1882 to 2007 - at the end, only to old buildings, and even there, only to certain systems such as elevators.

I think you received a near-perfect response in the first reply:

Exactly, plus you’d have the expense of more wires which would have to be routed away from the AC lines to minimize noise issues.

That’s the way things are right now, they’re called wall-warts, problem solved.

I agree.

That is an interesting article. For what it’s worth, there are useful applications requiring high-voltage DC (HVDC) transmission. We have an HVDC station here for transmitting power from one power utility to another where the systems aren’t synchronized. Other applications are mentioned on the wiki page.

But just about every outlet can conceivably be used for electronic devices. There are even computers and other electronic devices designed to attach to refrigerators.

Not really. Are you saying we’d have AC-to-DC adapters for buildings without DC outlets? Why not just have them in all buildings?

No need for a cite, just walk around your house and touch every AC adapter. They’re all warm, to varying degrees. Even the ones not connected to a device.

Yes, assuming every country in the world agrees to a common standard. And that standard happens to be a good choice for the device you’re building.

Consider the power supply inside a desktop computer. Those are standardized. But computer chips haven’t standardized to those voltages, because using different voltages has advantages - lower voltage for lower power consumption, for example. All computer motherboards contain voltage regulators that take the supply voltage and convert it to the appropriate voltage for each chip.

I’m not sure what your idea of a “particularly inexpensive home” is, but around here you can get a decent house for $150,000. One tenth of one percent is $150. Do you honestly think you can get all outlets in a house wired for DC for under $150?

Some flashlights need specific voltages, notably LED flashlights - the LED is a semiconductor device and needs a specific voltage; you can’t just design an LED for an arbitrary voltage. So do many other types of devices.

No, he’d simply include a 12V-to-8V (or whatever) converter in the design. Which may well be more expensive than a 120V-to-8V converter.

If you recall your high school physics, if you use lower voltage to transmit the same power, you need more current. Which means thicker wires.
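To put rough numbers on that (the half-ohm wire run here is just an assumption for illustration):

```python
# Rough illustration with assumed numbers: delivering 100 W over the same
# wire at 120 V versus 12 V. Lower voltage means more current, and the
# resistive loss grows with the square of the current (P_loss = I^2 * R).

power = 100.0          # watts we want to deliver
wire_resistance = 0.5  # ohms, assumed round-trip resistance of the run

for voltage in (120.0, 12.0):
    current = power / voltage               # I = P / V
    loss = current ** 2 * wire_resistance   # P_loss = I^2 * R
    print(f"{voltage:5.0f} V -> {current:5.2f} A, {loss:6.2f} W lost in the wire")

# 120 V -> ~0.83 A and ~0.35 W lost; 12 V -> ~8.33 A and ~34.7 W lost.
# Same power delivered, roughly 100x the loss, unless you use much thicker wire.
```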

No, you can build tiny converters that are very efficient, if they’re designed for a small load. What’s really inefficient is choosing a power supply that’s not optimized for the power requirements. If the bedroom has a 400 W DC power supply in the wall, and it is only used to power a 1 W digital clock, that would typically be much less efficient than a 1 W brick.
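To put some made-up but plausible numbers on that (both loss figures below are assumptions, not measurements):

```python
# Illustrative only -- the loss figures are assumptions, not measurements.
clock_load = 1.0               # watts the clock actually needs

big_supply_idle_loss = 15.0    # assumed: a 400 W in-wall supply burning ~15 W just being on
small_brick_efficiency = 0.70  # assumed: a cheap 1 W wall brick at ~70% efficiency

big_supply_draw = clock_load + big_supply_idle_loss     # ~16 W pulled from the line
small_brick_draw = clock_load / small_brick_efficiency  # ~1.4 W pulled from the line

print(f"in-wall 400 W supply: ~{big_supply_draw:.1f} W drawn for a 1 W load")
print(f"dedicated 1 W brick:  ~{small_brick_draw:.1f} W drawn for a 1 W load")
```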

Wiring for DC would serve the same purpose - powering the same devices any AC outlet can power.

And easily replaced.

You still haven’t explained why this is a bad thing.

And many, if not most, of them contain DC-to-DC converters to obtain the desired DC voltage. They don’t just fill the device with components that run on 12 V.

It’s been touched upon that there is no one-size-fits-all DC voltage that could be a standard. For semiconductors, which are the heart of many electronic gadgets requiring ‘wall plugs’, the ‘common’ voltages are varied, constantly changing, and, even more importantly, trending downwards, which makes things even more complicated.

Some time ago ‘everything’ ran on 12 V. Then it was 5, and then 3.3. It is not uncommon now to have semiconductors running at several of 5, 3.3, 2.5, and 1.8, with 1.2, 1.0 and 0.8 on the horizon, if not here already.

Electronic devices are always going to have some form of internal (low-voltage) AC-to-DC/multi-DC conversion, and a ‘wall wart’ is a cost-effective and potentially energy-efficient way to deal with it.

I don’t necessarily see why he is confused. Transformers are often the first step in converting AC to DC. Step down the voltage (transformer), rectify it (diodes), smooth it (caps) and then regulate it with transistors or an IC regulator of some sort. It is a standard formula.

Somebody said something about LEDs needing a particular voltage, and I’m not sure where they got that. LEDs need a particular current. You can just use a resistor to set the current, regardless of the voltage.
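A minimal sketch of what that looks like in practice (the supply voltage and LED forward drop here are just example values):

```python
# Sizing the current-limiting resistor for an LED (example values assumed):
# R = (V_supply - V_forward) / I_desired

v_supply = 12.0    # supply voltage, volts -- could just as well be 5 V or 9 V
v_forward = 2.0    # typical forward drop of a red LED, volts
i_desired = 0.020  # 20 mA target current, amps

r = (v_supply - v_forward) / i_desired
print(f"use roughly a {r:.0f} ohm resistor")  # ~500 ohms; a 470 or 560 would do

# The LED sets its own forward voltage; the resistor sets the current,
# which is why the same LED can run from almost any supply voltage.
```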

All right, I’m done nitpicking. I want to say I agree with Jonathan’s point about designing for a few standard DC voltages. It can be done, just as easily as with 120VAC, but in each case, power has to be converted in some way.

I disagree, however, with his larger point about having DC outlets in the home. It really is a waste of money, as DC conversion is efficiently performed using wall warts and power supplies. The problem isn’t one of distribution; it is one of standardization. I’m really rooting for USB, or any standard for that matter. The issue isn’t too many wall warts; it is too many voltages/plugs/currents. Each device has its own characteristic power supply, and nothing is interchangeable, which is less than ideal.

Furthermore, I’m rooting for combined power/data interfaces. PLC, Power over Ethernet, USB… for some reason I find it extremely elegant when power and communications are carried over the same channel.

Or, you just use a resistor to make sure the voltage across the LED is correct. Depends on how you look at it.

I was just using LEDs as an example of something you can’t just design to an arbitrary voltage. Perhaps not the best example, as LEDs are ideally run with a constant-current source and not constant-voltage. But it’s still easier to design an LED light system if you’re not constrained to a particular voltage. For example, if all you have is 5V, it’s somewhat tricky to drive a 3.5V Cree LED. Off-the-shelf LED drivers often require input voltage to be at least 2V higher than the output voltage, and that’s because of the way they operate, not because they’re designed for some arbitrary input voltage.

I think they are just answering the wrong question. Any question like this really has two parts:

  1. Why didn’t they do this in the first place?
  2. Why don’t they start doing this?

For (1), the answer is almost invariably because the new technology was introduced slowly, and had to coexist with the old technology.

For (2), the answer is almost invariably because the costs are far higher than the gains, and the current situation is sufficient if not ideal.

Really, the place where this makes the most sense is giant server farms. All the machines in there are using 12 V DC or less, and one could eliminate the inefficiencies of all the power supplies in all of the servers if one just piped 12 V to every machine.

Still, it would take huge copper bus bars, and it might turn out that the I²R losses would make this actually less efficient than just sticking with AC.

I think you’re right about the resistive loss being significant. This article describes Google servers; they use servers installed in shipping containers, each container containing 1160 servers and using up to 250 kW total. At 12 V, that’s over 20,000 amps of current.
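For reference, the arithmetic behind that figure:

```python
# Using the figures quoted from the article: ~250 kW per container at 12 V
container_power = 250_000  # watts
bus_voltage = 12           # volts

current = container_power / bus_voltage
print(f"{current:,.0f} A")  # roughly 20,800 A of bus current
```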

Also, at least for the Google servers, they obviously achieve redundancy in software, not hardware - i.e. if one server fails, it doesn’t bring down the whole system, and you just have to replace that server. Each server is like one drive in a RAID array. So it makes sense to have an independent power supply in each server; if one fails, it won’t bring down the system. But if you have one 250kW supply in the container and it fails, it would bring down the whole cluster. So you’d need two, at least. Ever priced a 250kW power supply? Me neither, but I bet two of them (if not just one) would cost more than 1160 consumer-grade 230W power supplies.

No, this would not work. A better solution would be to have a general DC supply at a voltage of, say, 300 Vdc, and have each device step it down and stabilise it to whatever it requires.

The development of electronic devices has made switching power supplies possible and attractive. There has been huge development in SMPS in the last decade.

What we are definitely NOT going to see is electronics buried inside the walls where they are not accessible.

Consumer-grade PSUs have a terrible power factor, which would be absolutely unacceptable when using so many together. OTOH, any PSU which starts with a bridge rectifier and capacitor can be advantageously supplied with DC at the peak voltage of the AC (about 155 V in a 110 V system or 325 V in a 230 V system). If you have to supply several hundred consumer PSUs, then the most efficient way is to use 230 Vac PSUs and supply them at 325 Vdc. This maximises efficiency all around.
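Those peak figures follow straight from the RMS-to-peak relationship; a quick check:

```python
import math

# Peak of a sine wave is sqrt(2) times its RMS value
for v_rms in (110, 230):
    print(f"{v_rms} V RMS -> {v_rms * math.sqrt(2):.0f} V peak")

# 110 V RMS -> about 156 V peak; 230 V RMS -> about 325 V peak
```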

I agree with the sentiment in the OP, and though several posts point out great disadvantages, there are still opportunities to do something like this.

Consider, as a model, GFI receptacles. These things try to detect a ground fault (meaning the hot and neutral lines carry different currents, indicating some current must be following an alternative, faulty route to ground) and kill the current if they see one. The point is, they’re almost fist-sized and have electronics in them, but they still proved very practical to add to existing systems and replace if they fail. So, having electronics in the wall, at least in the receptacle box, isn’t that hard to swallow.

A similar size internal wart could take a custom cover plate and still provide the 120 VAC sockets too. I guess you’d buy a 5 V one or a 12 V one or whatever, and it would have a connector that was distinctive for the voltage, and you could order a cord to go from that to your laptop or whatever. Lots of little companies sell application specific cords in single quantity off of web sites. It isn’t hard to see this becoming available even with a tiny initial market.

Wall warts do waste energy even when not in use. Well, at least the ones I know and the ones other posters have tried feeling obviously do. But couldn’t a load detection feature turn off the more power hungry parts? In principle, and even in economic practice, they shouldn’t have to be wasteful.

Many things that use DC use whatever arbitrary voltage they happened to pick because they can. Many others can’t use any voltage but one very particular one, at least not with arbitrary ease. I agree picking standard voltages is partly futile, and distributing much power throughout a house this way is hard because of the wire resistance. Still, I don’t think it’s that easy an idea to dismiss.

Well, if you say so. Me, personally, were I to remodel my house to include DC outlets by some AC outlets, I’d probably skip the one behind the refrigerator. Wouldn’t seem to be much point. And the ones in the laundry room. And the garage. And probably in the kitchen… or at least most of them, anyway. I would estimate (although I haven’t counted) that fewer than one in ten wall outlets in my house has a power brick connected. I really don’t see why adding such an outlet to “ALL locations In The Whole Wide World” would be necessary for a practical benefit.

Never said it’d happen… whether or not something happens has only a correlative relationship to whether it’s a good idea. Nevertheless, I’d bet you’re wrong. Are you familiar with the Long Bets Foundation? I’d slap a few hundred on the proposition that within fifty years there will be a standard for in-wall direct current in the United States, if you’re game.

Or a device engineered to make use of standard voltages. Ever notice how a lot of devices manage to get by with +5V from a USB plug?

Um… can you please explain why an AC circuit which can handle having a dozen power bricks plugged into it could not provide the same amount of power through a single, unified converter? Most in-home AC circuits are 20A, and most electronics sip current to the tune of milliamps. Don’t need a whole lot of power to power a whole lot of electronics.
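A back-of-the-envelope illustration (the per-device wattage is just an assumption):

```python
# Back-of-the-envelope only; the per-device wattage is an assumption.
circuit_capacity = 120 * 20  # a 120 V, 20 A branch circuit: 2400 W available
typical_gadget = 10          # assumed ~10 W for a router, modem, or phone charger

print(circuit_capacity // typical_gadget)  # room for roughly 240 such loads
```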

Did you calculate the cost, or just make that up in hopes that nobody would call you on it?

And as for nobody feeling the need except for me, what can I say? I’m ahead of my time. The offer for the Long Bet is open to you as well.

I’m sorry, what?? I’ve owned many an external modem as well as many a router and without fail every single one of them ran on DC. Let’s hear some model numbers here. It’s right there on your desk, after all. Bear in mind that we are talking about the home here… consumer-grade electronics. If you really do have consumer-grade devices into which you plug an AC power cable, I’ll bet that the very first thing that happens to that AC power inside the box is that it gets converted to DC.

Not that it matters since I’m not talking about eliminating AC outlets, anyway.

Please read more carefully. If we’re not running DC current through the walls (and I agree, now, that this is probably not the best solution), we can still have DC outlets in the wall, just doing the conversion directly from the AC line at the outlet itself.

Oh really. Wall warts solve the problem, do they? You’ve never had a problem with wall warts? You’ve never had to daisy-chain three power strips in order to plug in all the wall warts? You don’t see anything wrong with having, say, three external hard drives on your desk, and each one needs a heavy-gauge power cord to carry 120 VAC to a power brick, each one needing to run all the way to the wall rather than being daisy-chained as low-voltage devices should be?

“Can conceivably” is a pretty low bar. I can conceivably desire to take a shower in my kitchen. That does not justify the cost of installing a shower there.

One would hope that we would have them in new construction, and in buildings whose owners chose to remodel. For existing construction where the owner did not feel like bearing the expense of remodeling, a standardized AC-to-DC powerstrip would be simple and inexpensive.

Sorry, I’d like something a little more scientific than that. Even the ones not connected to a device are part of a circuit to which devices are connected. Metal wire is an excellent conductor of heat as well as electricity.

I really, really would like to know why you guys are so obsessed with getting “all outlets” wired… but even held to that ridiculous standard, yes, absolutely. I’m sure of it. How much do you suppose an AC-to-DC converter costs, wholesale? How much less would they cost if they were being manufactured to standard specifications on a grand scale? How many do you think you’d need per house?

Eh, I’m bored. Let me know if anyone wants to take me up on the Long Bet.

If you want to daisy-chain computer peripherals, why would you even want a DC outlet on the wall? What you need is a computer interface standard that can also provide more power to the peripherals, so that the power supply inside the host computer can provide DC power to the peripherals. And I agree it would be nice if future versions of USB or FireWire could provide more power. Enough power to run a 3.5" HDD, for example.

But having a DC outlet on the wall would probably discourage such a move. It would encourage manufacturers to put two connectors on every peripheral: a DC cable that needs to connect to the wall (or a DC power strip), and a separate data cable.

Besides, most homes don’t have the problem you describe. A typical home computer installation consists of a desktop PC, monitor, printer, broadband modem, maybe a router, and maybe an external hard drive. Easily handled with one power strip. If you also have chargers for cell phone, digital camera, etc at the same location, that’s still one more power strip that can connect to the other outlet.

Fine, feel the AC adapter, then feel the power strip it’s plugged into. If the AC adapter feels warmer, that’s caused by the power wasted in the AC adapter.

Or just look at the various references listed on the Wikipedia page for standby power.

So let’s say you modify your house as you describe. Where are you going to get those devices and the necessary cables? All of the devices sold by my Wal-Mart come with AC power supplies, the DC end of them is not the same for all, and the voltages and current requirements are different.

Only if a significant number of homes (I would call a home one location, not a socket) have this kind of already-installed infrastructure will manufacturers produce devices to match. It’s a chicken/egg question.

You seem to be the only one who thinks this is a good idea. And maybe, one day, it will be. But for now, legacy appliances rule.

As a way of standardizing DC power connectors.

USB isn’t daisy-chainable (FireWire, for what it’s worth, is), and it’s probably a safe assumption that the various consortia had some reason not to be carrying more than an amp of current next to their 5GHz data lines.

They currently need no such encouragement to do exactly that. A standardized DC power connector would allow them to implement daisy-chaining just by adding an additional (extremely inexpensive) connector for power out.

Easily handled? Six devices, four of which probably have wall warts (and thus are likely to “double park” on a power strip) just for the computer, and then another power strip for other devices… you really don’t see a problem with having seven devices all run to the wall, and seven different bricks all doing essentially the same thing? It’s redundant.

Okay. Now plug an AC adapter into a wall, and don’t connect anything to the DC side of the adapter. Does it still get warm? If it does, it’s by a negligible amount, certainly not by enough to be reliably detected by touch.

Travel back in time with me a decade or so. So let’s say you modify your computer to add some of these newfangled USB ports. Where are you going to get those devices and the necessary cables that use them? All of the devices sold by my Wal-Mart use RS-232 or PS2 connectors. It’s a chicken-and-egg problem.

And yet USB was adopted and RS-232 and PS2 thankfully faded away. Any new standard begins with zero implementations and zero installations. But that’s the cool thing about standards: if you build it, they will come. I envision devices being manufactured using the standard power connector but shipping with a wall-wart to convert AC to that standard… just like early USB keyboards shipped with both USB and PS2 connectors. As adoption is driven, manufacturers can start leaving out the wall warts.