Why the Sam Hill don't I have DC in my house?

BTW:

No, I don’t. Not that an idea’s popularity has much to do with its merit anyway, but even so, I’m not alone.

Right, and now the world is stuck with this weak DC output and cell companies are standardizing on it for charging. We’re also stuck with sub-par transfer rates until version 3. We’re stuck with silly non-keyed connectors. We’re stuck with all manner of connectors: mini, regular, A, B, 4-pin, 5-pin, etc. I prefer PS/2 mice and keyboards.

I think the ultimate problem with your proposed solution, ignoring its expense and that no one else seems to care about DC power, is that it might solve today’s problems but not the future’s. Revamping my home just to avoid today’s unsightly bricks is short-sighted. Tomorrow’s killer toys may run on 3 volts and we’ll all need bricks to step down from 12 to 3. Or they will use 48 V like the phone company uses today.

Ultimately, you can’t mandate voltages. If my company designs a low-power chip that runs at 3.3 V, then that’s what we’re running it off of, regardless of what’s in your home. It would be wasteful to run it at 12 V, or we’ll need an internal or external step-down to 3.3. Essentially, you’re dictating that all embedded devices, consumer electronics, appliances, etc. run on one voltage. That ain’t ever going to happen.

Some do. Many don’t. And, in any case, most of them still have internal conversions, both up and down. And those devices you are thinking about use minimal power. A mobile phone might use a couple of watts, which is nothing; it would be feasible to have a 5 V, 2 W supply, but that would not really be useful in general terms.

Because with the system we have now you buy only what you need, and specifically what you need, while with your system you buy twenty or fifty times more than you need, and it may not even be what you need. Plus, now, when an adapter fails you go and buy a new one without needing to do major home renovations.

You are quite mistaken. PC power supplies handle hundreds of watts. My (rather old by now) laptop uses 80 W; add speakers (which can be a few watts, or tens, or even hundreds), modems, routers, etc., and you can see that (1) the unit would be very large, (2) a tiny fraction would actually be used, and (3) they would be a constant source of problems and a constant power drain unless they were switched off.

Look, I do not know your qualifications but I am an electrical engineer who makes a living actually buying and selling electronic components. So if you have relevant cites showing that I am wrong we will all be happy to see them but up to this point you have not proven you have more credibility than anyone else here.

Ah, yes. I have met many people “ahead of their time” on the Internet. Many who had discovered free energy, perpetual motion, revolutionary inventions, impossible infinite encryptions, etc. The Internet is full of people like that.

See, you shoot your credibility to hell right there.

Many voice modems use 9 or 12 V AC. You can see the power supply in eBay items 320390462046, 270220681397, 200257608661, 260434337010 and many more.

In fact, right now I am in need of two 9 VAC, 800 mA supplies for two TP-Link WR340G+. Anyone have a couple for me?

So you lose all credibility when you do not know these things.

Just getting the house to pass inspection with a bunch of electronics buried in the walls would be a nightmare, if not impossible. And the resale value would suffer greatly. Who wants all that crazy stuff?

Look, I can see you are pretty invested in the idea, but you are not listening to a lot of people who, as far as I can tell, know more about these things than you do.

O.K., I admit I haven’t read all the posts. So I apologize if this has been mentioned.

Yea, lots of things run on DC. Including almost every “electronic” device. Here are two problems you must contend with:

1. The DC supply (or supplies) for an electronic device usually has a pretty tight tolerance, e.g. ± 0.5 V.

2. An unpredictable voltage drop exists on the power wires running through your house.

So even if you had (for example) a 12 VDC system in your house, the voltage at each outlet would be a function of the voltage drop along the wires, which is a function of the current. In other words, the voltage at each outlet would be difficult to regulate; unless you use #0000 wire, you can’t guarantee there will be 12 VDC ± 0.5 V at each outlet.
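To put a rough number on that drop, here is a back-of-envelope sketch. The wire gauge, run length, and load current are my own assumed figures for illustration; none of them come from the thread:

```python
# Back-of-envelope voltage drop on a hypothetical 12 VDC branch circuit.
# Assumed figures: a 15 m one-way run of 14 AWG copper (~8.3 milliohms
# per meter) feeding a 5 A load.

OHMS_PER_METER = 0.0083   # 14 AWG copper, approximate
one_way_m = 15            # distance from supply to outlet
current_a = 5.0           # load current

loop_r = OHMS_PER_METER * one_way_m * 2   # current flows out and back
drop_v = current_a * loop_r               # Ohm's law: V = I * R
outlet_v = 12.0 - drop_v

print(f"drop = {drop_v:.2f} V -> outlet sees {outlet_v:.2f} V")
```

Even this modest run drops well over the ± 0.5 V tolerance mentioned above, and the drop changes with whatever else is plugged in.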

So… a device running on a DC system would probably need to convert the DC voltage at the outlet to a nice, regulated, known DC voltage. This means electronic devices would need DC-to-DC converters. At that point, you’ll have to explain why your system is so much better than AC. :dubious:

As pointed out many times, USB is the closest thing we have to a standard DC power connector. And it became commonplace without needing DC outlets in walls.

I thought FireWire was daisy-chainable. USB is, uh, hubbable? Whatever, providing DC power on the same cable reduces clutter. Separate standardized DC cables, not so much.

And what exactly is the problem with having DC power close to the data line?

As I said earlier, each AC adapter is optimized for its peripheral, so it’s quite efficient. Having an oversized DC power supply in the wall is far more wasteful, especially if you also have DC power supplies on other outlets that aren’t used at all.

Many of them do. The AC adapter for my Altec Lansing speaker, for example, gets warm enough that I warm my feet on it in winter, even when the speaker is turned off.

Yes, if the problem it solves is worth the effort. I just don’t see it happening for DC wall outlets.

Again, you need to check your facts. Anyone can tell you that power supplies get warm even when not loaded. Power supplies are most efficient at their rated load and are very inefficient when lightly loaded or with no load. The energy wasted by having your power supplies unloaded or very lightly loaded would be phenomenal.

About 13 years ago I measured the efficiency and power dissipation (in the form of heat) vs. output load, on a power supply in our lab. The results:

  1. As the output power of the power supply increased, the efficiency of the power supply increased.

  2. As the output power of the power supply increased, the power dissipation from the power supply (in the form of heat) increased.

At the time I was somewhat confused at these results. But now they make sense to me.

Nonsense.

Once you have DC current, you can use a much smaller and more efficient circuit to convert it to any other voltage. (This 120 W DC-DC computer PSU runs off 12 V, outputs a whole bunch of voltages, and fits into the ATX socket!) It’s an ironic flipside to what used to be the advantage of AC.

Speaking of other cool power-related ideas: those weird, seemingly useless FireWire ports have a big advantage over USB. They can put out variable voltage (up to 30 V) at a few amps, and can power almost anything, even printers and monitors, all on their own. But there are licensing fees, and USB is just too entrenched.

Not necessarily.
Power supplies that meet Energy Star 2.0 specs have a no-load power consumption of 0.5 W - hardly “phenomenal.” This applies to all power supplies from 0-200 W rated output. From here.

There’s so much negativity on these forums. You guys would never see a good, sensible idea if it hit you in the mouth.

Of course, the impediments to getting it done are huge. If we were to implement that, I think we’d first try to get the whole world (or even the 90% that’s on 220V) to use the same damn socket. It’d be nice to say this idea could grow out of the USB standard (as more and more items are using it for power), but USB is too low-power (just a handful of watts).

That costs money, and even 0.5 W multiplied by millions of units adds up. A regular 110 V outlet consumes nothing. Also, what is the efficiency of such a device loaded at 2% or 5% of its rated capacity?
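For a sense of scale on that “adds up” point, here is a quick sketch. The unit count is my own round-number assumption, not a figure anyone in the thread cited:

```python
# Scale check on "0.5 W multiplied by millions of units adds up".
# Assumed figures: 10 million always-powered wall supplies, each
# drawing 0.5 W at no load, 24 hours a day.

units = 10_000_000
no_load_w = 0.5
hours_per_year = 24 * 365

total_w = units * no_load_w                     # continuous draw in watts
gwh_per_year = total_w * hours_per_year / 1e9   # watt-hours -> gigawatt-hours

print(f"{total_w / 1e6:.1f} MW continuous, about {gwh_per_year:.0f} GWh per year")
```

That is a continuous 5 MW load doing nothing at all, which is the sense in which a “good” 0.5 W no-load figure still matters in aggregate.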

And again, how many people want electronics buried in their walls? What do the building and electrical codes say about this?

There is zero demand for this. There is zero interest in the engineering and building industries for this. Is it a conspiracy? Or is it that it does not really make sense?

Why are you spouting such garbage? Just because you don’t see the value, don’t put words into the World’s mouth. A lot of devices are trying to standardize around USB +5VDC for power. China in fact passed a law requiring all cellphones to use it! (Because it recognizes the anticompetitive and inefficient practice of selling expensive, single-purpose wallwarts.) Hotels are putting USB into their walls. Clearly, you are wrong saying there is “zero demand.” Now cool the fuck down. Why can’t we discuss this idea without so much hostility?

Hostility? Oh the irony.

I know full well about China requiring USB chargers for phones and have posted about it several times here. In fact I probably was the first to post it and have repeated it several times.

WTF this has to do with what the OP is proposing I have no idea. Neither China nor anyone else is requiring DC in buildings or anything similar.

I’m not saying that this idea makes a lot of sense.
However, I don’t necessarily buy your idea that this would be some huge efficiency debacle. There are inefficiencies everywhere - right now they are distributed throughout the house, in all the zillions of power supplies. It’s possible that one centralized power supply would be more efficient - I don’t know, and without doing a usage study, it’s impossible to say.

The Chinese mandate is a bone-headed move. 500 mA at 5 V (about 2.5 W) is almost a useless amount of power. I can’t power an Xbox 360 with it, nor my laptop. I can’t power a bright light. I can’t power a space heater or a hair dryer. I can’t charge my laptop in any sane amount of time.

I was just at an airport full of USB charging slots, and not one was in use. I tried one out just to see if they work. So it was an airport full of USB chargers and one single guy using them: me.

I could also see a very simple hack here. I could toss a very small computer behind the wall, connect it to the USB port, and tell it to copy all your documents off your device when you plug in. Oh, and I’ll drop an exe, make it auto-run, and run my favorite trojans. Yeah, like I would trust the Chinese not to pull tricks like these.

The “demand” for this is borderline fictional. I guess it’s good in emergencies, but as an AC replacement it’s really a waste of dollars or yuan.

Now, wait a minute. As far as I know the Chinese government has not ordered the installation of USB supplies in any building. What they have done, AFAIK, is mandate that cell phones which use 3.6 V batteries normalize their charging voltage to 5 V and the phone connector to mini-USB, so that phone chargers would be compatible among phones. This seems pretty reasonable to me and I cannot see any downside.

It is very possible to truthfully assert that power supplies have maximum efficiency near or at their rated power and their worst efficiency when very lightly loaded. Suppose you mandate a 200 W DC power supply at each point. Most of them will have no load at all, or a very light load for their capacity. This would be much more inefficient than having power supplies sized for their intended use.

Plus all the other points which have also been mentioned.
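The light-load penalty can be sketched with a toy loss model. The fixed-loss and load-loss coefficients below are invented for illustration, not measurements of any real supply:

```python
# Toy model of converter efficiency vs. load, showing why a big supply
# idling at a few percent of its rating wastes most of what it draws.
# The 4 W fixed loss and k = 0.05 load-loss coefficient are made up.

def efficiency(p_out_w, p_rated_w=200.0, fixed_loss_w=4.0, k=0.05):
    """Efficiency = P_out / P_in, where P_in is P_out plus a fixed
    overhead loss plus a load-dependent (resistive-style) loss term."""
    p_in = p_out_w + fixed_loss_w + k * p_out_w ** 2 / p_rated_w
    return p_out_w / p_in

for load_w in (5, 50, 200):
    print(f"{load_w:3d} W load -> {efficiency(load_w) * 100:.0f}% efficient")
```

Under these assumed numbers the hypothetical 200 W supply is around 93% efficient near its rating but barely over 50% at a 5 W load, which is the shape of the curve being argued here.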

Yes, efficiency goes to zero at zero load, but that’s irrelevant. What’s important is the no-load power wasted by the power supply.

I agree. See my previous post.

What about a 200 W power supply loaded with 5W and a dismal efficiency?

Again, this is just academic. There are plenty of other reasons why this is not a viable idea.