DC Adapters

chukhung, you obviously do not have a clue. You are talking of thousands of dollars which would be mostly wasted. Do you really want to have dozens of outlets in every room? Or maybe all appliances would use one connector with many pins and use whatever ones they need? How much current would you provide for? You would need many, many amps available at every voltage, and the immense majority of all that installed power would go unused. To feed a computer you need different voltages: 5 V, 12 V, -12 V, etc. My printer works at 8 V, which is hardly standard. My laptop at 18 V. How many voltages would you have? You would need to have the capacity for several hundred amps at each voltage. Wires would have to be humongous to avoid any significant voltage drop. There’s no way in the world this can even begin to make sense. It would mean a cost of thousands of dollars to save a few adapters. Can you show me just one source who knows what they are talking about and has proposed or defended such a scheme? An EE or an architect or even a contractor? I’d like to see it.

A bunch of wall outlets in various voltages isn’t practical, but there is one aspect of this that I wonder about (so did Douglas Adams - several years ago I saw an article by him on this topic):

Every damn thing you buy has its own adapter with its own silly connectors. There’s a fairly small number of output voltages involved (a few devices also seem to put only the transformer in the plug and the rectifier in the device so that the “wall wart” outputs stepped down AC, but most of these things output DC).

Logically, you should not have to keep a unique power dongle with each and every thing you own. They should be a standard thing you buy when you need one, like light bulbs. They could come in a small fixed set of voltages, with sufficient current ratings for most electronics. You could even come up with a variation in connectors on the basis of voltage to keep people from accidentally plugging in the wrong rating.

You CAN get universal DC adapter gadgets that allow you to set the voltage to a number of levels and are equipped with an octopus arrangement of various connectors. I have a couple lying around somewhere. It’s futile, though - I remember getting one for something like a portable CD player - it had a lovely selection of six or so plugs on it, and, of course, I found that the device manufacturer had used their ingenuity to invent another type of plug that the universal adapter didn’t have.

And it seems to be getting worse - devices are coming with all sorts of weird-ass plugs for their power dongles these days. If nothing else, you would think standardizing this would reduce manufacturing costs.

It costs about the same to manufacture them in quantity no matter what connector you put on the end. If you put a standard connector on it, then anyone can go to ye ol local electronics store and buy an adapter when it breaks. If you put a fancy shmancy non-standard connector on the end, it may cost you a few cents more to manufacture, but then they have to come back to you to buy a replacement (which you then gladly sell them for 3 times the cost of a standard wall wart so you can really rake in the dough).

I had a feeling this argument would come up. Maybe, but I somehow doubt that most electronics vendors sell many replacement wall warts. What MIGHT happen is that a lot of otherwise functioning small devices get pitched in the trash because the power dongle is broken or lost[sup]1[/sup].

OTOH, if these things were standardized they could REALLY reduce costs by packaging their device without a DC adapter, just like things are packaged with “batteries not included”[sup]2[/sup]. They would start to annoy us by packaging things with “requires 9V DC adapter”, requiring a trip to the corner store or stealing a 9V adapter from something else before you could use your new gadget.

[sup]1[/sup] - I think more of them are lost than broken. That CD player I mentioned was without its adapter because I was using it in an office where I left the adapter plugged in. I changed jobs, and forgot to unplug the adapter and take it with me.

[sup]2[/sup] - I’m aware that batteries also have a shelf life issue, which may be why they aren’t included.

Well, sometimes a manufacturer has good reason (besides his economic gain) to want you to use his adapter. I have a laptop computer here which has a pretty non-standard connector. Now, I know what I am doing and I know what I can plug in there but most people don’t and they would probably screw up. Start by classifying power supplies by voltage, then by amps, then by ripple, then by isolation and then by whatever other parameters you can think of and you have so many combinations you just cannot standardise every possible plug.

If my laptop had a standard plug you’d get too many idiots plugging all sorts of things in there to “see if it works”. Don’t forget most people do not have a clue of what they are doing or talking about. Look at the tone of the OP: rather than assuming there might be a reason behind the way things are, it sounds like it is asking why engineers are such morons.

I have a Fuji digital camera and rather than buy the $45 Fuji adapter I found a much cheaper one which works, but that is my risk. If you are Fuji and someone calls with a problem you can ask “did you use our adapter?” and when they say “no, I wanted to save a few bucks” you can laugh at them hysterically and say “screw you and the buck you saved, that will teach you a lesson for next time, we don’t support experiments, have a nice day”.

But in very broad and general terms, there is a convention for connectors, with smaller diameters recommended for lower voltages and larger ones for higher voltages. I think Radio Shucks has a chart of these recommendations which you can find online.

I have a whole shoe box of connectors I keep and I most often find what I need there.

And, just in time to make my point, here’s Johnny!

Sailor & engineer_comp_geek: You’re correct about some stuff, and off-base about other things.

First of all, Sailor is probably correct in that - right now - it would not be practical or financially advantageous to install two independent electrical distribution systems in your home (an AC system and a DC system). But the reason is not so much due to the extra wiring and outlets, but because there’s no standard in place, especially among manufacturers of electronic devices. More on this later…

I also believe Sailor & engineer_comp_geek are wrong with regard to the efficiency aspect. Not only would it be electrically efficient, but there would be other advantages as well (more later).

Hypothetical example: Let’s say I have a bunch of electronic devices that all require +/-12 VDC at modest current (say, each device drawing less than 10 amps). It would be more electrically efficient (but perhaps not more cost efficient, as mentioned above) to have a single +/-12 VDC switching power supply run all the equipment. It would be difficult to argue otherwise.
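For what it’s worth, here is a minimal back-of-envelope sketch of that efficiency claim. The device wattages and the efficiency figures are made-up assumptions for illustration only, not measurements of any real hardware:

```python
# Back-of-envelope comparison: total input power for several separate wall
# warts vs. one shared switching supply feeding the same loads.
# All numbers below are assumptions for illustration, not measured figures.

loads_watts = [5, 8, 12, 20, 30]     # hypothetical devices, all on +/-12 VDC

wall_wart_efficiency = 0.70          # assumed typical cheap adapter
central_smps_efficiency = 0.88       # assumed shared switch-mode supply

input_separate = sum(p / wall_wart_efficiency for p in loads_watts)
input_central = sum(loads_watts) / central_smps_efficiency

print(f"Total load:             {sum(loads_watts):.0f} W")
print(f"Separate adapters draw: {input_separate:.1f} W")
print(f"Single shared SMPS:     {input_central:.1f} W")
```

Real adapters vary wildly in efficiency, so treat the numbers as nothing more than a picture of where the argument comes from.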

Now some people (such as engineer_comp_geek) claim that “AC is more efficient to distribute than DC.” This is not necessarily true; it depends on what you’re talking about. When you’re looking at just the conductors, DC is more efficient than AC. Both AC and DC suffer equally from resistive losses, but AC has two additional losses: reactive losses and skin effect (when using very large conductors). Though both are small, they do exist. Even at 60 Hz.

But as we all know, the distribution grid is AC.

Why?

A little background… If you want to transfer electrical power from point A to point B, you want to use as high a voltage as practically possible, since (for a given power) the higher the voltage, the less cross-sectional area each conductor needs. (Note that this would be true for an AC or DC distribution system.) Of course, this means special insulation and voltage hold-off techniques must be used, but this is cheaper than using large gauge copper and aluminum conductors. So the power companies boost the voltage coming from the generator, transfer the power to your house using relatively small and lightweight conductors, and step the voltage down just before it gets to your house. At this point, the power company must make a decision… it knows it must distribute the power at a high voltage, but should it use AC or DC? As mentioned above, DC does have a couple things going for it: it doesn’t suffer from reactive losses or the skin effect. But there’s a very significant drawback to DC: it’s difficult, expensive, and inefficient to “step down” before it gets into your house. And since the generator at the power plant naturally produces AC, it’s also expensive and inefficient to convert it to DC and “step up” the voltage before it’s put on the grid. By contrast, these things are relatively easy and efficient to do with AC using transformers.
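To put some rough numbers on the “higher voltage means smaller conductors and lower losses” point, here is a quick sketch. The power level and line resistance are arbitrary assumptions chosen just to show the trend:

```python
# For a fixed power delivered over a fixed line resistance, the current
# (and therefore the I^2 * R loss) falls as the distribution voltage rises.
# The power level and resistance below are arbitrary illustrative values.

power_delivered = 10_000.0   # watts to deliver (assumed)
line_resistance = 1.0        # ohms, total round trip (assumed)

for volts in (120, 1_200, 12_000):
    amps = power_delivered / volts
    loss = amps ** 2 * line_resistance
    print(f"{volts:>6} V -> {amps:7.2f} A, line loss {loss:8.2f} W "
          f"({100 * loss / power_delivered:.2f}% of the delivered power)")
```

Each tenfold increase in voltage cuts the current by ten and the line loss by a factor of a hundred, for the same wire and the same delivered power.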

So to transfer lots of power over long distances using electricity, you pretty much have to use AC in the overall scheme. But we’re talking about something different here. Using a single switched-mode power supply to deliver DC power to lots of devices (each requiring the same voltage) would be more electrically efficient than making each device responsible for its own AC-to-DC conversion. There would also be other advantages:

  1. Devices would be smaller, lighter, and run cooler.
  2. Devices would be cheaper, not only because they would have fewer components, but because of “more lenient” safety and UL requirements.
  3. Devices would be safer (no nasty AC around).
  4. The absence of an AC transformer would mean less induced AC noise in sensitive circuitry.
  5. Better power factor. (The big switch-mode PS should have a front-end PF correction circuit.)

I think such a system would be very cool. But until an inexpensive home distribution system is developed, and until electronics manufacturers agree on voltage levels and interconnect standards, I don’t think we’ll be seeing a viable system anytime soon.

AC is more efficient for the simple reason that, being higher voltage, the amps are only a fraction. (And gimme a break with skin effect at 60 Hz or even at 600 Hz, let’s get real here)

Having 110V AC all over the house means you have the converter right where the load is. The converter can adapt to a wide range of input voltage so other loads on the line do not affect the output regulation. Having a central DC supply with equivalent regulation is close to impossible. You could not power a computer directly with such poor regulation, so you would need another local regulator, which kind of defeats the whole thing.

So let’s see an example: 12 Volt supply, 100 feet length of #16 AWG wire (200 ft round trip), 10 amps consumption (doesn’t even begin to power a computer)… Let’s see, R= 0.546 ohm, voltage drop 5.5 volts… hmmmmm… I don’t think that would work.

Alternative: supply 1 amp at 120 volts. Voltage drop: 0.55 volts, which is irrelevant as the power supply outputs the same 12 volts perfectly regulated. Losses on the line are 1/100th of what they were before (the current is a tenth, and I²R losses go as the square of the current).

Wanna size the 12 V DC line so the losses are much lower? Be my guest. You can find the information here http://www.zelax.ru/english/sprav/awg_multi.html Let me know when houses start being built like this, because I want to invest in copper mining.

And note that even if you size the 12 volt line ten times larger, the voltage drop on the line is still enough that a regulator is necessary at the load.

Next you have to standardise all loads to take 12 volts, which would be much more expensive and wasteful than any possible savings. In other words, it does not make sense no matter how you look at it.

I may have messed up. Let’s see if I get it right this time: 200 ft of #16 copper wire should be 0.8 ohms which at 10 amps would be a voltage drop of 8 volts so if you start out with 12 you end up with 4 volts and 2/3 of the power wasted on the transmission line… Anyone think that is a good idea?
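For anyone who wants to check the arithmetic, here is a small sketch of the same calculation. The 4.0 ohms per 1000 ft figure for #16 AWG copper is a nominal handbook value, and the 10 A load over 100 ft is the same hypothetical example as above:

```python
# Voltage drop and line loss for a DC feed over #16 AWG copper.
# The resistance per 1000 ft is a nominal room-temperature value.

OHMS_PER_1000FT_AWG16 = 4.0

def line_drop(load_amps, one_way_feet):
    resistance = OHMS_PER_1000FT_AWG16 * (2 * one_way_feet) / 1000.0  # round trip
    drop = load_amps * resistance
    loss = load_amps ** 2 * resistance
    return resistance, drop, loss

# 12 V at 10 A over 100 ft (200 ft round trip), as in the example above:
r, drop, loss = line_drop(10, 100)
print(f"12 V feed:  R = {r:.2f} ohm, drop = {drop:.1f} V, line loss = {loss:.0f} W")

# The same 120 W drawn as 1 A at 120 V over the same wire:
r, drop, loss = line_drop(1, 100)
print(f"120 V feed: R = {r:.2f} ohm, drop = {drop:.2f} V, line loss = {loss:.1f} W")
```

With the round-trip resistance of about 0.8 ohm, the 12 V feed loses 8 of its 12 volts at 10 A, while a 120 V feed carrying the same power loses well under a volt and dissipates about a hundredth of the power in the wire.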

I’m going to skip over this tangle of electrical engineering stuff to ask another question in the spirit of the OP:

Why can’t they put the power bricks in the MIDDLE of the cord, instead of at the end?[sup]*[/sup] They’re so wide you end up wasting outlets on power strips, and I have several bricks which are so heavy they won’t even stay in a wall outlet :mad:

[sup]* yes, I know it’s something to do with cost. Heaven forbid I’m actually able to use the thing.[/sup]

I have been watching this thread with interest, since I taught power supply theory many moons ago & repaired consumer electronics devices for over a decade. Having worked closely with OEMs & vendors I can say that there is certainly some validity to **engineer_comp_geek’s** “consumer loyalty” point. I know that I sold quite a few OEM power supplies at $30-$60 a pop, usually in cases when they were too far gone for me to fix at the component level (i.e. burnt to a crips). Service shops love this because there is a huge markup on parts & accessories.

But Yabob also has a point: people felt that $30 was too much for a replacement power supply for a two-year-old portable CD player that can be replaced outright for $49.99. You’ll either buy their power supply or you’ll buy a new product altogether. Either way, you end up buying something.

Sailor has raised many valid points about in-house DC power distribution, and I’d add to that the extra burden to the homeowner of having to purchase and maintain yet another major appliance that would probably be much more expensive than a heat pump: a power supply that would (probably) be the size of a refrigerator and get damn hot. This device would also be a hazard, because power supplies tend to catch fire when they fail. One this size might even explode, depending on what it’s made from and how it’s cooled. Then we have the fun of housewives changing fuses in 12 V, 100 amp circuits whenever they blow. Or did you want to call out a repairman every time a fuse blows?

A few reasons come to mind right away.

  1. They get hot, so it is not a good idea to have a power strip festooned with them. By placing the transformer casing at the plug end, it takes up power strip real estate and prevents you from plugging in too many in one tight spot.

  2. The weight of the transformer will cause the cord to sag, tugging at the connections on either end (possibly pulling them out of their sockets/jacks).

  3. You wouldn’t be able to run the cord under a carpet without an unsightly bump or behind a bookcase without moving it away from the wall four inches or so.

The right thing to do is replace the AC receptacle (it doesn’t grab the prongs properly), but you can apply a minor jury rig by spreading the prongs apart a tiny bit, or by twisting them ever so slightly so that they become less of an “easy fit”. But do this too much and it just makes the receptacle looser over time.

Apologies to both the bloods and the crips.

>> Why can’t they put the power bricks in the MIDDLE of the cord…??

They can and they do. All the laptops I have had were like that. But, as you guessed, cost is a major consideration and the wall wart type is cheaper. Also if plugged into a wall outlet they are off the floor and out of the way. Another consideration is that the cables tend to suffer most where they enter the box and this way you only have one rather than two.

It’s even worse than that by a factor of 2.

There’s no doubt you’d need Big Conductors[sup]TM[/sup] to pull it off. But it would be more electrically efficient, which was my primary point.

By the way, is “DC adapter” the right term? I always thought they were “AC adapters.” I suppose they’re both as logical as the other.

A point no one has mentioned but that occurs to me (paranoid as I am) is that with both 12VDC and 120VAC coming out of the wall, there would be real chance that someone might accidentally cause a momentary interconnection between the 120VAC side and the 12VDC and fry every DC-powered device in the house all at once.

>> There’s no doubt you’d need Big ConductorsTM to pull it off. But it would be more electrically efficient, which was my primary point.

<<sigh>> What part of NO don’t you understand? It would NOT be more electrically efficient. You would have a big power supply on standby 24/7 using power for nothing. Having small adapters matched to each load is far more efficient. Also, as I have said repeatedly, it is practically impossible to have all the voltages needed. Besides, due to voltage drop on the line, you’d still need regulators. Even if copper cost nothing, due to losses in the transformer and in the line, it would still not be “more electrically efficient”. (Or post some figures showing otherwise)
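To put some (entirely made-up) numbers on the standby-power point, here is a rough sketch; the idle-draw figures and adapter count are assumptions chosen only to show how the comparison works:

```python
# Annual standby energy: a large central DC supply idling 24/7 vs. a handful
# of small adapters left plugged in. All idle-power figures are assumptions.

HOURS_PER_YEAR = 8760

central_idle_watts = 40      # assumed idle draw of a whole-house supply
wall_wart_idle_watts = 1.0   # assumed idle draw per small adapter
wall_wart_count = 6          # assumed number of adapters left plugged in

central_kwh = central_idle_watts * HOURS_PER_YEAR / 1000
warts_kwh = wall_wart_idle_watts * wall_wart_count * HOURS_PER_YEAR / 1000

print(f"Central supply idling all year: {central_kwh:.0f} kWh")
print(f"{wall_wart_count} wall warts idling all year: {warts_kwh:.0f} kWh")
```

Obviously the comparison flips one way or the other depending on what idle figures you assume.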

In any case, economically it makes no sense whatsoever, and it would be a PITA to maintain. One call to the maintenance technician would cost more than a dozen wall adapters. Not to mention the cost of the wiring.