DC Adapters

Why are they so frickin’ big? And why are things still made that run on DC anyway, when everyone and his mother is wired for AC?

Integrated circuits need DC. Since everything nowadays uses circuits, that’s a sufficient reason to not use AC. A nice side effect of this is that DC is safe. You’re not going to jolt your heart into stopping if you play around inside your walkman.

I’m sure an EE will be along shortly to correct me, but IIRC most (not all) electronic gear actually runs on DC inside the box, and often the AC-to-DC conversion is handled inside the box too. Many small items, though, can be made more efficiently and inexpensively if the power supply is external. As to size, that can vary depending on whether the adapter uses an IC-controlled switching supply or a non-switching iron-core transformer.

Many newer “switching” supplies, seen in more expensive items like laptops, are 1/3 to 1/4 the size of older iron-core transformers and deliver the same power output.

“Everything” runs on AC now, only because the current is rectified to DC as soon as it gets past the “No User Serviceable Parts Inside”; exceptions would include space heaters and things like that.

EE checking in here…

“Everyone and his mother” is wired for AC because quite simply it makes power distribution a heck of a lot cheaper. You can crank up the voltage with a simple thing called a transformer (which is basically just a couple of coils of wire) and cut the wire losses down to a minimum, then drop the voltage back down to a reasonable level for your home with just another transformer. The same thing with DC is a heck of a lot more complicated.

Once it gets to your house, there are very few things that actually run off of AC. Light bulbs just convert electricity into heat and light. They don’t care much if it’s AC or DC. Same thing for electric heaters. Most things in your house run off of DC though, so they need to convert the AC into DC. Again, because you have AC coming out of the wall, all you need is a transformer and you can take it down to whatever voltage your device needs. Then you need to rectify and filter it. The rectifier is a small silicon diode (small and cheap), and the filter is a capacitor. If you open up a “wall wart” (slang for one of those little AC to DC adapters that plugs into the wall) you’ll see that a transformer, diode, and capacitor is probably all that’s inside it.

They are the size they are because a transformer is just two coils of wire wound around the same core. The transformer changes the voltage from 120 volts AC to 120 × n2 / n1, where n1 is the number of turns in the coil on the wall side (the primary) and n2 is the number of turns in the coil on the output side (the secondary). The more power your device needs, the thicker the wire in the coils needs to be, which is pretty much what dictates the size of the transformer. Transformers can be made smaller if they work at higher frequencies, but we’ve standardized on 60 Hz power distribution, so that’s not an option.
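
If it helps to see the arithmetic, here’s a minimal sketch of that ideal-transformer relationship (the turns counts below are invented purely for illustration):

```python
# Ideal-transformer voltage relationship described above.
# The turns counts here are made-up illustration values.

V_PRIMARY = 120.0  # US wall voltage (RMS)

def secondary_voltage(v_primary, n_primary, n_secondary):
    """Output voltage scales with the turns ratio of the two coils."""
    return v_primary * n_secondary / n_primary

# e.g. a hypothetical 1200-turn primary and 90-turn secondary:
print(secondary_voltage(V_PRIMARY, n_primary=1200, n_secondary=90))  # 9.0 volts
```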

A capacitor is just two plates of metal that are very close to each other but don’t touch. The problem is that if you built one that way for real, it would be freakin huge; instead of being small enough to fit in a wall wart, it’d be the size of a lunchbox. A small, cheap capacitor is made by cheating a bit: you put some special goop (a dielectric) between the metal plates, and instead of two big flat plates that look like two sheets of paper side by side (very tall and wide, but not very thick), the whole thing is rolled up like a newspaper. Now you’ve got a capacitor that fits into the wall wart but still has enough filtering ability to smooth out the rectified AC so that it is fairly close to DC (you still get a slight ripple at 60 Hz in the power, but this is filtered off further by smaller capacitors inside the electronics as needed).

So, the long and short of it is that the stuff has already been shrunk quite a bit. You aren’t going to get much smaller using simple, cheap components.

There are very few electronic circuits that will work with raw AC. Remember that AC is basically a sine wave. It goes from zero, swings positive, swings back to zero, goes negative, then back to zero, 60 times a second. While the sine wave is near the zero part of its cycle, there is NO POWER available to do anything in the circuit. Anything that needs power at a frequency higher than 60 Hz (sound is 20 Hz to 20,000 Hz, video is 7,000,000 Hz, computers are up to about 1,500,000,000 Hz) can’t function because the power is essentially “switched off” (since the AC is at the zero part of its cycle) while it is trying to do its thing. There are very few practical things that run slower than 60 Hz, which is why almost nothing runs off of raw AC power.
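
To put a rough number on that “switched off” time, here’s a quick sketch (the 10% threshold is an arbitrary illustration, not any real cutoff):

```python
# Estimate what fraction of each 60 Hz cycle the raw AC spends near zero,
# where there is essentially no voltage available to do work.
import math

F_LINE = 60.0    # Hz
V_PEAK = 170.0   # peak of a 120 V RMS sine

samples = 100_000
near_zero = 0
for i in range(samples):
    t = i / samples / F_LINE                      # step through one full cycle
    v = V_PEAK * math.sin(2 * math.pi * F_LINE * t)
    if abs(v) < 0.10 * V_PEAK:                    # arbitrary "almost nothing" threshold
        near_zero += 1

print(f"{near_zero / samples:.1%} of each cycle is below 10% of peak voltage")
# ~6.4%, spread over two dips per cycle -- 120 dips every second.
```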

Looks like it’s been pretty well covered; I just wanted to add a few things:

While this is (and was) a topic for debate, there is nothing inherently ‘safer’ about DC (in fact, AC is safer in many cases, but I’ll try not to make this even longer). Your heart doesn’t stop when you’re playing with your Walkman because there’s simply not enough energy available to do it. (And it’s actually much better for you if your heart stops due to a large shock and restarts than if it goes into fibrillation due to a lesser shock – ‘better’ being quite relative here.) There is, however, a safety angle that relates to adapters.

In addition to the points astro made, one other reason to use an adapter is to avoid bringing line voltage (120 V, high-current in the US) AC into the box. The generally recognized standards body for safety in the US is Underwriters Laboratories (UL), and there is a much tougher standard if you bring line voltage into your device (making it more expensive to develop). To keep your customers safe, you can put a UL-listed adapter outside the box and bring only low-voltage AC or DC into the device.

IANAEE, but I think electrolytic filter capacitors are DC only? In any case, what is usually being filtered is DC – after rectification by the diode, what is left is ‘pulsed’ DC (half wave?), and the filter caps store up charge and ‘smooth’ the DC for use by the device.

Another major reason for having external AC/DC converters, particularly in audio equipment, is to get rid of extraneous noise.

Transformers usually give off a good deal of electromagnetic radiation, which can be induced into and amplified by an audio circuit. So the solution is to have the transformer separated from the sensitive circuitry, either by shielding or by physical distance.

The simplest and cheapest way is to use an external transformer at the other end of a longish power lead.

That’s correct – electrolytic capacitors are usually polarized, and they must have a bias voltage to work properly. (The diode or bridge rectifier converts the AC to DC before the capacitor…)

That’s right. The filter caps in a power supply act as “energy storage devices.” A good analogy would be a water tower in a city’s water distribution system (water tower = capacitor). The load (or regulator circuit + load) takes energy from the filter caps, and the transformer and rectifier circuit “replenishes” or “tops off” the capacitor’s energy 120 times a second (with a full-wave rectifier; a half-wave one tops it off 60 times a second).

I think one of the major reasons that electronic items use outboard converters is that it is cheaper to build the item the same way wherever in the world it will be used. Then they can just throw in a cheap converter for the voltage and plug configuration used in the destination country.

I guess I didn’t explain this very well. A “wall wart” has three components, a transformer, a rectifier, and a filter capacitor. The transformer changes the voltage from 120 VAC (varies in other countries) to whatever the device needs (might be 6 volts, 12 volts, whatever). The rectifier can be half wave or full wave, but is usually half wave because that can be done with a single diode. A full wave rectifier is 4 diodes in kind of a diamond pattern. Diodes only conduct in one direction, so the output of the rectifier is the AC with the negative swing of the cycle cut off (in a full wave rectifier the negative cycle is re-routed to become a positive cycle). I guess you could describe this as “pulsed DC” although that tends to make me think of something that is more of a square wave, not half of a sine wave.
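
If it’s easier to see with numbers, here’s a toy model of the two rectifiers (no diode drops, no filter cap yet; the 9 V transformer is just an example value):

```python
# Toy model of half-wave vs. full-wave rectification of a 60 Hz sine.
import math

F_LINE = 60.0
V_PEAK = 9.0 * math.sqrt(2)   # secondary of a hypothetical 9 V RMS transformer

def half_wave(v):
    """Single diode: only the positive half of the cycle gets through."""
    return max(v, 0.0)

def full_wave(v):
    """Four-diode bridge: the negative half is flipped positive."""
    return abs(v)

for i in range(8):                     # sample one cycle in 8 steps
    t = i / 8 / F_LINE
    v = V_PEAK * math.sin(2 * math.pi * F_LINE * t)
    print(f"t={t*1000:5.2f} ms  raw={v:6.2f}  half={half_wave(v):5.2f}  full={full_wave(v):5.2f}")
```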

Electrolytic capacitors are polarized, because they take advantage of the electrolyte and it effectively becomes one of the two metal “plates” that makes up the capacitor. They take one sheet of metal foil, slap some goo on one side, and roll it up like a fruit roll up. If you apply a negative voltage to this configuration they tend to explode and spew all of their electrolyte all over the place (which is generally considered to be not a good thing).

The capacitor kind of acts like a battery. It charges up when there is voltage present, and discharges when the voltage goes away, so it tends to smooth out the half sine wave. But, it doesn’t do a perfect job, so you are still left with a little bit of ripple. The bigger the capacitor the smaller the ripple, so there is a design trade off. Typical wall warts have small capacitors, which means they have a fair amount of ripple (they are said to be electrically “noisy”). Electronic devices will filter this noise off as needed, but since this is a much smaller level of noise, much smaller capacitors and such can be used.
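
For the curious, the usual rule of thumb for that ripple is load current divided by ripple frequency times capacitance. A rough sketch, with invented load and capacitor values:

```python
# Rule-of-thumb ripple estimate for a capacitor-filtered rectifier:
#   ripple volts ~= I_load / (f_ripple * C)
# The load currents and capacitances below are made-up example values.

def ripple_volts(i_load_amps, cap_farads, half_wave=True):
    f_ripple = 60.0 if half_wave else 120.0   # Hz, on a 60 Hz line
    return i_load_amps / (f_ripple * cap_farads)

# A 50 mA load on a 470 uF cap behind a half-wave rectifier:
print(f"{ripple_volts(0.050, 470e-6):.2f} V of ripple")            # ~1.8 V
# Full-wave rectification, or a bigger cap, shrinks the ripple:
print(f"{ripple_volts(0.050, 470e-6, half_wave=False):.2f} V")     # ~0.9 V
print(f"{ripple_volts(0.050, 2200e-6):.2f} V")                     # ~0.4 V
```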

Perhaps it is a matter of semantics, but I would never say that electrolytics are used for DC. They are used for a clamped voltage, sure, or for a biased voltage, but I guess I would never call a positive AC wave a DC source… maybe fluctuating DC… anyway, I suppose it isn’t all that important. Just that most people say DC thinking “stable,” not “anything that never crosses zero volts as it fluctuates.” KnowwhatImean?

Also, are the wall outlet transformers really that simple? Admittedly I’ve never been interested enough to open them up, but that is a piss-poor AC/DC converter that only slaps a diode bridge rectifier and a cap together. They should also have a zener diode and probably a series resistance, too. Still, admittedly, pretty simple stuff. I guess it depends on how clean the signal needs to be.
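
For what it’s worth, the zener-plus-series-resistor version is simple enough to size on the back of an envelope; here’s a sketch where every number is an invented example:

```python
# Back-of-the-envelope sizing for a simple zener shunt regulator
# (series resistor from the unregulated supply, zener across the load).
# All values below are example assumptions, not from any real adapter.

V_IN_MIN   = 11.0    # worst-case (lowest) unregulated input, volts
V_IN_MAX   = 16.0    # worst-case (highest) unregulated input, volts
V_ZENER    = 5.1     # zener voltage, volts
I_LOAD_MAX = 0.050   # maximum load current, amps
I_Z_MIN    = 0.005   # minimum zener current to stay in regulation, amps

# The series resistor must still supply full load current plus the zener's
# keep-alive current when the input sags to its minimum.
r_series = (V_IN_MIN - V_ZENER) / (I_LOAD_MAX + I_Z_MIN)
print(f"series resistor ~= {r_series:.0f} ohms")        # ~107 ohms

# Worst-case zener dissipation: load disconnected, input at its maximum.
i_worst = (V_IN_MAX - V_ZENER) / r_series
print(f"zener dissipates up to ~{V_ZENER * i_worst:.2f} W")   # ~0.5 W
```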

Now I want to go home and open up all those damn things.

Non-polarized electrolytics are used with AC in some circuits, but in general, electrolytic caps are DC only. They smooth out half wave or full wave DC from diodes (tube or solid state), bridge rectifiers, etc.

Yep, the “wall warts” are pretty simple – as an aside, I’ve heard of and seen a lot of equipment trashed by using unregulated AC adapters. Say your portable radio requires 12 volts DC @ 50 mA; poking around the house, you find a 12 volt @ 500 mA wall wart. It will work – but since the radio doesn’t have anywhere near the load the unit is rated for, the actual voltage output might be around 15 volts or more. Use the same voltage (or slightly lower) with an appropriate current rating.

Wouldn’t it be more efficient for a home to have a central, regulated DC power supply instead of all those individual wall warts?

I guess it’s too late now, but if I could re-write the electrical code from scratch, I might think about how to have a parallel wiring system in homes. High-power AC wiring for lighting and appliances, and low-power, regulated DC that could be used for all the electronics.

That is really not a good idea. First, the DC losses are much greater. Second, each gadget requires its own different voltage, so having an AC-to-DC converter for each one works better.

If houses had a DC power system at a specific voltage, I bet that all the electronics would be built to use it. Just like electronics are set up to use 120 V AC in the States.

However, we would still need AC power for some things. A 300 W halogen lamp in your living room needs 25 amps at 12 V DC, and 25 amps is a lot of current to run through a wire. Most circuits in your house are 15 amps; a few might be 20 amps. People might want more than one light off of a given circuit, so you would need a completely separate wiring system in the house. It would take a lot of wall warts to pay for that second set of wires, even if the wires are installed when the house is being built.
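
The arithmetic behind that halogen example, as a quick sketch (the wire resistance is a made-up round number for a typical house run):

```python
# Current and wiring loss for a 300 W load fed at 12 V vs. 120 V.
# R_WIRE is an invented round figure for the round-trip resistance of the run.

P_LAMP = 300.0   # watts
R_WIRE = 0.1     # ohms (illustrative)

for volts in (12.0, 120.0):
    amps = P_LAMP / volts
    loss = amps ** 2 * R_WIRE        # I^2 * R heating in the wiring itself
    print(f"{volts:5.0f} V: {amps:4.1f} A, {loss:5.1f} W lost in the wire")

# 12 V:  25.0 A and ~62 W wasted heating the wire
# 120 V:  2.5 A and ~0.6 W wasted in the same wire
```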

If each home had its own centralized DC rectifier, I’m not convinced that the losses from that one unit would be more than the combined losses from all the wall warts. I understand that long distance distribution of DC is inefficient, but within a house it wouldn’t be a serious problem, would it?

And as for each gadget requiring its own voltage, well, yes, that’s a definite problem with the current system. But if every new home in the country had outlets for DC similar to what you get with a typical PC power supply (+5 V, -5 V, +12 V, -12 V, +3.3 V), I bet electronics manufacturers would pretty quickly standardize so they wouldn’t have to include a converter with every product they sold.

Of course, this is all castles in the sky. The current home wiring standards and electrical code won’t be changing any time soon–but it’s interesting to speculate on how they could be improved.

So you would have homewide distribution of 6 to 10 different voltages at pretty high amps? Do you realise the cost of the wiring? And you would have 6 to 10 outlets in each room for different voltages? The whole thing makes no sense. You’d be wasting 90% of the installed capacity. Cost-wise it would be a very foolish solution. The way it is now, you have AC all over and transform it where you need it into exactly what you need in terms of filtering, voltage, amps, limits, etc. Your way, you would multiply the cost of installation and maintenance of home wiring by ten and still not have the same flexibility. It makes no sense.

It is very inefficient to transmit power at low voltages. Power is voltage multiplied by current, and resistive losses in the wire are proportional to the square of the current (I²R). For example, if you double the voltage, the same amount of power can be delivered with half the current, which cuts the wire losses to a quarter.
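
To put numbers on that (the delivered power and line resistance are made-up round figures):

```python
# Why distribution runs at high voltage: for a fixed power over a fixed wire,
# losses scale with the square of the current, i.e. as 1/V^2.
# POWER and R_LINE are invented round numbers for illustration.

POWER  = 10_000.0   # watts delivered
R_LINE = 1.0        # ohms of line resistance

for volts in (240.0, 480.0, 2_400.0, 24_000.0):
    amps = POWER / volts
    loss = amps ** 2 * R_LINE
    print(f"{volts:8.0f} V: {amps:6.1f} A, {loss:8.1f} W lost in the line")

# Doubling the voltage (240 -> 480) halves the current and quarters the loss.
```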

Westinghouse and Edison competed very aggressively. Edison wanted DC and Westinghouse wanted AC. They went so far as to electrocute small animals to demonstrate how deadly the other’s system was (something that the grade school history books tend to omit). The main reason Westinghouse won was that stepping voltages up and down on AC takes just a simple transformer, two coils of wire wrapped around a common core. A DC-to-DC voltage converter is a lot more difficult.

Considering that electronics need varying voltages (sensitive digital electronics need a relatively low voltage, but the magnetron in your microwave and the tube in your TV need very high voltages), it is much easier and more efficient to use transformers as needed.

Edison and Westinghouse already fought this battle. Edison lost. Get over it. :smiley:

Well, this is an area where we’re just going to have to agree to disagree.

In my apartment, I basically have two rooms with electronics gizmos and all of them are already connected to a centralized power supply.

My PCs, LAN components, cable modem, VCR, DVD player and several other gizmos all run their AC power cords to power strips connected to a 1400 VA UPS I strategically placed between the two rooms. I look at that and wonder why I’m converting the AC to DC to charge the batteries in the UPS, then through an inverter in the UPS to produce AC again, and then back to DC in all of my electronic gadgets’ various power supplies and wall warts…

Maybe the current system is the most efficient, flexible way to do things, but I’m still not entirely convinced.