Can someone explain power adapters to me?

The power adapter on my laptop failed recently and I’ve been going through the process of getting a replacement which involves a lot of money ($35 - $70) and a lot of time (they don’t carry them at retail outlets).

Over the years I’ve collected a box full of these things so I thought that surely I could dig through my collection and find one that matched (Output 19V, 3.42 A), but no luck. In fact in the entire box, it was hard to find any two that were alike.

I know very little about electronics, but I am surprised that there are so many flavors of power adapters. I would have expected the industry to have settled on a handful of standard specs by now.

So what’s up with these things?

Why are there so many different kinds?

Is it OK to fudge a bit on the specs?

How do the darn things work anyway?

They convert AC (Alternating Current) into DC (Direct Current). In the simplest design, a transformer first steps the 120 V AC down to a lower AC voltage, then a rectifier (4 diodes) converts the alternating current to direct current, and finally a resistor and capacitor smooth the DC into a flat line. There may be other components to prevent overloading and suppress spikes, but that’s the gist of it.
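If it helps to see the transformer-rectifier-filter chain in action, here’s a toy simulation in Python. All the component values are made-up illustrative numbers, not a model of any real adapter:

```python
import math

# Toy model of the chain described above: AC in, bridge rectifier,
# then a capacitor smoothing the bumps into rough DC across a load.

def simulate_supply(v_secondary_peak=19.0, r_load=50.0, c_farads=2200e-6,
                    mains_hz=60.0, dt=1e-5, duration=0.1):
    """Return a list of output-voltage samples across the load."""
    v_cap = 0.0
    samples = []
    t = 0.0
    while t < duration:
        # Bridge rectifier: output follows the absolute value of the AC
        # wave (ignoring the ~1.4 V of diode drops for simplicity).
        v_rect = abs(v_secondary_peak * math.sin(2 * math.pi * mains_hz * t))
        if v_rect > v_cap:
            v_cap = v_rect  # capacitor charges up to the incoming peak
        else:
            # between peaks, the capacitor discharges through the load
            v_cap -= (v_cap / (r_load * c_farads)) * dt
        samples.append(v_cap)
        t += dt
    return samples

out = simulate_supply()
steady = out[len(out) // 2:]  # skip the start-up transient
ripple = max(steady) - min(steady)
print(f"output ~ {sum(steady) / len(steady):.1f} V, ripple ~ {ripple:.2f} V")
```

A bigger capacitor or a lighter load shrinks the ripple; a cheap undersized capacitor leaves a lot of it, which is exactly the “wall wart” behavior described later in the thread.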

Power supplies are designed based upon the power requirements of the end unit, not the other way around. You wouldn’t begin with a 19V, 3.42 Amp power supply and then figure out how to design a computer that would require that much power.

Sorry, and yes, you might be able to “fudge” in another supply. For example, the current rating is the maximum that a source can supply; the end unit “decides” what to draw. So if a supply is rated for 5 amps and your computer only draws 3.42 amps, this shouldn’t be an issue. Voltage is pretty specific, though. You might get away with being off by a volt or two, because there is typically a voltage regulator in the computer, but I wouldn’t want to do this for an extended period since it could cause electrical overstress in components.

Sure. So maybe the real question then is how do they figure out what the specs should be. Also if I add a device, say a DVD drive, won’t that change the power requirements?

Yes, it will. So it’s necessary to allow for that when designing the power supply. It’s no use having a power supply that only just barely provides enough current to drive the basic laptop.

You didn’t specifically ask, but iGo by Radio Shack makes a universal power adapter supply. I have the wall and car charger that works for my cell phone, and can be used for PDAs, MP3 players, etc. The initial investment was $30, but new tips are only $10, and the first one is free, so if you go through three phones a year or so (like I do, not by choice), it’s worth it. The ones that are powerful enough for notebooks are a bit pricier, but you might want to look into it, as, depending on your needs, it could save you money in the long run.

Power supplies for a lot of electronic devices do tend to stick to certain common voltages. Basically, a battery is 1.5 volts, so if the thing takes 2 batteries then it will have a 3 volt AC adapter, and if it takes 4 batteries it will have a 6 volt adapter. These AC adapters are designed to be as cheap as possible. They have very little filtering and often no voltage regulation. Generally speaking, as long as you match up the voltage, the connector type, the polarity of the connector, and the AC adapter can supply at least as much current as the one it is replacing, there’s usually not a problem. You don’t want to replace a 200 mA adapter with a 5 amp adapter, though, because often the voltage regulation in these things is really really crappy and relies on a certain reasonable load to function properly. A severely underloaded AC adapter may end up with a really high voltage output, which could possibly damage your device. There’s usually no problem with replacing a 200 mA adapter with say a 500 mA adapter, though, as long as the voltage, polarity, and plug match up.
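Those rules of thumb can be written down as a quick checklist. This is just my own encoding of the advice above; the tolerance numbers are illustrative guesses, not any official spec, and when in doubt you should use the adapter the device shipped with:

```python
# Rough checklist for substituting one unregulated AC adapter for another.
# v_tolerance and max_oversize are my own illustrative choices.

def adapter_ok(orig_v, orig_ma, new_v, new_ma,
               same_plug=True, same_polarity=True,
               v_tolerance=0.05, max_oversize=5.0):
    """Return (ok, reason) for replacing an adapter rated orig_v volts /
    orig_ma milliamps with one rated new_v / new_ma."""
    if not same_plug:
        return False, "connector doesn't fit"
    if not same_polarity:
        return False, "wrong polarity -- may damage the device"
    if abs(new_v - orig_v) > orig_v * v_tolerance:
        return False, "voltage mismatch"
    if new_ma < orig_ma:
        return False, "can't supply enough current"
    if new_ma > orig_ma * max_oversize:
        # a severely underloaded unregulated adapter can overshoot its
        # rated voltage, per the 200 mA vs 5 A example above
        return False, "adapter is far oversized; output may run high"
    return True, "looks fine"

print(adapter_ok(6.0, 200, 6.0, 500))   # modest oversize
print(adapter_ok(6.0, 200, 6.0, 5000))  # 25x oversize, risky
```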

One thing that goes against standardization is the fact that Sony wants you to buy the Sony AC adapter (for $30) instead of the equivalent Wally World adapter (same output, $3). So, Sony intentionally puts an oddball connector on the end of their adapter. I’m picking on Sony and exaggerating a bit, but a lot of companies do this.

The quest for smaller and smaller devices also hurts standardization. An iPod, for example, is so thin that the typical barrel connector used in AC adapters is just too fat to be used in it.

For devices that really draw a lot of power, like laptops, the standardization really isn’t there. Laptops, being complicated digital devices, often have very specific voltage requirements, and sometimes can’t tolerate the poor regulation and lack of filtering common to generic AC adapters. Laptops often also use switching power supplies, which are a lot more efficient, but can be electrically pretty noisy. The laptop may be designed to filter off the switching frequency for the adapter it is designed to use. It may not have the filtering required to operate properly if a different adapter with a different switching frequency is used. A linear regulated adapter wouldn’t need to worry about switching frequencies, but linear power supplies are a lot larger, a lot less efficient, and run a lot hotter than switching power supplies.

The typical “wall wart” power supplies are usually pretty much as Leaffan described. They are just a transformer, often a single diode (not the 4 diode bridge, though you will find those also), and a capacitor for filtering. Because there is no regulation, if the voltage at the outlet is 10 percent low (the power company usually only guarantees +/-10 percent) then the output of the AC adapter will also be 10 percent low. Your 6 volt AC adapter could be anywhere from 5.4 volts to 6.6 volts depending on where you plug it in. You could also find that it is 7 volts without anything attached to it, and it’s only 5 volts (or less) at full load due to the thin wires and cheap components used in it. The capacitor is typically a bit undersized, so there is a lot of “ripple” in the adapter’s output.
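Two of the numbers in that paragraph are easy to reproduce. The ±10 percent swing is just scaling, and the ripple can be estimated with the standard full-wave formula dV = I / (2 × f × C); the load current and capacitor value below are illustrative, not from any particular adapter:

```python
# 1) An unregulated output just tracks the mains: +/-10 percent on a
#    nominal "6 V" wall wart.
nominal = 6.0
low, high = nominal * 0.9, nominal * 1.1
print(f"mains swing: {low:.1f} V to {high:.1f} V")  # 5.4 V to 6.6 V

# 2) Ripple from an undersized filter capacitor, using the usual
#    full-wave estimate dV = I / (2 * f_mains * C).
i_load = 0.3    # amps drawn by the device (illustrative)
f_mains = 60.0  # Hz
c = 470e-6      # a smallish, cheap capacitor, in farads
ripple = i_load / (2 * f_mains * c)
print(f"ripple ~ {ripple:.2f} V")  # several volts -- a lot on a 6 V supply
```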

Better (and more expensive) AC adapters will have voltage regulation circuits in them, so that the 6 volts out is always 6 volts even if the AC side varies a bit. The two types of regulators are “linear” and “switching”. Linear regulators are much less efficient, make a lot more heat, but they have a much more stable and less noisy output. Switching regulators often don’t work properly unless they have at least a 10 percent load on their output, and they are electrically as noisy as all hell. But, they are much more efficient and run a lot cooler. Almost all laptop power supplies these days are switchers just because of the efficiency issue.
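The heat/efficiency difference is simple arithmetic: a linear regulator burns off everything above the output voltage as heat. With made-up but plausible numbers:

```python
# Why linear regulators run hot: the voltage dropped across the
# regulator times the load current is pure waste heat.
v_in, v_out, i = 19.0, 5.0, 2.0   # illustrative values

p_load = v_out * i               # power delivered to the device: 10 W
p_wasted = (v_in - v_out) * i    # heat dissipated in the regulator: 28 W
efficiency = p_load / (v_in * i)

print(f"delivered {p_load} W, wasted {p_wasted} W as heat")
print(f"efficiency ~ {efficiency:.0%}")
```

A decent switcher doing the same job might run at 80-90 percent efficiency, which is why laptop bricks are all switchers.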

You can google “switching regulator” and “linear regulator” for more details.

I would advise against doing even this. You really should use only the power adapter designed for your computer.

And yes, it’s totally effing stupid that there isn’t a standard. Even among models of the same brand, adapter specs vary wildly, and for no damn good reason that I can discern. AC power supplies for towers and workstations are very standard; it hardly seems like it would be a great feat to do the same with laptops (as Apple has done with their line for years).

Stranger

So how do they come up with the final requirements? How do they know what devices a user might add on? And if it’s basically a worst case scenario, why so specific, e.g., in my case 3.42A, not 3.4 or 3.5 but 3.42?

Well, of course they know what devices you might add on to the ports they provide. They would factor worst case into the design. Why 3.42 Amps specifically? I’m not positive (a little power supply joke there) but it could be a regulatory (UL) requirement to specify the maximum Amperage that can be safely drawn, and not a general rounded-off version.

I should have been more specific. Many external devices like printers will have their own power supplies, and so these can be ignored when designing the laptop power supply. External devices without their own PSU generally are designed not to draw much current, and with the laptop’s power supply capabilities in mind. And of course there’s only so many external ports that you can connect stuff to, so there’s something of a limit on how much you can connect directly to the laptop.

As for the 3.42A, it’s not necessary to be that precise, it’s just that that happens to be the maximum current your power supply can generate. But it could just as easily have been 3.58 or 3.61 or any other value in that region.

Why do they humm?

I suspect in something like that, the guys in charge of the computer design looked at all of the different parts, drives, etc. that could end up in the computer (or attached to it via USB or whatever) and said “eh, it will draw about 3 amps max”. The power supply guys said “well, we’ll stick a little extra in there just in case” and ended up with a supply that is capable of about 3.5 amps. Then, just to make sure no one ever blew anything up, they measured the max output before something inside went poof and got 3.42 amps, so that’s what ends up getting stuck on the label.

It’s also possible that they put 3.42 amps on it to confuse anyone who might be tempted to replace it with a more generic 3.5 amp power supply.

Humming usually comes from transformers. Transformers are just coils of wire, and the coils generate a magnetic field which makes the wire want to move. So, the wires and the core of the transformer jiggle back and forth at 60 Hz (or 50 Hz if you are in Europe). If the wires and the transformer bits aren’t glued down very well, they can make a very audible hum. Sometimes the parts aren’t glued down very well because the guys that make them are cheap bastards and want to save a hundredth of a penny on every unit they make by using less glue. Sometimes whatever glue they use just doesn’t work as well as it ages, and the parts start humming.

Switching power supplies often have much smaller transformers working at much higher frequencies. Usually these frequencies are above what humans can hear, but sometimes they aren’t and you can hear a faint high pitched squealing noise from the power supply.

120 Hz (or 100 Hz in Europe); each complete cycle produces two vibrations.

I’m not quite clear on this. What exactly is “Switching” in a switching power supply? :confused:

I’ve been into more than a few of these, and have never seen a single diode version. I did make the mistake of designing something that way once, though, and I’m pretty sure I know why they are not made that way: a single diode forces the transformer secondary to carry DC, which causes the core to saturate, which eventually overheats and dies.

A full-wave bridge is cheaper than overbuilding the transformer to tolerate the DC, and you can use a smaller (cheaper) capacitor to filter the lower ripple from the fullwave rectifier. Center tapped FWR is also rare, as diodes are cheaper than the extra wire of a CT winding.

In general, the front end of a switching supply converts the incoming AC to a rough DC. An oscillator circuit called the chopper switches this rough DC on and off at a high frequency, typically 10–20 kHz, producing a rough square wave. Small inductors are then used to change the voltage, and feedback circuitry regulates it to the desired level.
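The feedback part can be sketched in a few lines. This is a deliberately crude illustration, not a real control loop: it uses the ideal buck-converter relation v_out = duty × v_in and nudges the duty cycle until the output matches a setpoint, even when the input sags:

```python
# Crude sketch of switcher feedback: adjust the chopper's duty cycle
# until the (idealized) output hits the target voltage.

def regulate(v_in, v_target, duty=0.5, gain=0.005, steps=200):
    for _ in range(steps):
        v_out = duty * v_in  # ideal buck-converter relation
        error = v_target - v_out
        # feedback correction, with duty clamped to the valid 0..1 range
        duty = min(1.0, max(0.0, duty + gain * error))
    return duty * v_in

print(round(regulate(v_in=170.0, v_target=19.0), 2))  # holds ~19 V
print(round(regulate(v_in=150.0, v_target=19.0), 2))  # input sagged, still ~19 V
```

The real circuit does the same thing with a comparator and a reference voltage instead of a loop of Python, thousands of times per second.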

The size and weight of a transformer and other power supply components vary inversely with the frequency of operation. Thus, for a supply operating at 50/60 Hz, there is a strict limit to how small and light you can make it.

To get around this limit, the 60 Hz power is rectified and filtered to produce DC. This is then fed to one or more transistor* switches operating at high frequency (on the order of 100 kHz or even more these days). Thus the supply benefits from the size and weight reductions of operating at much higher than 50/60 Hz.

*Other switching devices have been and can be used, but transistors are by FAR the switching devices of choice in consumer scale supplies.

I see on preview, QED answered the question, but the next question will be why bother, which I addressed, so I’ll go ahead and post the redundant info.

I’ve seen a few single-diode supplies. These normally have gapped cores, which is the only design consideration needed to prevent DC saturation. Gaps do result in a lowered coupling constant, but this can be dealt with by altering the turns ratio(s) as needed.

I have a small army of transformers in the recording studio. Four of these are the same model (I have 4 compressors of the same model, each with a transformer). They were all bought at the same time.

Two of these transformers I can hear, the other two I can’t. One has become quite loud (loud in a recording environment is a different kind of loud than in, say, a living room). Add this to about 5 other older transformers that I would like to be less hummy.

At the moment, I strategically place these transformers so I can’t hear them. It would be easier to just make them silent - or close to it.

So, one could perhaps fix a badly humming transformer by adding a layer of epoxy or epoxy putty over the coils?