Power adaptor question

For Mother’s Day, I got my wife a little CD player/radio thingie, so she can have some tunes while she’s working in the kitchen. Unfortunately, the radio didn’t come with a power adaptor, and she wanted one so it wouldn’t suck down batteries like mad.

So I did the all-American thing and hiked down to Radio Shack. “Howdy,” I said, reading from the instruction manual, “I’m looking for an AC power adaptor, 9 volts, 650 milliamps.”

“Okay,” the salesperson said, “Here’s a 9 volt, 800 milliamp adaptor that’ll do fine.”

“Waitasecond,” I replied, “I need 650 milliamps.”

“Oh, we don’t have that,” the salesguy said. “But 800 mA will be fine – if the adaptor is more than 200 mA over what your device recommends, then there’s a risk it’ll blow out the circuits. But this is just a 150 mA difference, so it’s okay.”

We tried it in the store, and nothing exploded, so I took it home and hooked everything up. However, although everything seems okay right now, I’m wary of taking technical advice from a salesperson (especially one who’s trying to close the deal).

So I ask the electrical geniuses at the SDMB: am I running a risk of blowing out the radio and/or starting a fire if I use this adaptor with my radio? Or is a +150 mA difference really within the bounds of safety? Note that my electrical expertise only goes as far as very basic wiring, and anything more complicated than installing a light fixture is beyond me.

I can’t say for sure, but my impression is that the adaptor can’t force your wife’s CD player to accept more amps than it draws.

There are plenty of EE Dopers out there, so hopefully one will show up soon. In the meantime you can settle for what I remember from my sophomore circuit analysis course.

V = IR

where V is voltage, I is current, and R is resistance.

If the power supply provides 9 V, it won’t pump more current into your device than the device draws (agreeing with the previous poster). The amperage rating is probably the maximum current the adapter can provide. I’m not sure what would happen if the device tried to draw more; maybe the adapter would overload and burn up, or maybe it just wouldn’t provide enough current.

Imagine that you had two 9 V batteries. One is the little square kind with two terminals on top, and one is about the size of your house. Hooking your radio up to the big one won’t hurt the radio. Voltage is analogous (very approximately!) to how hard the current is pushing, and amperage is how fast it’s flowing. They are both pushing with the same force, so I don’t think the radio would be hurt by any 9 V source, regardless of its amperage rating.

As long as the voltage is right, more available amps can’t be a problem. Too low an amperage rating, on the other hand, is not a good thing. Besides the device not working, certain components (like motors) can be damaged if they try to run underpowered.

In fact, an adapter rated a little over your device’s draw means the voltage is more likely to remain stable.
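
For what it’s worth, here’s that rule of thumb as a minimal Python sketch (the helper function is just for illustration, not from any library): match the voltage exactly, and treat the adapter’s current rating as a floor the device’s maximum draw must stay under.

```python
def adapter_is_safe(device_volts, device_ma, adapter_volts, adapter_ma):
    """Rule of thumb from this thread: the voltage must match,
    and the adapter's current rating must meet or exceed the
    device's maximum draw. Extra headroom is harmless."""
    return adapter_volts == device_volts and adapter_ma >= device_ma

# The OP's case: a 9 V / 650 mA radio on a 9 V / 800 mA adapter.
print(adapter_is_safe(9, 650, 9, 800))  # True: headroom is fine
print(adapter_is_safe(9, 650, 9, 500))  # False: adapter could overload
```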

A power supply’s current rating shows how much current is available at the rated voltage, should the load so demand it. A 12 volt, 1000 mA power supply will only deliver 1.2 mA to a 10 kΩ load.
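
To spell out that arithmetic, it’s just Ohm’s law rearranged to I = V / R:

```python
# I = V / R: the load resistance, not the supply's rating,
# determines how much current actually flows.
volts = 12.0
load_ohms = 10_000.0                  # the 10 kilohm load above
current_ma = volts / load_ohms * 1000
print(current_ma)                     # 1.2 mA, nowhere near the 1000 mA rating
```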

The next question might be: why not just super-charge all power supplies so that they can furnish whatever amount of current will be required by any load? The reason is expense & size. The power transformer grows to monstrous proportions as the power supply is rated for more & more output current. My bench supply for 30 V @ 30 amps weighs a bloody ton.

Don’t sweat it - it will be just fine. As an alternative, you could buy two sets of rechargeable NiMH batteries and a charger. That way she can move the player around with her.

EE here.

He was wrong.

Voltage adapters come in 3 flavors:

DC unregulated
DC regulated
AC (unregulated)

Hopefully you purchased a DC regulated unit. (A DC unregulated unit might work O.K. But then again, it might not, and could possibly cause damage.)

A voltage adapter of the “DC regulated” type is a “constant voltage device.” This means it has one job: provide a constant DC voltage. And it will provide this constant DC voltage over a current range. The current is determined by the load resistance.

An example would be helpful.

Let’s say you purchase a 9V, 800 mA DC adapter (regulated). This means it will safely provide 9 VDC for any current from 0 mA to 800 mA:

If you do not have a load connected to it (infinite ohms), the voltage will be 9V and the current will be 0 mA.
If you connect a 900 ohm load, the voltage will be 9V and the current will be 10 mA.
If you connect a 180 ohm load, the voltage will be 9V and the current will be 50 mA.
If you connect a 90 ohm load, the voltage will be 9V and the current will be 100 mA.
If you connect a 23 ohm load, the voltage will be 9V and the current will be approx. 400 mA.
If you connect a 14 ohm load, the voltage will be 9V and the current will be approx. 640 mA.
If you connect an 11 ohm load, the voltage will be 9V and the current will be approx. 800 mA.

Understand? Your adapter will provide 9 VDC and supply whatever current the load needs (from an open circuit all the way down to 11 ohms). The radio, of course, is the load. Based on the specs, it would appear the radio presents about 14 ohms when operating in its most demanding mode (I would assume this means CD player on and volume all the way up). According to the above calculations, you’re within the safe range.
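
Those figures are just I = V / R again; a few lines of Python reproduce the list above (the “approx.” entries are rounded from 391, 643, and 818 mA):

```python
# A regulated adapter holds 9 V; each load resistance sets the current.
VOLTS = 9.0
for ohms in (900, 180, 90, 23, 14, 11):
    ma = VOLTS / ohms * 1000
    print(f"{ohms:>4} ohm load -> {ma:5.0f} mA")
```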

Thanks for all the replies – it sounds like I have nothing to worry about, and I got some misleading (albeit overly cautious) advice from the salesdrone.

Do unregulated power supplies simply have more variation in output voltage, or do they have AC components that can cause damage? If it’s just the output variation, I don’t see how it can be a problem; the voltage of alkaline batteries isn’t very stable either, and things made to run on alkaline batteries should be able to cope with the voltage variations.

Probably true, but…

On an unregulated supply, especially when the load is a relatively small portion of the rated output, the output voltage can rise above the nominal voltage. Battery output can vary quite a bit, but mostly down from the rated voltage, not above it. Most solid-state devices are more likely to be harmed by an over-voltage than by an under-voltage. They might not work with low voltage, but they’re usually not harmed.

Ugly

RJKUGly is correct, but I’ll add my 2-cents anyway.

You’re correct that batteries are “unregulated” voltage sources, but it has been my experience that your typical battery voltage is pretty flat over an impressive current range. This can be attributed to low source impedance, especially in C/D-sized batteries, and even more so in NiCAD batteries.

By contrast, an unregulated adapter is usually more “risky,” because the output voltage can vary quite a bit with current (more than batteries exhibit). A 9 V unregulated adapter, for example, will typically have an open-circuit voltage of around 13 VDC (!). As soon as you put a load on it, the voltage drops to around 11 V or so. It will be at 9 V when powering a “medium” load, and will drop to around 6 or 7 V (or even lower) when powering a heavy load.

So what does all of this mean? Well, if you were to use an unregulated adapter and turned the FM radio on at low volume, the adapter would be supplying 11 or 12 volts to the radio. This could put undue stress on some components. If you were to play a CD and turn the volume up, too little voltage might be getting to the unit (possibly 6 or 7 V).
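
If it helps, here’s a toy model of that sag: treat the unregulated adapter as an ideal source behind an internal resistance, so the output is a simple voltage divider. The 13 V open-circuit figure comes from the description above; the 10 ohm internal resistance is an assumed value picked to land in the same ballpark, not a measurement.

```python
# Toy model: ideal 13 V source in series with an internal resistance.
V_OPEN = 13.0       # open-circuit voltage, per the figures above
R_INTERNAL = 10.0   # assumed internal resistance (ohms), for illustration

def output_volts(load_ohms):
    # Voltage divider: the load and internal resistance split V_OPEN.
    return V_OPEN * load_ohms / (load_ohms + R_INTERNAL)

for load_ohms in (1e9, 100, 22, 14):  # open circuit, light, medium, heavy
    print(f"{load_ohms:>13.0f} ohm -> {output_volts(load_ohms):4.1f} V")
```

With these assumed numbers the model gives roughly 13 V open-circuit, 11.8 V on a light load, 8.9 V on a medium one, and 7.6 V on a heavy one, which tracks the behavior described above.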

To sum up:

Advantages of regulated adapters
flat voltage vs. current curve
safer to use on solid-state devices

Advantages of unregulated adapters
cheaper
adapter runs cooler
adapter is more reliable

It’s been my experience that the salespeople at Radio Shack never give correct answers to any question requiring actual knowledge of electronic theory. If they were competent electrical engineers, they wouldn’t be working at Radio Shack.