For Mother’s Day, I got my wife a little CD player/radio thingie, so she can have some tunes while she’s working in the kitchen. Unfortunately, the radio didn’t come with a power adaptor, and she wanted one so it wouldn’t suck down batteries like mad.
So I did the all-American thing and hiked down to Radio Shack. “Howdy,” I said, reading from the instruction manual, “I’m looking for an AC power adaptor, 9 volts, 650 milliamps.”
“Okay,” the salesperson said, “Here’s a 9 volt, 800 milliamp adaptor that’ll do fine.”
“Waitasecond,” I replied, “I need 650 milliamps.”
“Oh, we don’t have that,” the salesguy said. “But 800 mA will be fine – if the adaptor is more than 200 mA over what your device recommends, then there’s a risk it’ll blow out the circuits. But this is just a 150 mA difference, so it’s okay.”
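If I'm following his rule correctly, it boils down to something like this (a toy sketch of what he told me, with made-up names, and no claim that it's sound electronics):

```python
def adaptor_ok_per_salesguy(device_ma: int, adaptor_ma: int) -> bool:
    """The salesguy's claim as I heard it: the adaptor is fine as long
    as it's no more than 200 mA over what the device recommends.
    (I'm also assuming it shouldn't be *under* the device's rating,
    though he didn't actually say that part.)"""
    surplus = adaptor_ma - device_ma
    return 0 <= surplus <= 200

# My case: the radio wants 650 mA, the adaptor is rated 800 mA,
# a 150 mA surplus -- under his 200 mA limit, so "okay" by his rule.
print(adaptor_ok_per_salesguy(650, 800))
```

Whether his 200 mA rule is actually true is exactly what I'm asking about below.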
We tried it in the store, and nothing exploded, so I took it home and hooked everything up. However, although everything seems okay right now, I’m wary of taking technical advice from a salesperson (especially one who’s trying to close the deal).
So I ask the electrical geniuses at the SDMB: am I running a risk of blowing out the radio and/or starting a fire if I use this adaptor with my radio? Or is a +150 mA difference really within the bounds of safety? Note that my electrical expertise only goes as far as very basic wiring; anything more complicated than installing a light fixture is beyond me.