Question About AC Adapters?

So, I wanna try amping up a device I own, based on a rumor that it might respond better with a slightly more powerful AC adapter. The current adapter has an output of 9 V DC, 1.0 A. My assumption is that I need an adapter that is also 9 V DC but has a higher amperage, correct?

Second, what does mA stand for? Milli-amps? Am I right to assume that a 100 mA output is less than a 1.0 A output?

Yes, 100 mA is less than 1.0 A. A milliamp is 0.001 amp. You may not fully understand current flow (which is measured in amps). Ohm’s Law defines the relationship between current and voltage as V = IR. Since the voltage (V) and your device’s resistance (R) are fixed, the current (I) it draws is also fixed. Using an AC adapter with a greater current capacity is unlikely to “amp up” your device unless it was seriously under capacity to begin with.
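
To put rough numbers on that, here’s a minimal Python sketch; the resistance value is a made-up assumption for illustration, not a spec for your device:

```python
# Sketch of the point above: the device's effective resistance fixes how much
# current it draws at a given voltage. The adapter's amp rating only sets the
# maximum it *can* supply, not what the device actually pulls.
supply_voltage = 9.0        # volts, from the adapter
device_resistance = 11.25   # ohms, hypothetical effective load

current_draw = supply_voltage / device_resistance   # Ohm's law: I = V / R
print(f"Device draws {current_draw:.2f} A")          # 0.80 A

for adapter_rating in (1.0, 2.0, 20.0):   # amps the adapter is rated to supply
    status = "fine" if adapter_rating >= current_draw else "overloaded"
    print(f"{adapter_rating:5.1f} A adapter: {status}, device still draws {current_draw:.2f} A")
```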

Yes, mA stands for milliamps. One mA is 1/1000 of an ampere. You don’t say what sort of device you’re powering, but in general, using an AC adaptor of the same voltage will not make any difference in operation, provided the adaptor can provide the current the device will draw at the given voltage. If, for example, your device will draw 800 mA at 9 V, then whether you use a 2 A or 20 A or 200 A 9V power adaptor, the device will STILL draw 800 mA at 9 V. What might make a difference, depending on what sort of device you are powering and whether you are using an adaptor not intended specifically for it, is the regulation of the adaptor. Regulation is a measure of how accurate the output voltage will be at a given current draw. Really cheap adaptors often have poor regulation, and the output voltage will drop rapidly as the current draw increases towards the adaptor’s limit. So, if your device needs 800 mA at 9 V, it’s possible that at full load, the adaptor can only supply, say, 7 V. This would cause poor operation. Again, this is unlikely if you’re using the adaptor that came with the device.
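
If it helps, here’s a toy Python model of the “poor regulation” scenario; the internal-resistance figure is an assumption picked just to reproduce the 7-V-at-full-load example, not data from any real adaptor:

```python
# A cheap, poorly regulated adaptor behaves roughly like an ideal 9 V source in
# series with some internal resistance, so its output sags as the load rises.
nominal_voltage = 9.0       # volts at no load
internal_resistance = 2.5   # ohms, hypothetical (a well-regulated supply is near 0)

for load_current in (0.1, 0.4, 0.8):   # amps drawn by the device
    output_voltage = nominal_voltage - load_current * internal_resistance
    print(f"At {load_current:.1f} A load: {output_voltage:.2f} V reaches the device")
# At the full 0.8 A load this drops to 7.00 V -- the poor-operation case above.
```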

The only thing I understand is that getting electrocuted by a commercial grade high-current outlet for a solid minute will, in fact, land me in the hospital and hurt like hell. Especially when it sizzles through my eyes and nose.

Some things may indeed work ‘better’ at a higher voltage. For example, your car’s tail lights normally run at 12-14 volts. They will be significantly brighter at, say, 20 volts. Of course, there’s a horrible trade-off: the bulbs will burn out far sooner. Most things not only won’t run better at a higher voltage but will in fact be damaged. Anything much more sophisticated than a light bulb will either A) be regulated internally, in which case the extra voltage does nothing except make it run hot, or B) just get hot and burn.
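
Rough numbers, treating the bulb as a simple resistor (the filament resistance below is a guess for illustration, and a real filament’s resistance rises as it heats up):

```python
# For a resistive load, power scales with the square of the voltage: P = V^2 / R.
# That's why the bulb looks much brighter -- and why it burns out much sooner.
filament_resistance = 12.0   # ohms, assumed hot resistance of a tail-light bulb

for volts in (12.0, 14.0, 20.0):
    power = volts ** 2 / filament_resistance
    print(f"{volts:4.0f} V -> about {power:.0f} W dissipated in the filament")
# 20 V pushes nearly three times the power of 12 V through the same bulb.
```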

Depends on the application, really. In general, it’s a bad idea. Things are designed the way they are for a reason. If a manufacturer could improve performance by switching to a higher voltage supply (a cheap change), they’d do it.

If this device has the option of running off batteries, the batteries should be able to supply more than the 1 A the adapter can. So you could check whether it performs any better that way.

Yikes! Sorry to hear that. I bet there’s a great story behind it though. Any lasting effects?

Sometimes the reason has more to do with available power sources than with any physical limitation of the device. When the 3.6 volt Li batteries on my electric screwdriver finally gave up the ghost, I wired it up to a 5 volt, 3 A DC power supply. It’s been running nicely on that for four years. The driver has more power than when it was running on batteries, plus I never have to wait for it to charge. That said, small DC motors are a bit of a special case: they contain minimal logic, sensor, or amplifier circuitry that might care what the voltage really is.
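
For what it’s worth, a back-of-the-envelope look at that swap (the proportional-speed rule of thumb is the assumption here; the voltages are from the post above):

```python
# A small brushed DC motor's no-load speed scales roughly with applied voltage,
# so moving from the dead 3.6 V pack to a 5 V supply is a noticeable bump.
pack_voltage = 3.6     # volts, original Li cells
supply_voltage = 5.0   # volts, replacement bench supply

boost = supply_voltage / pack_voltage - 1
print(f"Roughly a {boost:.0%} increase in voltage (and so in no-load speed)")   # ~39%
```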