2.5 amp AC adapter & 2 amp digital camera. OK to use or not?

I have just bought a satnav device with an AC adapter. This is 5 V and 2.5 amps. I have a digital camera and wonder whether I can use this AC adapter with it.

The camera’s own AC adapter is 5 V and 2.0 amps.

(Camera is Samsung Digimax V4 and the satnav is Navman iCN630.)

If both the satnav and camera use AC-to-DC adapters (some external adapters are AC/AC, with the DC rectifier and filter contained in the device rather than in the “wall wart”), and IF they have the same quality of filtering, then a higher current (amperage) rating on the adapter probably won’t affect the lower-current device. Current ratings are capacities; actual current draw is generally limited by the input impedance of the device’s power jack.
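To put a toy example behind that: here’s a quick Python sketch (modeling the device as a plain resistor, which a real camera is not, and using a made-up 1.6 A load figure) showing that the draw is set by the load, and the adapter’s rating just has to cover it:

```python
# Toy illustration, assuming a purely resistive load (real cameras
# aren't): the current drawn depends on the device, not on the
# adapter's rating, so long as the rating covers the draw.

def current_draw(supply_volts: float, load_ohms: float) -> float:
    """Ohm's law: I = V / R."""
    return supply_volts / load_ohms

camera_load_ohms = 5.0 / 1.6   # hypothetical camera drawing 1.6 A at 5 V

for adapter_rating_amps in (2.0, 2.5):
    draw = current_draw(5.0, camera_load_ohms)
    ok = draw <= adapter_rating_amps
    print(f"{adapter_rating_amps} A adapter: device draws {draw:.2f} A "
          f"({'within rating' if ok else 'adapter overloaded'})")
```

Both adapters print the same 1.60 A draw; the extra half amp of capacity simply goes unused.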

There are many possible obscure variables, but in 30 years of swapping external power supplies, I’ve rarely had a problem unless the adapter was defective. YMMV; it’s your gear at risk. (It doesn’t sound like you have an oscilloscope to check the voltage ripple.)

I’d definitely check the polarity even if I felt like taking a chance (as I typically do). You didn’t specify the style of connector (I assume they are both the same), but plugging, say, a center-positive plug into a center-negative jack can be bad. I’ve always felt manufacturers should protect their power inputs with diodes (it’d only cost a penny or two), but they generally don’t. I guess they don’t want to waste even a penny on protecting “idiots” (no offense to you: I’d consider myself an ‘idiot’ if I reversed polarity, and I’m sure product designers would class me as such if I did. It’s a matter of outlook: “I gave you an adapter, and you plugged something else in without checking?”)

How about the polarization of the plug? What is the current draw of the camera?

Assuming polarity and plug size match, it can probably be used without problem. However, if the camera uses the adapter to power and charge internal rechargeable batteries, as many do, the correct volt/amp matching may be much more critical, as rechargeable batteries are often not very tolerant of being over- or undercharged.

The adapter’s current rating is the maximum current that it can supply without overheating. That is, it can deliver any current up to 2.5 A. The current out of the adapter is a function of the camera’s demand, not of what the adapter is capable of delivering. Your current adapter is 5 V, 2 A. If your camera uses, say, 1.6 A on that adapter, it will also use 1.6 A on the new one, since both operate at a 5 V output.

Well, yes, that’s the way it should work. I am not an EE maven and can only speak from personal experience here, but (as an ex-Radio Shack manager in the early '80s) I have seen units that recharged batteries off their adapters destroyed by properly voltage- and polarity-matched adapters that delivered far more amperage than the unit’s usual power unit did. Why this happened I don’t know, but the batteries fried inside the unit. The unit worked fine when powered directly off the adapter without batteries. Maybe units are more resilient now.

Well, if anyone has an AC camera adapter, 5 V, any ampere rating over 2 A, that they are afraid to use because it “supplies too much current,” they can send it to me. I could use one right now.

My email is in the profile and I’ll send my address.

Yes, but as a former Radio Shack manager myself, I am convinced that occurrences like this are the result of user failure alone. The AC adaptors that Radio Shack sells require the user to install the correct-sized tip, oriented for proper polarity. Get the polarity wrong, and you can kiss your device goodbye. When I sold these things I made a point of installing the tip for the customer and educating them on the correct way to replace it, should the need arise. As David Simmons correctly points out, the adaptor will not push more current through a circuit than the circuit demands. If you have something that needs 6 VDC @ 1.5 A, you can use a 6 VDC adaptor that can supply 1000 A, and it will still use only 1.5 A.

One additional note.

The regulation in these “wall wart” type of adapters is usually pretty poor. You will want to match up the current rating fairly closely. In an ideal voltage source, the 5 volt output will be 5 volts whether it is putting out 0.5 amps or 2 amps. In reality, you may find that the voltage goes up to something like 7 or 8 volts if the current is fairly low. The OP is talking about a difference between 2 and 2.5 amps which is close enough not to worry about, but if I were going to use a device that drew only 100 mA then I’d be hesitant to use that particular adapter unless I measured the voltage first and made sure it was close to what it was supposed to be at no load.
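As a rough illustration (the numbers here are invented, not measurements of any real adapter): an unregulated wall wart behaves roughly like an ideal source behind an internal resistance, so the output only comes down to the rated 5 V near the rated current.

```python
# Rough model of an unregulated "wall wart" with made-up numbers:
# output voltage sags linearly with load current through an internal
# source resistance, so it only hits the rated 5 V at roughly the
# rated current.

V_NO_LOAD = 7.5        # hypothetical open-circuit voltage
RATED_V, RATED_A = 5.0, 2.0
R_INTERNAL = (V_NO_LOAD - RATED_V) / RATED_A   # 1.25 ohms in this example

def output_volts(load_amps: float) -> float:
    return V_NO_LOAD - load_amps * R_INTERNAL

for amps in (0.1, 1.0, 2.0):
    print(f"at {amps:.1f} A load: {output_volts(amps):.2f} V")
# at 0.1 A this "5 V" adapter is actually putting out ~7.4 V
```

That’s the effect I was describing: a 100 mA device on this adapter would see well over 7 volts.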

Also, since no one else mentioned it, make sure that you match the output type, i.e. AC vs DC. There are adapters that output AC (they are really just a transformer) and have the rectification, etc. inside the device.

Summary: match the physical plug, the polarity, and the output voltage (including AC vs. DC), and make sure that the adapter can supply at least as much current as the device draws.

The problem with blowing batteries with the wrong charger is that run-down batteries present a lot less resistance, and many of these systems have no internal current-limiting device, because the supplied charger will only deliver 2.25 A or whatever, so the batteries cannot be charged too quickly. If you give that system a source that can deliver 10 A, the batteries will allow too much current through, they get too hot from the incorrect charge rate, and you get a mess. If you are just powering a device that has a more or less fixed current draw, or a maximum that is not dependent on supply capability, then larger capacities are fine. Polarity and all that stuff notwithstanding…
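A back-of-the-envelope sketch of that failure mode, with invented numbers (a run-down pack modeled as a low resistance in series with its cell voltage), shows why the charger’s own current limit matters when the device has none:

```python
# Hedged sketch with made-up numbers: a run-down battery pack modeled
# as a cell voltage behind a low resistance. If the device relies on
# the *adapter's* current limit instead of its own, a beefier adapter
# lets far more charge current flow.

def charge_current(supply_volts, supply_max_amps, batt_volts, batt_ohms):
    # Current the battery would pull, clamped at what the supply can give
    demanded = (supply_volts - batt_volts) / batt_ohms
    return min(demanded, supply_max_amps)

run_down = dict(batt_volts=2.4, batt_ohms=0.5)  # hypothetical run-down pack

print(charge_current(5.0, 2.25, **run_down))  # 2.25 A -- the supply limits it
print(charge_current(5.0, 10.0, **run_down))  # 5.2 A -- the batteries cook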

engineer_comp_geek’s post is very important. Most wall warts are Class 2 adapters. Very poorly regulated. They are designed to provide the given voltage only for the device they are designed for.

I run some basic tests on a substitute adapter before giving it a live run, and check the voltage during the run.

More specifically, they are designed to provide the rated voltage at the stated current draw.

Yes, that’s part of what I meant by ‘filtering’. I couldn’t think of one term to cover DC voltage ripple, voltage sag under load, etc. Back when building your own quickie power supply was pretty common, we lumped a lot of different effects under the sloppy catchall “the filtering effect of the load,” but in retrospect, even if there were a single accurate technical term, I should have spelled out the effects I meant.
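For anyone curious about the ripple part of that: a common rule-of-thumb approximation for a simple transformer/full-wave-rectifier/capacitor supply is V_ripple ≈ I / (2·f·C). A quick sketch with made-up values:

```python
# One of the effects lumped under "filtering": ripple on the DC output
# of a simple transformer + rectifier + filter-capacitor supply.
# Rule-of-thumb approximation for full-wave rectification:
#   V_ripple ~= I / (2 * f_mains * C)

def ripple_volts(load_amps: float, mains_hz: float, filter_farads: float) -> float:
    return load_amps / (2.0 * mains_hz * filter_farads)

# e.g. a 1.6 A load, 50 Hz mains, 4700 uF filter cap (invented values)
print(f"{ripple_volts(1.6, 50.0, 4700e-6):.2f} V peak-to-peak ripple")
# ~3.4 V of ripple -- which is why checking with a scope (or fitting
# a bigger capacitor) matters
```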

BTW, I did mention plug polarity and AC/DC output, as raised by others.