Laptop computers and power sources

Can anyone tell me why different laptop computers have A/C Adapters that provide different amounts of amps?

And would it be more likely to damage the computer or the adapter to mix and match willy-nilly?

Would it make a difference if the intent was to use the adapter to charge the battery and then use the battery to run the computer as opposed to plugging in the adapter and running the computer?

In case you can’t guess–I’ve got a laptop with a decent battery that has no charge at present, and a dead adapter. I will probably be replacing the laptop, but I’d love to charge it up one last time and get some stuff valuable only to me off it first.

Adapters are cheap on Amazon, most are $8-$10 plus shipping.

Rather than trying to jury-rig the charger adapter, remove the hard drive and buy a USB adapter to read it from your new computer. The adapters are very inexpensive, and you will also have an extra hard drive to use.

Assuming that the voltage and polarity are the same, you can use a PSU with a higher amp rating on a laptop that requires lower amps.

The amp rating is the maximum current that the PSU can deliver for prolonged periods of time. But the current that is actually drawn depends entirely on the device. Say I have a laptop that draws 3 amps and connect it to a PSU that can deliver 5 amps max. The laptop will still draw 3 amps.
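The point above can be sketched with a bit of arithmetic: the load, not the supply's rating, sets the current drawn. The numbers here are hypothetical, just to illustrate I = P / V.

```python
def current_drawn(load_watts: float, supply_volts: float) -> float:
    """Current (amps) a device pulls at a given supply voltage: I = P / V."""
    return load_watts / supply_volts

# A laptop drawing 57 W from a 19 V supply pulls 3 A, regardless of
# whether the PSU behind it is rated for 3 A or 5 A.
amps = current_drawn(57.0, 19.0)
print(round(amps, 1))  # 3.0
```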

In general, you can use any adaptor that:
Is the right voltage (as long as it’s regulated)
Has the right connector, with the right pin polarity
Has the same current rating, or higher (the device should only draw the current it needs)

In practice, you may be able to get away with an adaptor that meets the first two requirements and is a little under on current, as they are probably rated on the maximum possible draw (i.e. charging the battery at the same time as writing a DVD, with the screen on full brightness and something else imposing serious demands on the GPU, etc) and most power supplies are probably a bit overspecified anyway.
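A hedged sketch of the checklist above, as a helper that flags whether a replacement adapter is safe to try. The field names and tolerance value are my own invention, not from any real spec.

```python
from dataclasses import dataclass

@dataclass
class Adapter:
    volts: float
    amps: float
    center_positive: bool  # pin polarity

def is_compatible(required: Adapter, replacement: Adapter,
                  volt_tolerance: float = 0.5) -> bool:
    """True if the replacement matches voltage and polarity and can
    supply at least as much current as the original is rated for."""
    return (abs(replacement.volts - required.volts) <= volt_tolerance
            and replacement.center_positive == required.center_positive
            and replacement.amps >= required.amps)

# A 19 V / 4.74 A replacement for a 19 V / 3.42 A requirement is fine;
# a 12 V unit is not, however high its amp rating.
print(is_compatible(Adapter(19.0, 3.42, True), Adapter(19.0, 4.74, True)))  # True
print(is_compatible(Adapter(19.0, 3.42, True), Adapter(12.0, 8.0, True)))   # False
```

Note the check deliberately rejects an adapter that is "a little under" on current, even though, as noted above, you may get away with one in practice.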

Yes, if the voltage and plug are the right size and polarity, then having the current (A) a little low may be all right. Though I would just let it charge the battery and run the laptop only on battery, with the power supply disconnected.

I used an adaptor that was rated at 4.18 amps to replace one that was rated at 4.5 amps. After a year or two, the adaptor stopped working. It obviously didn’t hurt the computer.

The reason they have different amp ratings is that bigger laptops need more power, and it would waste money to give everyone the most powerful adapter.

I could charge my MacBook Pro with a 45W adapter (it uses 85W). The shop said it’d just charge slower. I think it depends on the adapter, good ones will limit current to their max rating.

For the record, I have in fact ordered a new adapter, and will be buying a new computer as soon as I figure out what sort of computer I really want.

But I do appreciate the information people have provided.

That may not work, since adapters are based on SMPS technology, and most SMPSs limit current by shutting down and restarting every second or so when overloaded, delivering little net output power (some will just shut down until you cycle the input power, which results in no charging at all). It does depend, though, on whether you are using the laptop at the same time: if charging the battery uses the same amount of power as running the laptop itself (42.5 W each for 85 W total), then a 45 W adapter charging alone would be fine, if close to its maximum rating. A good design should be able to continuously supply its full rated output (at least for desktop PSUs; many cheap power supplies are overrated, quoting peak power as if it were continuous, or as if all outputs could be fully loaded at the same time).
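A rough power-budget sketch of the scenario above, with hypothetical numbers: before trusting an undersized adapter, check whether running the laptop plus charging the battery fits inside the adapter's rating, since an overloaded SMPS may cycle off and on rather than gracefully limit current.

```python
def headroom(adapter_watts: float, run_watts: float, charge_watts: float) -> float:
    """Watts of margin left on the adapter; negative means it is overloaded."""
    return adapter_watts - (run_watts + charge_watts)

# 45 W adapter, laptop idling at 20 W, battery drawing 25 W: exactly at the limit.
print(headroom(45.0, 20.0, 25.0))   # 0.0
# Same adapter while the laptop peaks at 60 W: 40 W over budget,
# which is where an SMPS may start hiccuping or shut down entirely.
print(headroom(45.0, 60.0, 25.0))   # -40.0
```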

It's not just a waste of money; it is also much more convenient to carry the smallest adapter possible. I don't want to lug around a big power adapter for my light and slim ultrabook.

Ah, OK. In that case I think my laptop was just drawing under 45 W at the time, and only hits 85 W at peak.