The battery capacity for electronic devices is almost always listed in amp-hours or milliamp-hours. A common range for a phone battery is something like 2,500-4,000 mAh (2.5-4 Ah).
But an amp-hour figure is specific to the voltage the device operates at. If you need to compare capacity against something that operates at a different voltage, you have to convert to watt-hours.
But that seems like an unnecessary extra step. Since watt-hours also tell us the energy capacity of a battery, and allow apples-to-apples comparisons, why not just use watt-hours? What’s the advantage of using amp-hours?
Example of where this can be confusing: if you’re buying a USB power bank, the battery capacity might be 10,000 mAh. So you think, okay, my phone is about 3,300 mAh, so the power bank can charge my phone three times (ignoring charging losses for simplicity’s sake). If I understand correctly, this is not right, because while the power bank’s cells may hold 10,000 mAh at 3.7 V, that works out to only around 7,400 mAh at the 5 V USB charging voltage, so you’d get more like two charges. If you have a quick-charge protocol that runs at a higher voltage, even fewer. I may have this wrong, as I don’t understand electricity that well.
Or a more dramatic comparison: I have drone batteries that hold 3,500 mAh but run at 12 V. So from the 3,500 mAh figure you might think they hold about the same amount of energy as a cell phone battery, but that’s actually 42 watt-hours for the drone versus about 13 watt-hours for the phone. You might make the mistake of thinking a 10,000 mAh power bank could charge your drone battery almost three times, when it wouldn’t quite manage to charge it once.
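To make that arithmetic concrete, here’s a minimal Python sketch of the conversion. The 3.7 V nominal cell voltage and the capacities are just the example figures from above, and real charging losses are ignored:

```python
def mah_to_wh(mah, volts):
    """Convert a capacity in milliamp-hours at a given voltage to watt-hours."""
    return mah / 1000 * volts

# Example figures from the discussion above (nominal voltages assumed).
power_bank_wh = mah_to_wh(10_000, 3.7)  # ~37 Wh of internal cells
phone_wh = mah_to_wh(3_300, 3.7)        # ~12 Wh
drone_wh = mah_to_wh(3_500, 12.0)       # ~42 Wh

# Comparing in watt-hours makes the apples-to-apples question trivial.
print(f"Power bank: {power_bank_wh:.0f} Wh")
print(f"Phone charges per bank: {power_bank_wh / phone_wh:.1f}")
print(f"Drone charges per bank: {power_bank_wh / drone_wh:.1f}")
```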
But if it were all listed in watt-hours, you could make an apples-to-apples comparison. You’d know how many times a power bank could charge your devices, and how much energy the batteries really held, regardless of their operating voltage. So why not use that as the metric?
I suppose the point is: when you are discussing a battery, it has a fixed nominal voltage specific to that battery. The only variables are amps and time. (Within certain limits - you usually can’t dump the entire contents in a few seconds.)
Yes, but it’s not always clear what that specific voltage is. If I’m dealing with a portable phone charger, I might not even know, and probably don’t care, what the internal battery voltage is. I do care and might know that the USB charging output is 5 V, but that’s probably not the voltage used for the amp-hour figure.
I think batteries can generally be thought of as a fixed voltage source in series with a resistance. The resistance is low when the battery is fully charged, and gradually increases as the battery gets depleted. I think I read this in a discussion of single-use batteries (like disposable AA cells).
This changing resistance makes the voltage uncertain as the battery runs out. It might be more accurate to measure the charge (or current x time) the battery can deliver, as it might be less dependent on the current you choose to draw, or, equivalently, the impedance of the load you choose to put on it.
Any linear source can be modeled this way; it’s known as the Thévenin equivalent. (The dual form is the Norton equivalent: a current source in parallel with a resistance.)
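As a rough illustration of that model (the numbers here are made up, not from any datasheet): a fixed EMF behind a series internal resistance, where the terminal voltage sags under load and sags further as the internal resistance rises with depletion.

```python
def terminal_voltage(emf, r_internal, r_load):
    """Thevenin-style battery model: an EMF in series with an internal
    resistance, driving a resistive load. Returns the terminal voltage."""
    current = emf / (r_internal + r_load)
    return current * r_load

# Hypothetical AA-like cell driving a 10-ohm load.
print(terminal_voltage(1.5, 0.2, 10.0))  # fresh cell: ~1.47 V
print(terminal_voltage(1.5, 2.0, 10.0))  # depleted cell: ~1.25 V
```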
Amp-hours date back to well before we had such niceties as tiny integrated switch-mode power converters. Back then, real men toted lead-acid cells full of amp-hours. Most useful chemical cells hold a fairly constant voltage with only a sudden drop-off (effectively a sudden increase in internal resistance) as they discharge, so amp-hours, or state of charge, generally made sense. Note that the word “charge” is used as a measure of battery capacity; this is not a coincidence. When you charge a cell, the final metric of how full it is is not how much energy you pushed into the cell - it will inevitably have lost some of that as heat - but the charge you pushed in.
There is a big misunderstanding here that needs to be addressed first.
Batteries are not like generators: their output voltage (and hence the current into a given load) drops as they discharge, and they have limits on how much current can be drawn and over what time.
For normal/regular batteries, the mAh or amp-hour figure is the 20-hour rating. So a 20,000 mAh battery will deliver 1,000 mA over a 20-hour period when starting fully charged, ending up about 80% discharged by the end of those 20 hours.
Depending on the type of battery and the manufacturer’s quality control, a 1,000 mAh battery (20-hour rating) may have characteristics like these (a rough sketch of this rate dependence follows the list):
1,000 mAh if discharged over 20 hours
800 mAh if discharged over 8 hours
600 mAh if discharged over 6 hours
200 mAh if discharged over 1 hour
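Those numbers are illustrative, but the usual way to model this rate dependence is Peukert’s law. Here is a minimal Python sketch with a made-up Peukert exponent (real values depend on chemistry; lithium cells sit close to 1.0, flooded lead-acid roughly 1.1-1.3):

```python
def peukert_runtime_hours(rated_ah, rated_hours, current_a, k=1.3):
    """Peukert's law: estimated runtime when drawing current_a amps from a
    battery whose rated_ah capacity was specified at the rated_hours rate.
    k is the Peukert exponent (illustrative default; varies by chemistry)."""
    return rated_hours * (rated_ah / (current_a * rated_hours)) ** k

# A nominal 1.0 Ah cell (20-hour rating), drained at increasing currents.
for amps in (0.05, 0.125, 0.5, 1.0):
    hours = peukert_runtime_hours(1.0, 20, amps)
    print(f"{amps:.3f} A -> {hours:5.1f} h runtime, {amps * hours:.2f} Ah delivered")
```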
Some batteries, like forklift batteries, are rated at the 6-hour rate: the Ah figure means the Ah delivered over 6 hours as the battery goes from fully charged to 80% discharged.
Some batteries, like gasoline-car starter batteries, are not rated in Ah but in CCA (cold cranking amps): the current the battery can put out when starting a car (high amps for a few seconds).
This is an excellent response, and the only thing I have to add is that you will never get the ‘full’ theoretical amount of energy out of a battery, because the electrolyte and/or electrodes will start to break down first (in many cases creating an extremely volatile combustion hazard), so stating a theoretical energy capacity is not really useful. Because every electrical application requires some minimum delivered current, that becomes the threshold, and because sophisticated modern battery power systems are intended to deliver some baseline power draw at a regulated voltage, stating the capacity in amp-hours makes perfect sense. It is possible to regulate or limit current draw, but that is not generally done except for safety reasons (e.g. GFCI outlets to prevent electrocution). The bigger problem for battery integrity is a circuit with too little resistance (too much current draw), which effectively ‘shorts’ the battery and often causes detrimental overheating; that is why many systems have a resistor in line to assure a minimum load resistance.
Automotive ‘starter’ batteries are non-regulated and aren’t intended to deliver any specific amount of power over a duration. Instead, they are designed to deliver sufficient ‘cranking amps’ for a very short duration and then be immediately recharged to full capacity by the vehicle alternator. So unless you are using a car battery for some other purpose, such as running the lights with the engine off (which you should not do), you don’t really care about its capacity, just its ability to provide the peak amperage over the short duration of engine ignition.
For those of us who are not well versed in the technicalities of battery use and design, I think it’s more useful to treat the amp-hour rating as a somewhat arbitrary number that still gives a good metric for capacity.
It’s easy to understand that a 120 Ah battery is “bigger” than a 70 or a 50 Ah one. The same goes for milliamp-hours.
Except that the OP’s point is that it’s not a good metric for the capacity or size. If I take the same collection of cells of the same size in the same container, containing the same total energy, but hook them up in parallel in one case and in series in the other, the parallel hookup will have more amp-hours than the series one. This is especially relevant nowadays, when cheap, compact electronics can easily change the output voltage up or down: If I take one battery pack in series and one battery pack in parallel, and transform the output of both to 5V, even though the two battery packs will have very different amp-hour ratings, they’ll have similar watt-hour ratings, and I’ll be able to use my phone for about the same amount of time powered by either. The watt-hours are telling me useful information; the amp-hours are not.
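A quick sketch of that point, using made-up numbers for a hypothetical pack of four identical 3.7 V, 3 Ah cells:

```python
# Four hypothetical identical cells: 3.7 V nominal, 3.0 Ah each.
cell_v, cell_ah, n_cells = 3.7, 3.0, 4

# Series pack: voltages add, amp-hours stay the same.
series_v, series_ah = cell_v * n_cells, cell_ah
# Parallel pack: amp-hours add, voltage stays the same.
parallel_v, parallel_ah = cell_v, cell_ah * n_cells

for name, v, ah in (("series", series_v, series_ah),
                    ("parallel", parallel_v, parallel_ah)):
    print(f"{name:8s}: {ah:4.1f} Ah at {v:5.1f} V = {v * ah:.1f} Wh")
# Both packs hold ~44.4 Wh, even though their Ah ratings differ by a factor of 4.
```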
For battery packs used as rechargers (e.g. a phone power bank), it gets more complex, because the deliverable output of the box is affected by the efficiency not only of its own output electronics but also of the input electronics of the device being charged.
Multiply that by all the voodoo of the various battery-conditioning schemes, USB charge-rate signaling schemes, etc. This is very, very far from your grandfather’s two wires connecting a battery to a resistive load.
It would be nice if the standards-setting folks and the manufacturers could come up with a single comparable, results-oriented metric. But they haven’t, and arguably can’t.
A side risk of trying to establish a unit of practical device-charging capability is that it quickly becomes obsolete as the tech moves on, like the stupidity we have now where LED lightbulbs are rated in watts of “equivalent electrical consumption of a traditional incandescent bulb of a certain quality made using 1970s tech.”
How about we abandon “watts” for lightbulbs, except for fixture heat-absorbing capacity purposes, and go with Lumens? You know: the unit of light output?
Every LED lightbulb I have ever seen has listed the lumens on the package. However, most people do not know how many lumens a 60-watt incandescent produces, so a comparison there is useful to the consumer.
Yeah. I understand there’s a transition going on, so lumens are appearing on all packages, even incandescents, in addition to the 1970s-incandescent watt equivalents.
My point / peeve, which may not have been as on-point as I hoped, is that trying to get a single practical “how much stuff will this battery charge?” number may be a fool’s errand.
But amp-hours, like rating bulbs in watts, is an example of a metric that was engineering-accurate when it was decided upon, even if it was useless, or at least semi-misleading, for consumers. A 100 W bulb was never twice as bright as a 50 W bulb, and different bulbs of the same wattage, even of the same technology, put out different amounts of light depending on other factors.
And trying to invent a new battery-charging-capability number runs the risk, like watt ratings for CFLs and LEDs, of tying the rating scheme to an increasingly obsolete technology.
Voltage is a variable. Resistance is a variable. But an amp is always an amp: one coulomb per second. A battery’s voltage changes as it discharges, so amps are the most accurate way to quantify it.
I use a couple of commercial battery testers and a 72-channel battery tester that I designed and built.
The core unit in the testers’ equations is amps. At regular intervals I derive the current from the changing voltage across a known, fixed resistance, and from that I calculate watts as well. I end up with both a watt-hour and an amp-hour result.
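That bookkeeping is easy to sketch. This is a generic illustration, not the poster’s actual tester firmware: sample the voltage across a known load resistance at fixed intervals, derive the current, and accumulate both amp-hours and watt-hours.

```python
def accumulate_capacity(voltage_samples, load_ohms, interval_s):
    """Accumulate amp-hours and watt-hours from periodic voltage readings
    taken across a known, fixed load resistance (a generic sketch, not any
    particular tester's firmware)."""
    amp_hours = watt_hours = 0.0
    for v in voltage_samples:
        i = v / load_ohms                      # Ohm's law gives the current
        amp_hours += i * interval_s / 3600
        watt_hours += v * i * interval_s / 3600
    return amp_hours, watt_hours

# Hypothetical discharge: voltage sagging from 4.2 V to 3.0 V across 2 ohms,
# sampled once per minute for five hours.
samples = [4.2 - 1.2 * n / 300 for n in range(300)]
print(accumulate_capacity(samples, 2.0, 60))
```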
You can also consider a 12 V DC/DC converter hooked to a 12 V, 10 amp-hour battery. As the battery discharges, the converter will draw more amps to keep a steady 12 V output into a steady load. But ultimately it will have drawn roughly the battery’s 10 amp-hours, with some allowance for inefficiency.
This is exactly why watt-hours (or kilojoules) make more sense. I don’t care about the amps or the volts at all, as long as they stay within a reasonable range. I care about watts, and I let the converter pull as many amps as required given the power demand and the current battery voltage (including however much the voltage is depressed by the current draw).
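A tiny sketch of that constant-power behavior, with made-up numbers and efficiency treated as a single fixed factor:

```python
def converter_input_current(output_watts, battery_volts, efficiency=0.9):
    """Input current a DC/DC converter must draw to hold a constant output
    power as the battery voltage sags (efficiency is an assumed fixed factor)."""
    return output_watts / (battery_volts * efficiency)

# A steady 24 W load as a nominal 12 V battery sags toward empty.
for v in (12.6, 12.0, 11.4, 10.8):
    print(f"{v:4.1f} V battery -> {converter_input_current(24, v):.2f} A drawn")
```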
The answer to the OP is the same as anything else where a measurement seems kinda stupid: it’s historical. At one point in the past, it made sense, and you trained a generation or three of people to think that way, and they built all their equipment so that was the favored measurement. There’s not quite enough motivation to change it, and a lot of inertia against changing it, and therefore it doesn’t change.
EVs were able to make a clean break since the cell architecture is never really exposed. So they logically used kWh instead.
I have to agree that, for a consumer looking at power banks and similar things, watt-hours would be nice, if there were a regulatory standard for how the figure is calculated. Best case: measured at the output of the device into a set load.
For a bare battery, amp-hours is a good measure, since one may use the battery in a variety of ways; it’s a good raw starting point for calculations.
For now, I look at the amp-hour figure for such devices, because the batteries in them have fairly similar specs. Unfortunately, that doesn’t take into account how cheap and crappy the voltage-conversion system might be.
So I agree: watt-hour ratings would be best, with the caveat of proper regulation.