Just how inefficient are inductive (magnetic) chargers?

I have tried googling for some ballpark numbers but have not found any.

So say I have to fully charge a 3000 mAh battery from zero charge. In the first instance I use a wired charger, which consumes (3000 + X) mAh and loses X mAh as heat. In the second instance, I use an induction charger, which consumes (3000 + Y) mAh, losing Y mAh as heat. What are some ballpark numbers for X and Y?

The question is with regard to chargers for modern cell phones.

Some discussion in this thread: How do cordless charging stations work?

https://boards.straightdope.com/sdmb/showthread.php?t=888398

If any errant dipole gets realigned by the magnetic field created by the charger, that would also be a loss of energy from the system, aside from the heat loss you mentioned.

And then a fraction of a second later, that errant dipole would get realigned the other way, and so on. That’s how the energy is lost to heat, not an energy loss other than heat.

Rough ballpark numbers: you are going to lose about 25 to 30 percent of the energy to inefficiency in the inductive energy transfer.
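To translate that into the OP's X/Y framing, here's a quick sketch (assuming the 25 to 30 percent figure above, and counting only the inductive transfer loss):

```python
# Ballpark Y in the OP's framing: the charger draws (3000 + Y) mAh to
# deliver 3000 mAh, so efficiency = 3000 / (3000 + Y).
battery_mah = 3000

for loss in (0.25, 0.30):            # 25 to 30 percent lost in transfer
    efficiency = 1.0 - loss
    consumed_mah = battery_mah / efficiency
    y = consumed_mah - battery_mah
    print(f"{loss:.0%} loss -> draws {consumed_mah:.0f} mAh, Y = {y:.0f} mAh")
```

So Y lands somewhere around 1000 to 1300 mAh by this estimate.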

Thanks for the answers. Especially engineer_comp_geek - I trust your numbers and please post any cite you may have.

I just did some measurements using an inductive charger. In 24 minutes I charged my phone 307 mAh using 634 mAh of electricity.

I used AccuBattery on Android to measure the charge received on the phone, and I used a USB power meter to measure how much electricity was used. The screen was off the entire time, but the phone and radios were on, doing whatever they do in the background. So, some of the 634 mAh were lost to powering the phone, and not to heat. The power meter showed that the draw peaked at 1.69 A at about 4.99 volts. AccuBattery reports the change in state of the battery, so it is only as accurate as the phone’s battery management system allows.

I think it is safe to say that the inductive charging is only around 50% efficient.

For comparison, using the same USB power meter, I plugged the phone in with a cable for 8 minutes and charged the battery 125 mAh using 150 mAh of electricity. The draw on the meter was about 1.20 A at about 4.90 volts. So about 80% efficient. That also lets us know that a maximum of about 17% of the mAh registered on the power meter could have gone to running the phone.

Lots of “abouts” because the numbers do bounce around quite a bit. The current can move a considerable amount, from maybe 0.60 to 1.20 A. The voltage will sit at 5.00 or 5.01 V, but as the current goes up it will drop a small amount.
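For anyone following along, the two efficiency figures come straight from the mAh ratios (a sketch using the numbers reported above; it ignores the battery-vs-USB voltage issue raised further down the thread):

```python
# Efficiency as (charge into battery) / (charge drawn from USB),
# using the measurements reported above.
def mah_efficiency(into_battery_mah, from_usb_mah):
    return into_battery_mah / from_usb_mah

inductive = mah_efficiency(307, 634)   # 24 min on the inductive pad
wired = mah_efficiency(125, 150)       # 8 min on a cable

print(f"inductive: {inductive:.0%}")   # ~48%
print(f"wired:     {wired:.0%}")       # ~83%
```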

If I did my math right, the whole exercise cost me $0.00056 at $0.14/kWh, so even at 50% efficiency, it’s not like the waste from an inductive phone charger is going to add up to much over time.
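The cost arithmetic, as a sketch (this takes the 634 mAh at roughly 5 V seen on the meter as the energy drawn, so it leaves out whatever the wall adapter itself wasted upstream of the meter):

```python
# Energy drawn at the USB meter during the 24-minute inductive charge,
# priced at $0.14/kWh.
mah = 634
volts = 5.0
price_per_kwh = 0.14

wh = mah / 1000 * volts              # ~3.17 Wh at the meter
cost = wh / 1000 * price_per_kwh
print(f"{wh:.2f} Wh -> ${cost:.5f}")
```

That comes out a touch under the figure above, which may just be where the adapter losses hide, but either way it's a rounding error on a power bill.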

I have an old, slightly worn USB power bank. I closely follow its performance (charging and discharging watt-hours) with a little USB power meter.

It reproducibly takes 23 Wh at 5 V to charge from completely empty to full, and gives back 13 Wh at 5 V when discharged to completely empty again.

I estimate its capacity at 16 Wh (it’s rated at 19 Wh, but it’s old and worn, plus that rating is extremely inaccurate in any case).

So charging takes 23 Wh, of which 7 Wh is lost to heat (70% efficient).
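The three numbers in this post actually give three different efficiencies depending on which leg you count (a sketch of the arithmetic):

```python
# Power-bank figures reported above: Wh in at 5 V, estimated cell
# capacity, and Wh back out at 5 V.
wh_in, wh_stored, wh_out = 23, 16, 13

charge_eff = wh_stored / wh_in       # the "7 Wh lost to heat" leg
discharge_eff = wh_out / wh_stored   # the boost-converter leg
round_trip = wh_out / wh_in          # USB in to USB out

for name, eff in [("charging", charge_eff),
                  ("discharging", discharge_eff),
                  ("round trip", round_trip)]:
    print(f"{name}: {eff:.0%}")
```

So roughly 70% charging, 81% discharging, and only about 57% for the full round trip.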

I suspect the “charger” (charging circuit in the power bank) is just a linear regulator, while the discharging circuit is of course a boost converter.

If the charger were a buck converter it could perhaps be a little more efficient. I think the efficiency of these very small converters is rather poor in any case, in percentage terms.

I wonder if an inductive charging system could regulate the voltage and current at the same time as it controls the energy transmission process across the coils, and thereby eliminate an extra buck/boost converter on the output side. That should make up for some of the loss compared to a straight wire.

Two problems. First, mAh isn’t a unit of energy, but of charge. You need to multiply it by a voltage to get an energy, and the two voltages were not necessarily the same. The relevant voltage for your USB power meter would presumably be the 5 volts that’s standard for USB, but the relevant voltage for your phone would be the battery’s voltage, which may or may not be 5 volts.

Second, the energy used to power the phone is energy lost to heat. It did something else useful along the way, but it still ended up as heat.

Thank you for doing the experiment. Chronos’s comment on the mAh is correct, but I was looking for ballpark numbers, and approximating the voltage as constant is okay with me.

Agree that individually it’s not a big cost, but it is unnecessarily adding to CO2 emissions.

But if you add it up over all 7 billion people on the planet… it’s still not a big cost. There are hundreds, probably thousands, of other things you could do that would save more energy.

And a ballpark estimate of 50% is pretty useless, since efficiencies of 100% or 25% (which would be very different from each other) are also in that ballpark.

I agree with you. Still, it would be nice to know official figures on how much more energy is wasted by induction charging. Current models reduce the charge rate to 50% (that is, it takes twice as long to charge compared to wired), presumably (?) to hide the heat loss that would occur if it charged at the wired rate.

Moreover, devices keep adding up, and they all have bigger batteries: phones, tablets, watches, headsets… Maybe we don’t have to worry about the inefficiency today, but it’s definitely something to watch out for.

You’re right, I’m too used to thinking about kWh. The battery in the phone is 3.85 volts, so that’s 1.2 Wh of battery charge from about 3.2 Wh of USB power. I guess I could have plugged the USB charger into a Kill A Watt to get the wall power used.
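Redoing the inductive-charge efficiency with the voltage correction (a sketch recomputed from the mAh readings earlier in the thread, assuming 3.85 V on the battery side and the meter’s ~5 V on the USB side):

```python
# Voltage-corrected efficiency for the 24-minute inductive run.
battery_wh = 307 / 1000 * 3.85   # charge into the 3.85 V battery
usb_wh = 634 / 1000 * 5.0        # charge drawn at ~5 V on the meter

print(f"battery: {battery_wh:.2f} Wh")           # ~1.18 Wh
print(f"USB:     {usb_wh:.2f} Wh")               # ~3.17 Wh
print(f"efficiency: {battery_wh / usb_wh:.0%}")  # ~37%
```

Some of that USB energy went to running the phone rather than into the battery, so the transfer itself is somewhat better than this ratio suggests.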

Yes, entropy, heat death, etc., but if you’re going to take that viewpoint, then any charging method is almost 0% efficient, because all that energy just ends up as heat eventually.