Cell Phone Re-charger Uses $$$ Electricity?

I only caught the end of the local TV news report, but the upshot is that a cell phone recharger uses the same amount of electricity when it is plugged into the wall, whether your cell phone is attached and being re-charged or not!

The reporter went on to say that a good way to cut your electric bill would be to unplug the adapter when you’re not charging your phone.

Now, for me, the plug is behind my dresser, so unplugging it every time I finish charging the phone is difficult at best. But is that true? Is the electrical use the same whether the phone is being charged or not? And if true, will this usage amount to much money over the course of a month?

No, it’s not true. What is true is that the typical wall transformer does draw some current, known as *magnetizing*, or excitation, current. However, a connected load, such as a charging battery, will always draw more power. A small wall transformer rated for an output of, say, 3-6 V at 300-500 mA, which is typical for cell phone rapid chargers, will have a no-load magnetizing current on the order of a few tens of mA, depending on the design of the transformer.
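To put a very rough number on the money question, here’s a back-of-the-envelope sketch in Python. The no-load power and the electricity price below are assumptions for illustration, not measurements of any particular charger:

```python
# Back-of-the-envelope cost of leaving a wall transformer plugged in.
# All figures below are assumptions for illustration, not measurements.

NO_LOAD_POWER_W = 0.75      # assumed real power dissipated with no phone attached
HOURS_PER_MONTH = 24 * 30   # plugged in around the clock
PRICE_PER_KWH = 0.12        # assumed electricity price per kWh

energy_kwh = NO_LOAD_POWER_W * HOURS_PER_MONTH / 1000
cost = energy_kwh * PRICE_PER_KWH

print(f"No-load energy per month: {energy_kwh:.2f} kWh")
print(f"Approximate monthly cost: {cost:.2f}")
# About 0.54 kWh, i.e. only a few cents (or pence) a month with these assumptions.
```

With numbers like these, unplugging the adapter saves something, but not much.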

Surely the transformer will be driving voltage-stabilising circuitry, which will still be drawing a notable current (even if not as much as when charging)?

As a point of reference, my charger charging non-stop for a day costs me about nine pence.

For the electricity to be used, there must be a completed circuit. I HIGHLY doubt there is a completed circuit without the phone in there. If there were, there would not only be a costly power drain, but the thing would be hotter than needed and more susceptible to power spikes. Then there’s the idea that it would route power away from the phone when it’s plugged in.

So, without actually knowing for a fact, I’m going to say, “No.” There’s no reason one would be made that way.

If you’re referring to a voltage regulator, then no. A voltage regulator, such as a 7805, consists of a voltage reference, which is usually a Zener diode in series with a high-value resistor, and a current amplifier. Some current is drawn at quiescence, but it is very small relative to the load current.
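For a sense of scale, here’s a minimal sketch comparing a 7805-style regulator’s quiescent draw against its draw while charging. The input voltage, quiescent current, and load current are assumed, datasheet-typical values, so check the datasheet for your particular part:

```python
# Quiescent vs. charging draw for a 7805-style linear regulator.
# The figures below are typical datasheet-style values, assumed here
# for illustration only.

INPUT_VOLTAGE_V = 9.0          # assumed unregulated input to the regulator
QUIESCENT_CURRENT_A = 0.005    # ~5 mA is a typical 7805 quiescent current
LOAD_CURRENT_A = 0.4           # assumed current into the phone while charging

quiescent_power_w = INPUT_VOLTAGE_V * QUIESCENT_CURRENT_A
charging_power_w = INPUT_VOLTAGE_V * (QUIESCENT_CURRENT_A + LOAD_CURRENT_A)

print(f"Idle (no phone) draw: {quiescent_power_w:.3f} W")
print(f"Draw while charging:  {charging_power_w:.3f} W")
# Roughly 0.045 W idle vs. ~3.6 W charging -- the quiescent draw is
# only a percent or so of the load draw.
```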

What about when the phone is fully charged, but still connected to the adapter? I would assume it is constantly drawing a current to keep itself fully charged. Would it be cheaper to unplug the phone when it’s done, let the battery run down some, then plug it back in as opposed to keeping it plugged in, say, overnight? How much of a difference are we talking? Nine pence like Gorillaman says?

If you feel the transformer and it’s warm, think of how much energy is wasted to keep it that way.

This depends on the charger circuitry. Simple, inexpensive chargers will continue to push a current through the battery, causing slow deterioration of the chemistry, ultimately leading to an inability to hold a charge. However, most chargers have circuitry to turn off or reduce the charging current when the battery is at full capacity. If your charger has an LED which turns from red to green when the charging cycle has finished, for instance, it has such circuitry built in. Unplugging the charger’s adaptor when the phone isn’t being charged may save some money, but it’ll be on the order of pennies a day, at best.
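As a rough illustration of that red-to-green behaviour, here’s a toy sketch of charge-termination logic. The cutoff voltage and currents are assumptions, not the values any particular charger uses:

```python
# Toy model of the charge-termination behaviour described above:
# cut the charging current and flip the LED once the battery reads full.
# The threshold and currents are assumptions, not any real charger's values.

FULL_CHARGE_VOLTAGE_V = 4.2    # assumed cutoff for a single Li-ion cell
FAST_CHARGE_CURRENT_A = 0.5    # assumed charging current
IDLE_CURRENT_A = 0.0           # a "smart" charger drops to (near) zero when full

def charger_step(battery_voltage_v: float) -> tuple[float, str]:
    """Return (charge current in A, LED colour) for a given battery voltage."""
    if battery_voltage_v >= FULL_CHARGE_VOLTAGE_V:
        return IDLE_CURRENT_A, "green"     # done: stop pushing current
    return FAST_CHARGE_CURRENT_A, "red"    # still charging

print(charger_step(3.7))   # (0.5, 'red')   -- battery part-way charged
print(charger_step(4.2))   # (0.0, 'green') -- battery full, current cut
```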

Ah.

I had forgotten about the idea of the box-plug being a transformer. I also didn’t know that they would bother to keep up a tiny current flow. Why bother? It obviously doesn’t need it to stay functional.

Anyway, apparently: Yes, a tiny bit.

I thought we were supposed to be fighting ignorance here?

All transformers/wall warts have a completed circuit 100% of the time. The primary circuit of the transformer provides a complete circuit between the two poles of the AC socket. The current consumed by this circuit varies depending on the load placed upon the secondary winding of the transformer, but it is never zero as long as it’s still plugged in. It takes energy to create a magnetic field 120 times a second.
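For anyone curious where a no-load current in the tens of mA can come from, here’s a minimal estimate treating the primary winding as an inductor across the mains. The primary inductance is an assumed figure for a small wall wart, not a measurement:

```python
import math

# Rough estimate of the no-load (magnetizing) current in a transformer
# primary that is simply plugged into the wall. The primary inductance
# below is an assumed figure for a small wall wart, not a measurement.

LINE_VOLTAGE_RMS_V = 120.0
LINE_FREQUENCY_HZ = 60.0       # the field builds and collapses 120 times a second
PRIMARY_INDUCTANCE_H = 10.0    # assumed

reactance_ohm = 2 * math.pi * LINE_FREQUENCY_HZ * PRIMARY_INDUCTANCE_H
magnetizing_current_a = LINE_VOLTAGE_RMS_V / reactance_ohm

print(f"Primary reactance:   {reactance_ohm:.0f} ohms")
print(f"Magnetizing current: {magnetizing_current_a * 1000:.0f} mA")
# ~32 mA with these numbers, and most of it is reactive; the real power
# actually dissipated is just the core and copper losses, typically under a watt.
```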

As far as “hotter than needed” and “more susceptible to power spikes” goes, all I can say is… :confused:

It’s not that they go out of their way to design in a no-load current draw. Quite the opposite, actually. Transformer designers often take great pains to minimize this current, which exists because transformer cores are not perfect. Excitation current, or I[sub]x[/sub], is a core loss (called an iron loss in the magnetics industry). It can be minimized by using high-mu core materials, by minimizing losses caused by eddy currents induced in the core (using thin laminations or an electrically insulating material, like soft ferrite), and by using as small a core as possible to provide the necessary power for the application. All things being equal, a larger core will have a higher I[sub]x[/sub] than a smaller one.
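To illustrate the lamination point, here’s a sketch using the classical thin-lamination eddy-current loss formula. The flux density and resistivity figures are assumed ballpark values; the thing to notice is the thickness-squared scaling:

```python
import math

# Classical thin-lamination eddy-current loss estimate, illustrating why
# thinner laminations cut core loss. Flux density and resistivity are
# assumed ballpark figures for a silicon-steel core.

PEAK_FLUX_DENSITY_T = 1.0      # assumed peak flux density in the core
FREQUENCY_HZ = 60.0
RESISTIVITY_OHM_M = 5e-7       # assumed resistivity of silicon steel

def eddy_loss_w_per_m3(lamination_thickness_m: float) -> float:
    """Eddy-current loss density (W/m^3), thin-sheet approximation."""
    return (math.pi ** 2 * PEAK_FLUX_DENSITY_T ** 2 * FREQUENCY_HZ ** 2
            * lamination_thickness_m ** 2) / (6 * RESISTIVITY_OHM_M)

for t_mm in (0.5, 0.35, 0.23):
    print(f"{t_mm} mm laminations: {eddy_loss_w_per_m3(t_mm / 1000):,.0f} W/m^3")
# Loss scales with thickness squared, so halving the lamination
# thickness cuts eddy-current loss to roughly a quarter.
```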