My cell phone’s official (read: name-brand) charger has an output of 5 V and 1000 mA. My old cell phone had a charger with the same charging port (micro USB), and it has a 5.7 V, 700 mA output. So I want to use my old phone’s charger as a backup to keep at work.
Aside from the fact that it will charge slower since it is only 700 mA, will a 5.7 V charger on a phone that officially uses a 5 V charger cause problems?
Using a 700 mA charger instead of a 1000 mA charger will likely not just charge slower. The phone doesn’t know the difference between the two chargers’ capacities and will try to draw 1000 mA (or perhaps a bit less, if there is some overhead and it only needs 700 mA or so); charging, including the charge current, is controlled inside the phone itself, not by the charger, which is just an AC/DC power supply. Worst case, the underpowered charger will fail, or, since chargers are almost always SMPS designs now, shut down from overload, resulting in no charging. Many overloaded SMPSs go into an auto-restart loop, which isn’t great for the phone’s internal charger since it will constantly start up and then shut down, and cheap designs may have no protection at all.
Other than that, I doubt that 5.7 V instead of 5 V would make a difference; remember that the charging process is internally controlled. The sole function of the charger is to provide an appropriate voltage at sufficient current; you could use any old AC/DC (or, in a car, 12 V-to-5 V) adapter, provided it can supply enough current. The only potential problem is if the phone used a linear-regulator-based charging circuit, since power dissipation in those depends on the difference between input and output voltage, but I doubt 0.7 volts would be enough to cause overheating (many charging circuits use an SMPS-based design for the same reasons the “charger” adapter does).
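To put rough numbers on the linear-regulator case, here is a minimal sketch; the 4.2 V battery charge voltage and 0.7 A charge current are assumptions for illustration, not measurements of any real phone:

```python
# Back-of-envelope check of the linear-regulator worry above.
# All values are illustrative assumptions, not measurements.
V_IN_OFFICIAL = 5.0   # V, official charger
V_IN_OLD = 5.7        # V, old charger
V_BATT = 4.2          # V, typical Li-ion charge voltage (assumed)
I_CHARGE = 0.7        # A, limited by the old charger's rating

def linear_dissipation(v_in, v_out, i):
    """Power burned in a linear charging element: P = (Vin - Vout) * I."""
    return (v_in - v_out) * i

for v_in in (V_IN_OFFICIAL, V_IN_OLD):
    p = linear_dissipation(v_in, V_BATT, I_CHARGE)
    print(f"{v_in} V input: ~{p:.2f} W dissipated in a linear charging circuit")

# ~0.56 W vs ~1.05 W: the extra 0.7 V roughly doubles the heat in a
# linear design, but the absolute numbers stay small either way.
```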
For a while, at my old company, we actually designed in a mechanism for detecting which charger was attached…so the phone could, in fact, know the difference.
In terms of using 5.7V instead of 5V - I’m actually surprised you have something that uses a microUSB but doesn’t provide 5V. Whether it’s safe or not is very likely to be phone implementation dependent. You might be within tolerance of the charge circuit. Or you might not.
ETA: If the phone handles it properly, it may take advantage of the extra current it could provide after regulating down to the battery charge voltage. A 5.7V source provides more power than a 5V source at the same current, so the lower maximum current may not matter.
I didn’t think of this (the last phone charger I took apart was just a simple AC/DC power supply, nothing to enable detection of anything), but I’d think the chargers would have to be standardized for this to work (meaning that you could use a charger from a different phone manufacturer).
5.7 volts at 700 mA is about 4 watts, which is still less than 5 volts at 1 amp (5 watts), and that might be enough to make a difference, depending on the actual power draw of the phone; if it only uses 4 watts, it would work, assuming a switching-type charging circuit. I’m not sure how much margin they put into the power supplies*, but if you know how much power is needed, there isn’t any reason to provide extra margin, unlike, say, a computer PSU.
*Of course, it can go the other way around; I recently found an external drive enclosure that had a 25 watt power supply, and as a test I drew nearly 50 watts from it before it shut down. It got very hot, certainly too hot to run inside the enclosure, and either way that kind of overload reduces reliability.
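As a sanity check on that arithmetic, a quick sketch (pure nameplate math from the figures in the thread, nothing phone-specific assumed):

```python
# Quick power-budget comparison of the two chargers (spec values from the thread).
chargers = {
    "official (5 V, 1 A)": (5.0, 1.0),
    "old (5.7 V, 0.7 A)":  (5.7, 0.7),
}
for name, (volts, amps) in chargers.items():
    print(f"{name}: {volts * amps:.2f} W")
# official (5 V, 1 A): 5.00 W
# old (5.7 V, 0.7 A):  3.99 W
# So the old charger can deliver about 1 W less; whether that matters
# depends on how much the phone actually tries to draw while charging.
```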
So if a charger is rated for more milliamps than the device requires, it will only take the current it needs (say, a 1200 mA charger and a device that requires 900 mA), but if it is rated for less, the device will try to pull more current and damage the charger?
I’m annoyed that a device with a standard connector like micro USB would use a non-standard voltage. I think your phone will be fine, but putting 5.7 V on the label is just confusing.
Some micro USB-powered devices, like the Nexus tablet, will NOT charge, period, if they do not detect a high enough current capacity. I discovered this the hard way by leaving my notebook at home and trying to charge the Nexus tablet on my micro USB car cellphone charger. It did not even recognize it.
Ok…if it’s just letting you draw AC-charger-level current over USB, then all you’re doing is violating the USB spec. Which isn’t a big deal in many cases.
Laptops, especially when running on battery, probably won’t like it, and you could potentially damage the USB port on a poorly designed laptop.
Desktop computers tend to connect the 5V USB supply to a high capacity output on the power supply, so you can massively overpull and not have a problem. If you’re in that situation, you should be completely safe with no damage to either PC or phone.
While this is true, they usually have current limiting in the form of a PTC fuse, which will limit the current if you try to draw too much (usually with a good margin though, 1-2 amps); all modern motherboards and USB adapters that I have seen include them, although some older stuff may not have any protection (I’ve even seen a few that use regular fuses, which blow permanently if overloaded and kill the port unless you know how to replace them).
But what about charging from a USB port on a PC? My understanding is USB ports are only rated for either 200 mA or 500 mA. My phone requires 1000 mA. Isn’t the phone just going to recognize it can’t get 1000 mA and just charge slower, without any of the hardware involved being damaged? That is what it seems to do when I charge my phone from the computer.
The simple version is that your USB port provides 100 mA until the phone asks if it can take 500 mA. Most PCs will grant the 500 mA request. Some phones will refuse to charge if they are stuck with only 100 mA.
Your phone knows it is using USB and can only draw 500 mA, and should charge accordingly.
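Which is why it charges slower from the PC. A rough illustration of the difference, assuming a hypothetical 1500 mAh battery and an 85% charging efficiency (neither figure comes from the thread):

```python
# Rough illustration of why USB-port charging is slower: time ~ capacity / current.
# The 1500 mAh capacity and 85% efficiency are illustrative assumptions.
CAPACITY_MAH = 1500
EFFICIENCY = 0.85   # not all input charge ends up stored in the battery

for source, current_ma in [("wall charger", 1000),
                           ("PC USB port", 500),
                           ("unnegotiated USB port", 100)]:
    hours = CAPACITY_MAH / (current_ma * EFFICIENCY)
    print(f"{source}: roughly {hours:.1f} h for a full charge")
```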
Yes, all the way up to “and damage the charger”, which might happen but will not necessarily happen. Some chargers and power supplies are built so they cannot be damaged no matter how much current the load tries to draw. A “short circuit” is an example of a load that tries to draw an infinite amount of current, and many power supplies are designed to handle this without damage. Chargers and power supplies that do not have this capability still do not self destruct the moment their rating is exceeded. There would be some margin of safety (and imprecision). 1200 is 33% more than 900, so I don’t know if there is a 33% margin.
The load drawing too much current is one failure mechanism. The power supply pushing with too much voltage is another. You have both going on.
Working from the specs (and noting that actual values may be better), your official charger’s rating works out to 5/1, or 5 ohms (more accurately, it claims to be able to handle a 5 ohm load). Your old charger works out to 5.7/0.7, or about 8.1 ohms. Generally it’s preferred to put a higher-impedance load on a lower-impedance power supply. Putting a lower-impedance load on a higher-impedance power supply spends most of the energy in the power supply itself, as heat.
So I can think of three reasons it might damage something. It sounds close, and I don’t know; maybe the probability of it being just fine is 80% short term and 50% as a regular practice.
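For the record, those figures are just Ohm’s law applied to the nameplate ratings (a quick sketch):

```python
# The "impedance" figures above are just V / I from the nameplate ratings.
ratings = {
    "official charger (5 V, 1.0 A)": (5.0, 1.0),
    "old charger (5.7 V, 0.7 A)":    (5.7, 0.7),
}
for name, (volts, amps) in ratings.items():
    print(f"{name}: minimum rated load = {volts / amps:.2f} ohms")
# official: 5.00 ohms, old: 8.14 ohms -- i.e. the old charger is only rated
# to drive loads that look like about 8 ohms or more at its output voltage.
```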
Basically, the USB spec talks about “units” of 100 mA.
By default, a 2.0 host must provide one unit (100 mA). After the slave asks and the host gives permission, it can provide up to 5 units.
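That negotiation can be sketched like this (a toy model for illustration, not real USB stack code; the function name and structure are made up):

```python
# Toy model of the USB 2.0 "unit load" negotiation described above.
UNIT_MA = 100        # one USB 2.0 unit load
DEFAULT_UNITS = 1    # granted before the device is configured (100 mA)
MAX_UNITS = 5        # maximum a USB 2.0 device may request (500 mA)

def granted_current_ma(requested_units, host_approves=True):
    """Current (mA) a device may draw after asking the host for `requested_units`."""
    if not host_approves:
        return DEFAULT_UNITS * UNIT_MA
    return min(requested_units, MAX_UNITS) * UNIT_MA

print(granted_current_ma(5))                       # 500 mA if the host says yes
print(granted_current_ma(5, host_approves=False))  # stuck at 100 mA otherwise
```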
Yes, but there are new charging standards for USB (see the Wikipedia article on USB, power section). Those devices with Y USB cables, especially hard disks but also optical drives and monitors: if they work with only one port plugged in, that port must be supplying more than 500 mA. The same goes if your iPad charges, as opposed to reporting “Not Charging”. Lastly, since Europe now uses micro USB to charge all phones, there must be some way of supplying more than 500 mA over USB.
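For reference, the current limits from that Wikipedia section as best I recall them (worth double-checking against the article itself):

```python
# Rough summary of per-port-type current limits, as I remember the USB
# Battery Charging spec -- treat these figures as approximate.
port_max_ma = {
    "USB 2.0 standard port (after negotiation)":          500,
    "USB 3.0 standard port (after negotiation)":          900,
    "Charging downstream port (BC 1.2, data + charging)": 1500,
    "Dedicated charging port (BC 1.2, D+/D- shorted)":    1500,
}
for port, ma in port_max_ma.items():
    print(f"{port}: up to {ma} mA")
```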