Can any charger charge any device these days?

I see this was asked here before, but that was over 10 years ago and things have changed, so I’m asking again.

It has become increasingly common for cell phones to not ship with a charger so you have to get your own.

But not all chargers are the same: they can output different wattages. Generally, a higher wattage means faster charging. But then, not all phones are able to accept those higher wattages.

So, if I plug a phone that accepts 15 watts into a 30-watt charger, will the phone or charger blow up/melt? Or will they talk to each other and settle on a wattage both are happy with? Or will the 15-watt phone just not accept the 30-watt charger? Or something else? Does wireless charging change anything versus plugging in with a cord?

Of course, this does not have to apply to phones only but, for this question, that is the main concern.

There is some configuration going on between your phone and those USB chargers. They talk to each other and negotiate power delivery. So a standard-conforming 200W charger is not going to crank up the voltage and fry your phone if the latter only pulls 15 W; on the contrary, a more powerful charger may be better in case you want to plug a laptop or other power-hungry device into it.
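
To make the “talk to each other” bit concrete, here’s a toy sketch of the idea (this is not the real USB Power Delivery message format, and the wattage numbers are made up): the two sides effectively settle on the highest level neither one exceeds.

```python
# Toy sketch of charger/device power negotiation. This is NOT the real
# USB Power Delivery protocol; the numbers are illustrative only.

CHARGER_MAX_WATTS = 30.0   # what the charger can supply
PHONE_MAX_WATTS = 15.0     # what the phone is willing to accept

def negotiate(charger_max_w: float, device_max_w: float) -> float:
    """Settle on the highest power level that neither side exceeds."""
    return min(charger_max_w, device_max_w)

print(f"Charging at {negotiate(CHARGER_MAX_WATTS, PHONE_MAX_WATTS)} W")
# -> Charging at 15.0 W
```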

That was me that asked that question 10 years ago.

I’ve never had a problem with different chargers. Of course iPhone uses a different plug. And there are different USB plugs now.

I also bought a few rechargeable batteries as external chargers that are really handy to have, especially for travel. And most hotels now have USB charging ports on their lamps. I have a lamp like that too. And then there are the charging ports in cars, or just the cigarette lighter port that you plug an adaptor into.

I’ve used them all for my and my wife’s Samsung Android phones with no issues.

I changed a receptacle in each bedroom. I get 2 standard AC outlets and 2 USB charging ports.

I keep USB-C, micro-USB, and Kindle cables in each room.

It’s an easy upgrade for anyone who’s familiar with basic electrical work. Turn off the breaker first. It takes about 15 minutes per receptacle.

As I understand it - the problem years ago was that every device had its own charger. Obviously they all had the 110V (240V in Europe?) plug, but the supply end could be any weird shape and the voltage might vary from 3.5V DC to 12V. (Laptops usually used anything from 12V to 21V DC.) Break the charger (say, wreck the fragile cable) and the device was useless unless you could find the matching voltage and plug shape. There was no standard. Either the replacement cost a fortune or you simply couldn’t find one. There was no guarantee your older phone’s charger would fit.

The EU decided to simplify things by requiring that all phones use USB to charge, and of course Apple had its own socket but USB at the other end. The result we see today is only either micro-USB, USB-C, or an Apple socket on the device, and USB-A or USB-C at the other end. Cables are easy to find.

As others mention - USB negotiates power levels between source and device for anything beyond the very basic amount of power. (The newer iterations of USB can provide much more power if the device asks for it - but only if the device asks.) The USB standards are well established.
You will not fry your device unless someone makes a USB charger that really does not conform to the USB standards (I have yet to hear of any). An added advantage is that this is now also the standard for a number of other electronic devices piggybacking on the same idea.

The cheap $5 ones are usually pretty terrible. Even those won’t usually fry a device, but that’s more due to the robustness of modern devices. And some of them are only one very small malfunction away from frying not only the device, but the person who plugs them in.

And even before the smart power negotiations in the USB standard, there wasn’t a problem with using a high-current charger with a low-current device, so long as the voltages matched. A low-power device simply won’t draw as much current, inherently, with no negotiation needed to make that happen.
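
A toy example of that point, pretending the device is a simple fixed load at 5 V (real phones are smarter than a plain resistor, but the principle is the same; the 5-ohm figure is made up):

```python
# If a device behaves like a fixed load at 5 V, the current it draws is
# set by that load (Ohm's law: I = V / R), not by the charger's maximum
# rating. The 5-ohm load is an assumed example value.
VOLTS = 5.0
DEVICE_LOAD_OHMS = 5.0

current_drawn = VOLTS / DEVICE_LOAD_OHMS    # 1.0 A
power_drawn = VOLTS * current_drawn         # 5.0 W

for charger_max_amps in (1.0, 2.4, 10.0):
    # A beefier charger doesn't push more current into the same load.
    print(f"{charger_max_amps} A-capable charger: device still draws "
          f"{current_drawn} A ({power_drawn} W)")
```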

I’ve been buying chargers like this, but I don’t know if the wattage is sufficient for some “power users”.

Usually the issue is more that the $5 ones hardly put out any wattage at all (usually just the bare minimum USB standard), and won’t charge a phone or other device in any reasonable length of time.

But now, anything that charges over USB can be charged by any USB charger, like others have said: they’ll just negotiate up to the maximum that both devices can handle.

So, in a nutshell, are most smartphones designed to filter out excess power so that they won’t ever suck in more power (for recharging) than they need, hence no danger of the battery being fried?

USB is standardized at 5V. The current draw varies by device. USB 3 defines a unit load as 150 mA. High-power devices can draw at most 6 unit loads (6 × 150 mA = 900 mA). That’s 0.9 amps.

P = I × E = 0.9 A × 5 V = 4.5 W

That’s why a 5W charger is recommended for USB 3. Wikipedia has a chart for USB 1 and 2.
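
For comparison, the same arithmetic for USB 2.0 (100 mA unit load, up to 5 unit loads) next to USB 3.x (150 mA unit load, up to 6) works out like this:

```python
# P = I * E for the base (non-negotiated) USB power levels.
# Unit-load figures: 100 mA / 5 loads (USB 2.0), 150 mA / 6 loads (USB 3.x).
SPECS = {
    "USB 2.0": {"unit_load_ma": 100, "max_units": 5},
    "USB 3.x": {"unit_load_ma": 150, "max_units": 6},
}
VOLTS = 5.0

for name, s in SPECS.items():
    amps = s["unit_load_ma"] * s["max_units"] / 1000
    print(f"{name}: {amps} A x {VOLTS} V = {amps * VOLTS} W")
# USB 2.0: 0.5 A x 5.0 V = 2.5 W
# USB 3.x: 0.9 A x 5.0 V = 4.5 W
```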

Power draw depends on the device. A clock radio might use 0.1 A. A toaster easily draws 8 A or more. They both use 120 V.

That’s not something you need to design for. It just happens, unless you go to considerable lengths to design a device for it to not happen for some reason.

They are going way past 5W these days. IIRC Apple’s iPhone is 12W or 15W (I forget) and Google is working on 30W.

How can you tell? And what’s the cost to manufacture a safe and decent-quality one, below which you’d be suspicious?

USB-C is capable of higher wattages; the newest Power Delivery revision can actually handle 48 V @ 5 A (240 W).

Of course your device needs to be designed to use that power. USB-C is actually capable of 20 V × 5 A = 100 W. My laptop can charge through its USB-C port. To achieve the higher power levels, higher voltages are used. When a device is plugged into a USB-C port, the port only supplies 5 V at first. If the device supports USB-C Power Delivery, it will communicate with the charger to tell it what voltage and current levels it is capable of handling. The charger will then change its output voltage to the negotiated level. That is how frying an older 5V-only device is prevented.
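
Here’s a rough sketch of that voltage negotiation (again, not the actual protocol messages; the device’s list is made up): the charger offers its fixed voltages, the device asks for the highest one it can handle, and anything that can’t negotiate just stays at 5 V.

```python
# Simplified sketch of the USB-C PD voltage handshake (not the real
# protocol messages). The charger advertises the fixed voltages it
# offers; the device requests the highest one it supports; anything
# that doesn't negotiate stays at the 5 V default.

CHARGER_OFFERS_V = [5, 9, 15, 20]   # common PD fixed voltages
PHONE_ACCEPTS_V = [5, 9]            # e.g. a phone that tops out at 9 V
DEFAULT_V = 5                       # every port starts out at plain 5 V

def pick_voltage(offered, accepted, default=DEFAULT_V):
    common = set(offered) & set(accepted)
    return max(common) if common else default

print(pick_voltage(CHARGER_OFFERS_V, PHONE_ACCEPTS_V))  # -> 9
print(pick_voltage(CHARGER_OFFERS_V, []))               # legacy device -> 5
```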

USB-C is also capable of bidirectional charging. I can charge a device connected to the same USB-C port that I use to charge my laptop. The direction of charging is determined by the same negotiation that decides the power level.

I don’t remember the link, but I once saw a blog by a guy who tested a bunch of them. The bad ones can have things like a sine wave varying between 0 and 10 volts instead of the constant 5 volts they’re supposed to provide, or a dropoff in voltage as the current rises. He also did teardowns, and found that some of them had woefully inadequate insulation between the 120V AC input and the 5V DC output, meaning that a very small and plausible fault could lead to the output being at 120V.

It’s not so much a matter of the price being high enough to trust (after all, one of the shoddy manufacturers could easily stick a higher price tag on one) as of going with a reputable manufacturer.

This is the key - think of a device as a resistor. (Oh, wait, it basically is). For a given voltage - i.e. the standard 5V of USB - the device itself only lets X (milli)amps through. No different than, as mentioned, the clock radio or the toaster. Of course, feed your toaster or clock radio 6000 volts, and you’d better call the fire department. Amps x Volts = Power.
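
To put numbers on the “feed it 6000 volts” scenario: for a fixed resistance, power rises with the square of the voltage (P = V²/R), which is why over-volting is what melts things, not a charger with spare current capacity. A quick illustration with a made-up 5-ohm load:

```python
# For a fixed resistance, P = V^2 / R: power climbs with the square of
# the voltage. The 5-ohm load is an illustrative value only.
R_OHMS = 5.0

for volts in (5, 120, 6000):
    watts = volts ** 2 / R_OHMS
    print(f"{volts:>4} V across {R_OHMS} ohms: {watts:,.0f} W")
# 5 V -> 5 W; 120 V -> 2,880 W; 6000 V -> 7,200,000 W
```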

Where a poorly built charger might fail is in design or construction. Does the design cause it to raise the voltage higher and higher if the current is restricted? Or does a component failure allow this (or, worse yet, put the wall socket’s 120V AC onto the 5V DC wires)? But with the basic concept and design of USB pretty commonly established, design flaws are highly unlikely. Shoddy construction? Possible. Certainly with multi-hundred-dollar phones and other devices, the device being charged should not be designed to fail catastrophically on standard USB charging.

As I understand it, the default for the higher-power chargers is to negotiate up from the basic USB voltage/current. So the "fail" case is that the device does not get the voltage or amperage it wants, and either (a) won’t work or (b) takes forever to charge.