Electric plug sockets being replaced by USBs?

You don’t want just a simple rectifier for USB power. A simple rectifier will give you an always-positive voltage, but it’ll be a very messy waveform, and that can have various undesired effects for USB-powered electronics. There’s a world of difference between a high-quality rectifier and the simplest ones.

True. The outlets are presumably little switching power supplies, probably flyback converters at these power levels.

Ok, but my point still stands. DC electronics in wall boxes are nothing new, and they do not appear to be dangerous.

A GFCI receptacle contains lots of resistors, capacitors, diodes, and transistors (schematic), but I have never heard of one catching fire due to an internal failure.

As can be seen in the schematic, there is a fuse (F1) between the 120 VAC line and the GFCI’s internal circuitry. Perhaps this is why we don’t hear about GFCIs catching fire due to shorts in the circuitry.

If you have just one AC-DC converter near the breaker panel, then you could build a fairly reliable unit to power the whole house. But (there’s always a “but”), you’d have to run wiring to all the USB outlets. That’s going to cost you if you are also running AC to the same outlets. But AC to certain sets of outlets and USB to other sets is not too bad.

You can also then have circuit breakers on different USB circuits. Speaking of which …

The “failure mode” of a shorted or heavily overloaded USB wall wart is to overheat and burn out. If you have individual AC-DC converters at each outlet, then a brief short or something is going to fry the unit, and the unit would have to be replaced. If you want users to be able to swap them out, then certain design properties are required. E.g., the device plugs into a mini-AC socket inside the receptacle. All standardized, of course. Making them individually breaker-protected and resettable adds noticeable cost.

Then there’s “almost failure mode”, when the user plugs a USB splitter into it and powers a bunch of things. Not enough to kill it, but enough to warm it up.

This is different from a dimmer switch, for example, which wouldn’t be expected to experience such excess loads years later. Improper installation should show a problem fairly soon.

Chronic heat inside a receptacle is a bad thing. Insulation gets brittle and breaks off. OTOH, I’ve seen a lot of light fixture receptacles that have been overheated for years and apparently this is considered just the way things go.

My main concern is if such outlets are used for medium-size draws. E.g., a modest flat-screen TV. Idiots will see the TV plugged into a “regular” USB port and at some point will try plugging it into a PC’s USB port, because they look the same. Pop goes the PC.

There are power-signaling methods for USB (esp. micro-USB) that let the device know what sort of USB power it can expect. You’d have to enforce that, so that plugging a big device into regular USB power doesn’t do anything.
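The best-known scheme for this on micro-USB is the USB Battery Charging (BC 1.2) spec: a dedicated charger shorts D+ to D-, so a device can probe D+ and see whether the voltage echoes back on D-. Here’s a rough sketch of that primary-detection idea; the threshold names and values are my approximations of the spec, and real detection also involves timing and secondary-detection steps this omits.

```python
# Simplified sketch of BC 1.2-style charger detection (illustrative only).
# A Dedicated Charging Port shorts D+ to D-, so a probe voltage driven onto
# D+ shows up on D-; a plain data port leaves D- near ground. Threshold
# values here approximate the spec and are assumptions, not a reference.

VDP_SRC = 0.6     # volts driven onto D+ during primary detection
VDAT_REF = 0.325  # comparator threshold on D- (spec allows roughly 0.25-0.4 V)

def classify_port(d_minus_voltage: float) -> str:
    """Return 'charging port' if D- echoes the D+ probe voltage."""
    if d_minus_voltage > VDAT_REF:
        return "charging port"   # device may draw high current
    return "standard data port"  # device must stick to enumerated current

# A charging port shorts D+/D-, so D- sits near VDP_SRC:
print(classify_port(VDP_SRC))   # charging port
# A regular data port pulls D- low:
print(classify_port(0.0))       # standard data port
```

The point of the handshake is exactly the enforcement described above: a device that doesn’t see the echo has no business drawing charger-level current.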

In what universe are people plugging TVs into USB ports for power? I don’t understand what you’re getting at. All USB ports and chargers (excluding counterfeit stuff) have overcurrent protection built in.

Like friedo said, if that’s true for your USB wall warts, then buy better USB wall warts. Designs from reputable companies will current-limit if they’re shorted, and can survive that short indefinitely. The current limit is usually implemented in silicon, so it’s faster and more accurate than a fuse or breaker (though a primary-side fuse is probably still present in case of internal faults).

The same is true for PCs and other devices with real USB ports. There should always be at least a PTC thermistor in series with Vbus, so a short (or overloaded hub, etc.) should never be dangerous, to either the equipment or human life.
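To make the contrast with a fuse concrete, here’s a toy model (my own numbers, not from any datasheet) of a constant-current-limited 5 V output: below the limit it’s a stiff voltage source, and into an overload or dead short it just holds the limit current while the voltage sags, indefinitely, with nothing to replace afterward.

```python
# Toy model of a current-limited 5 V supply (illustrative; the 2.4 A limit
# is an assumed value, not from a datasheet). Below the limit the supply is
# a stiff 5 V source; at the limit it acts as a constant-current source, so
# a dead short holds I_LIMIT at ~0 V forever instead of opening like a fuse.

V_NOM = 5.0     # nominal output voltage
I_LIMIT = 2.4   # assumed current limit, amps

def output(load_ohms: float) -> tuple[float, float]:
    """Return (volts, amps) delivered into a resistive load."""
    if load_ohms > 0 and V_NOM / load_ohms <= I_LIMIT:
        return V_NOM, V_NOM / load_ohms      # constant-voltage region
    return I_LIMIT * load_ohms, I_LIMIT      # constant-current (overload)

print(output(10.0))   # normal 0.5 A load: (5.0, 0.5)
print(output(0.0))    # dead short: voltage collapses, current held at I_LIMIT
```

Because the limit is enforced in silicon, it reacts in microseconds and recovers as soon as the fault clears; a fuse or PTC is slower, cruder, and (for a fuse) one-shot.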

When we have room-temperature superconducting extension cords, we’ll replace standard power sockets with USB.

But at the outrageous price of $38 per outlet, vs. the normal price of about 38¢ each – well, you could afford to buy a whole bunch of plug-in USB chargers for that price difference.

Power-supply failure fires used to be very common in offices in the ’80s, after “power supplies” became common there (whereas kitchen and laundry fires were rarer in offices). It would be a mistake to believe that the hazards are the same now.

Do you have a cite for this? The NFIRS data linked above shows that while reportable fires due to wall warts are rare now in absolute terms, the smallest category for which historical data is given (“transformers and power supplies”, which also includes distribution transformers, surge protectors, generators, etc.) is almost twice as common now as it was in the eighties, perhaps because of the increased popularity of equipment that requires them.

There’s a brief section on nonresidential fires, with no historical detail, just the average for 2007-2011. Over that period, nonresidential power supply fires are a little less common than residential power supply fires. So we have 3/4 of the points in the {residential/nonresidential, 1980/now} matrix, and it would surprise me if the missing {nonresidential, 1980} point (a) is “very common”, when the other three are literally struck-by-lightning rare, and (b) has the opposite trend over time from residential.

I wonder how much of the perceived danger of power supplies comes from cases where they failed in a way that emitted smoke, became warm, etc. The safety standards absolutely permit them to do that, as long as anything that could cause ignition remains confined to the product’s fire enclosure. Testing suggests that that’s safe, and statistics seem to confirm that, but maybe that’s not very intuitive.

And not quite the same thing, but here’s a standard for desktop PC power supplies:

So reasonable amounts of smoke, or a non-startling noise, are totally fine.

That’s actually a 2-pack, but it appears the going rate is around $15-20 apiece. You can surely save a few bucks buying extra plug-in chargers, but with the built-in, you get to keep the power sockets open for other things.

I’m definitely thinking about throwing a couple of these in my kitchen.

Not even going to look at the video. It’s ridiculous. Askmojo did the same shit. It is simply a clickbait video making a ridiculous claim, not even respecting the parameters they set up in the first place.

Since this is a relatively new and unproven concept, in my experience, yes, anecdote suffices for me.

I have no statistical data to support my position… Likewise, there is no data to the contrary. Cherry-picked statistics, amalgamated to suit a preconceived conclusion, have less validity here than a determination based upon anecdotal evidence. As stated before, I feel the jury is still out on this matter.

In my years of (anecdotal) experience, I feel stuffing a power supply inside a residential wall box is dangerous, and I would recommend against the practice.

Feel free (I’m sure you do) to do as you please, based upon your own experience.

What do you think is cherry-picked about the statistics that I quoted? NFIRS is the most comprehensive database of fire causes in the US that I’m aware of. Do you see statistics that point in the opposite direction?

Or, if you think there’s something special about electronics that go in the wall: What do you think makes these power supplies different from a GFI, or a dimmer switch, or other electronics that people have been safely putting in the wall for years?

Or can you otherwise describe the problem in more detail? Have you ever designed a power supply, and taken it through safety testing and approval? Do you think there are specific details of the designs on the market now that are unsafe? If so, what? What change would make them safer?

Or do you think that it’s simply impossible to build a safe 5 V power supply that fits inside a junction box? Considering all the extraordinarily dangerous places that people manage to put power electronics (explosive atmospheres, etc.) with appropriate design, that gets ridiculous.

How can it be simultaneously true that this is a new and unproven concept, and you have years of anecdotal experience?

Safety regulators make mistakes sometimes. If you think they’ve made one, it should be possible to articulate with reference to some kind of objective evidence what you think they’ve done wrong. If all we have is positions from our unexplained personal authority, then how could anyone reading know whom to believe?

TommySeven:
Power supplies generate considerable heat, compared to the components that you cite.:smack:
Power supplies tend to catastrophically fail more often than the components that you cite. :smack:
I install engineered (costly) power supplies in explosive atmospheres on a regular basis, thank you very much. They are enclosed in explosion-proof boxes, purged (and cooled) with inert gas. It is not uncommon for them to “fry”, so no, it is not reasonable to expect that your $30 cheap-o Home Depot device would be any more dependable, or safer, in a residential application.
You need to understand: The UL & NEC are not cutting edge organizations… Quite often they are years behind the times.
YMMV… Good luck.

A phase-control dimmer is more efficient than a flyback converter making +5V for USB. It’s also handling a couple orders of magnitude more power. The total dissipation is similar.
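Back-of-envelope numbers (my assumptions, not measurements) show why the dissipation comes out similar: a triac dimmer drops on the order of a volt across the device, while a small flyback’s loss is set by its efficiency.

```python
# Back-of-envelope heat comparison. All figures are assumed typical values,
# not measurements from any particular product.

# Phase-control (triac) dimmer: on-state drop on the order of 1 V.
dimmer_load_w = 600.0   # assumed lighting load
line_v = 120.0
triac_drop_v = 1.0      # assumed triac on-state drop
dimmer_diss_w = (dimmer_load_w / line_v) * triac_drop_v   # ~5 W of heat

# Flyback converter making 5 V: loss = P_out * (1/eff - 1).
usb_out_w = 10.0        # assumed USB output power
efficiency = 0.80       # assumed converter efficiency
flyback_diss_w = usb_out_w * (1 / efficiency - 1)         # ~2.5 W of heat

print(f"dimmer: {dimmer_diss_w:.1f} W, flyback: {flyback_diss_w:.1f} W")
```

So the dimmer handles roughly 60x the power but dumps a comparable few watts of heat into the same kind of box.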

And anyways, do you think the designer of the power supply is somehow unaware that it goes in a box vs. free air? The thermal resistances are not particularly hard to model. Do you think they’re making mistakes? Just randomly going for it because they’re lazy and careless?

I understand that you believe this. Can you explain why? Switching power supplies in general have existed for many decades, so this isn’t new and unproven. There is no reason for anyone–not you, and certainly not anyone else–to trust your anecdotal experience over statistics here. I quoted NFIRS statistics earlier; you are literally about as likely to get struck by lightning as to experience a reportable fire due to one of these supplies.

Do you think these statistics are wrong? If so, why? There are many reasons, malicious and otherwise, for statistics to be incorrect; but will you at least try to explain why they differ so spectacularly from your position?

So maybe you’re saying not that it’s inherently impossible to build these products safely, but that due to cost pressures, you think that won’t happen? That at least could be true. I doubt that it is, though; the pricing (~10x a cheap wall wart) leaves plenty of room for a safe design. In any case, if that’s the situation, then why would the standards change to ban them entirely, vs. requiring higher efficiency, better fire enclosures, etc.?

And again, because it’s better to have referenced facts than to trust random strangers on a message board, here’s Lutron, a manufacturer of dimmers:

In a TI application note, they show some wall wart designs with full-load efficiency around 80%. That’s enough to output 24 W with that same 6 W dissipated, or about five amps at +5V, enough for multiple ports of higher-current USB variants.
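The arithmetic behind those figures is simple: at efficiency eta, dissipation is P_out * (1/eta - 1), so you can solve for the output power a fixed heat budget allows. A quick check of the numbers as stated:

```python
# Checking the efficiency arithmetic above: at efficiency eta, heat is
# P_out * (1/eta - 1), so for a fixed heat budget,
#   P_out = P_diss * eta / (1 - eta)

eta = 0.80     # full-load efficiency cited from the app note
p_diss = 6.0   # watts of heat assumed tolerable in the box

p_out = p_diss * eta / (1 - eta)   # output power at that heat budget
amps_at_5v = p_out / 5.0           # current available on a +5 V rail

print(p_out, amps_at_5v)           # 24 W out, "about five amps"
```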

A +5V power supply in an outlet box is an ordinary engineering problem. It’s certainly possible to design an unsafe one, or even to write regulations that permit unsafe designs; but to say that the entire class of products is inherently dangerous is ridiculous.