Operating voltage tolerance

What is the plus or minus percentage of voltage tolerance for a household appliance rated for 120 V?

It depends a great deal upon what the appliance is.

Devices with switch-mode power supplies (often sold as "universal input" supplies) are commonly rated for about 90 to 250 volts. They internally convert and regulate their output, and so are reasonably insensitive to the input. A well-designed one will simply switch off in the face of an input outside specification (at least until the voltage becomes so insane that the supply is simply overwhelmed and destroyed).

Conventional filament lightbulbs begin to show shortened lives as the voltage creeps up, and that shortening ramps up very quickly. At 20% over you can assume noticeably short-lived (and bright) bulbs. Even 10% is not good.
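If you want numbers, a commonly cited rule of thumb is that incandescent bulb life scales as roughly the inverse 13th power of the voltage ratio (the exact exponent varies by source and bulb type, so treat this as illustrative only):

```python
# Rule-of-thumb model: bulb life ~ (V_rated / V_actual)^13.
# The exponent of ~13 is a commonly cited approximation; real bulbs vary.
def relative_bulb_life(v_actual, v_rated=120.0, exponent=13.0):
    return (v_rated / v_actual) ** exponent

for overvoltage in (0.05, 0.10, 0.20):
    v = 120.0 * (1 + overvoltage)
    print(f"{overvoltage:.0%} over ({v:.0f} V): "
          f"~{relative_bulb_life(v):.0%} of rated life")
# 5% over (126 V): ~53% of rated life
# 10% over (132 V): ~29% of rated life
# 20% over (144 V): ~9% of rated life
```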

Power dissipation in any resistance is proportional to the square of the voltage. (This is why lightbulbs are so sensitive.) Kitchen ranges, heaters, and the like will start to show distress reasonably quickly. You can assume the designers have built in a margin, but again, I would hate to see 20% excess (a 44% power increase). The same brutal square law comes into play with undervoltage: a 20% reduction in voltage (a 36% drop in power) and you are going to see a noticeably less capable cooker.
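The square law is easy to check numerically; a trivial sketch for the nominal 120 V case:

```python
# Power in a fixed resistance goes as V^2 / R, so relative power
# scales as (V / V_nominal)^2.
def relative_power(v, v_nominal=120.0):
    return (v / v_nominal) ** 2

print(relative_power(144))  # 120 V + 20% -> 1.44, i.e. +44% power
print(relative_power(96))   # 120 V - 20% -> 0.64, i.e. -36% power
```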

Most things are not damaged by undervoltage; they just work poorly, until they might as well not be working at all. Motors, on the other hand, tend to simply stall. Synchronous motors are particularly vulnerable since they must run at a speed defined by the line frequency; if there isn't enough voltage to drive them, they simply stop. They can't run at a reduced speed. A refrigerator compressor motor is a classic device that does not cope with undervoltage. At too low a supply voltage it stalls and then burns out.

Supply guarantees seem to run about ±6% to ±10%, sometimes in combination. They may be specified as a 5-minute average, which allows for short-term excursions. I would be surprised at a device that was worried by a 10% variation; possible, but poorly designed (IMHO).

According to Wikipedia, ±5% is the standard for wall outlets. The tolerance of the appliance is probably wider; I've seen 110–125 V a lot. It really depends on the device, so read the label or nameplate.

Switching power supplies are popular for digital devices these days, and they have a wide acceptance range. For example, I think my laptop charger can accept 110–250 V.

Thank you both for your explanations. Another somewhat related question: are step-down transformers responsible for voltage variations, or is it the power they are receiving off the transmission line? Or both?

I understand voltage will decrease over distance. But for this question, let's take a reading right at the discharge side of the transformer.

It’s the system as a whole.
Low primary side voltage will result in low secondary side voltage. Extra loads on either side of the transformer will result in lower than average voltage.

How do you know your voltage is low?

A transformer for power delivery can usefully be thought of as a voltage-converting device. (In theory it isn't so simple: it is really an impedance-transforming device, but since the impedance of the mains supply is as close to zero as we could care about, the approximation of a voltage source and pure voltage transformation is fine.) Power is voltage multiplied by the current drawn, and that depends upon the downstream load placed on the transformer.
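To make the impedance-transforming view concrete, here is a minimal sketch of the ideal-transformer relations (the turns ratio and load value are made up for illustration):

```python
# Ideal transformer: V_s = V_p * (N_s / N_p), and a load Z on the
# secondary appears at the primary as Z * (N_p / N_s)^2.
# The 60:1 ratio and 10-ohm load below are illustrative, not real values.
def secondary_voltage(v_primary, n_primary, n_secondary):
    return v_primary * n_secondary / n_primary

def reflected_impedance(z_load, n_primary, n_secondary):
    return z_load * (n_primary / n_secondary) ** 2

print(secondary_voltage(7200.0, 60, 1))   # 120.0 V at the secondary
print(reflected_impedance(10.0, 60, 1))   # 36000.0 ohms seen at the primary
```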

So, you ask about voltage variations. Assuming that the generators are capable of supplying the load, and are not themselves a culprit in the voltage variation, the rest of the problem is losses in the distribution system. These come in two forms, colloquially termed copper and iron. In both cases they amount to a resistance to current flow: either simply resistance in the wire, or losses in the core of the transformer. Core losses arise because the iron conducts, and thus, even when a lot of care is taken to reduce the effect, there is still some electrical current flow (eddy currents) through the core itself, which just heats it up, plus other losses (hysteresis) in the iron as the magnetic field varies. Wires are usually copper, transformer cores are iron. Transformers contain a lot of wire too, formed into the windings, so there are copper losses in there anyway.

In both cases the power system is usefully modelled as a perfect voltage source (the generator) in series with a resistance (the sum of the iron and copper losses). When little current flows, the voltage drop across this resistance is low, and the delivered voltage is close to that of the generator. As the current increases, the voltage drop increases; so as the power delivered to the load goes up, the delivered voltage sags. The worst of the losses occur in the wires, but some occur in the transformers. Have a look at any large power transmission transformer: they have large heat radiators, and are filled with oil, to keep them cool. All due to the power being lost inside them, which is directly related to the voltage drop.
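That "perfect voltage source plus one lumped resistance" model is easy to play with; a minimal sketch, with an assumed (made-up) series resistance:

```python
# Model the whole path as an ideal source in series with one lumped
# resistance standing in for the copper and iron losses. Delivered
# voltage sags as load current rises; loss in the line grows as I^2 * R.
V_SOURCE = 120.0  # volts, ideal generator
R_SERIES = 0.5    # ohms, lumped losses (assumed value, for illustration)

for load_current in (0.0, 10.0, 20.0, 40.0):
    v_delivered = V_SOURCE - load_current * R_SERIES
    p_lost = load_current ** 2 * R_SERIES
    print(f"{load_current:5.1f} A: {v_delivered:6.1f} V at the load, "
          f"{p_lost:6.1f} W lost along the way")
```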

A specific issue with power transformers is that they can saturate. Because they use an iron core to hold the magnetic field, they have a hard limit on the maximum flux that can be crammed in. Once that limit is reached the transformer is saturated and ceases to pass any further power. Lightning strikes and solar flares can cause large DC (or near-DC) currents to flow in the transmission wires, and this can saturate the transformers, which may cause protection systems to drop power to the system. Even if it doesn't, you can get some weird interactions and fluctuations: flickering lights as lightning strikes, for instance.
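For a feel for how little headroom a core has, the standard transformer EMF equation gives peak flux density as B = V / (4.44 f N A); a sketch with made-up winding and core numbers:

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * A * B_peak,
# so B_peak = V_rms / (4.44 * f * N * A). Any quasi-DC current adds a
# constant flux offset on top of this AC swing; once the combined peak
# exceeds the saturation flux density, the core saturates each half-cycle.
# All numbers below are assumed, purely for illustration.
V_RMS = 240.0   # volts
FREQ = 50.0     # Hz
TURNS = 144     # winding turns (assumed)
AREA = 0.005    # core cross-section, m^2 (assumed)
B_SAT = 1.7     # tesla, typical-ish for grain-oriented steel

b_peak = V_RMS / (4.44 * FREQ * TURNS * AREA)
print(f"AC peak flux density: {b_peak:.2f} T")                # ~1.50 T
print(f"Headroom before saturation: {B_SAT - b_peak:.2f} T")  # ~0.20 T
```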

I assume then that it works this way for high voltage as well?

Edit: the two previous posts weren’t visible at the time I posted this question.

Thanks again for the answer to my query.

Yes.
In practice, the grid is pretty well regulated. If you consistently have high or low voltage, the power company can select a different tap on the transformer and get your voltage into spec.

But if it is not consistent, then it can become a real problem.

I worked at a set of high-rises near the San Jose airport. The building main was 480 volts, and we got 480 volts in mild winter weather. But on hot days the voltage would begin to drop; I have seen it below 430 volts. And we would get complaints from the tenants that their servers' battery-backup low-voltage alarms were going off. The low side would dip below 110 VAC.
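For what it's worth, that matches what you'd expect if the low side simply tracks the 480 V side proportionally:

```python
# If the secondary tracks the primary proportionally, a 480 V service
# sagging to 430 V drags a nominal 120 V circuit down with it.
print(120.0 * 430.0 / 480.0)  # ~107.5 V, consistent with "below 110 VAC"
```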

I live in rural Thailand. Nominal voltage is 220 V, but in practice it is usually 150 V, sometimes less. :smack: Most things work fine, except for fluorescent tube lamps, which have trouble starting.

Our automatic water pump makes a sad noise when the voltage gets below about 140 V, and we have to turn it off. Thanks for mentioning the refrigerator; maybe I should turn it off too next time we have to turn the pump off.

Once, I think the utility repairman hooked up the neutral wrong and we got 380 V for a few seconds! :smack: That damaged two items.