Electrical Question (measured amps of household devices)

I bought a watt meter to get a sense of what kind of power my various devices are using. So far, the measured watts seem to match pretty closely the theoretical specs given by each device manufacturer. But, oddly, the amps don’t seem right. My understanding is that watts/amps = voltage. When I run that equation (my house is 120v), I get amps that are lower than the measured level.

Am I missing something fundamental here, or is my meter wrong?

Seems likely that it should work. Maybe you should give some examples, e.g. power and current readings that don’t match. Is it off by just a little or by quite a lot?

The amperage rating that you’ll usually find on appliances is the maximum that that appliance will ever draw, not necessarily what it draws most of the time. You need to plan for the maximum when designing cords and the circuits in the wall and the like.

Carl Pham: Here’s an example. My receiver draws 37.5 watts when on but not playing audio. But the watt meter says the amp draw is .5 amps. That’s off by quite a bit, no?
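
For what it’s worth, here’s that mismatch as a quick back-of-the-envelope check in Python (the 120 V, 37.5 W, and 0.5 A figures are the ones quoted above; nothing else is assumed):

```python
# Back-of-the-envelope check of the receiver numbers quoted above.
volts = 120.0        # nominal line voltage
watts = 37.5         # real power shown by the meter
measured_amps = 0.5  # current shown by the meter

# Naive "amps = watts / volts" estimate (only valid for a purely resistive load):
expected_amps = watts / volts
print(f"expected amps: {expected_amps:.3f} A")   # ~0.313 A
print(f"measured amps: {measured_amps:.3f} A")   # 0.500 A, noticeably higher
```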

Chronos: You’re talking about the specs right? I’m talking about the actual measured amps simultaneous with the watt reading.

While that’s generally true for DC circuits, the power factor (due to reactive loads) in AC circuits can throw that off.

It’s probably “power factor.”
On AC circuits, if the current is not in phase with the voltage, you can’t simply multiply volts*amps to get watts. For inductive devices (like the transformer in your receiver), the current will lag the voltage by some significant amount.
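
A minimal sketch of what that looks like numerically, assuming a simple sinusoidal load; the 51.3° phase angle is an invented value chosen so the numbers line up with the receiver example above, not anything measured:

```python
import math

# Illustrative only: real vs. apparent power when current lags voltage.
# The phase angle is an invented example value, not a measurement.
v_rms = 120.0      # RMS line voltage
i_rms = 0.5        # RMS current the meter reports
phase_deg = 51.3   # angle by which current lags voltage (inductive load)

apparent_power_va = v_rms * i_rms
real_power_w = apparent_power_va * math.cos(math.radians(phase_deg))

print(f"apparent power: {apparent_power_va:.1f} VA")   # 60.0 VA
print(f"real power:     {real_power_w:.1f} W")         # ~37.5 W
```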

ETA: sniped!

You can’t link me to wiki on SOPA blackout day!

Can you explain it to me in 5th grader terms, preferably with analogy to a garden hose? :smiley:

Sorry, I keep forgetting not everyone blocks JavaScript. Been using Wikipedia all day. Anyway, here’s another explanation of power factor. Not sure how I can turn it into a garden hose analogy, except to say that besides the amplitude of the load, there is also an angle which determines how out of sync the resulting current sinusoid is with respect to the driving voltage sinusoid.

Does your watt-meter happen to have a VA measurement setting? That is actually Volts x Amperes. The power factor in base terms is real power (Watts) divided by VA.
Some meters will even give a power factor reading. Whether it does or doesn’t, it should also tell you somewhere exactly how it’s measuring power (e.g. average vs. instantaneous, etc.)
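
In other words (a sketch, assuming your meter shows both readings; the 60 VA figure is just 120 V × 0.5 A from the earlier post, not an actual VA reading):

```python
# Power factor from two meter readings: real power (W) divided by apparent power (VA).
watts = 37.5          # real power reading from the earlier post
volt_amperes = 60.0   # 120 V * 0.5 A; a stand-in for an actual VA reading

power_factor = watts / volt_amperes
print(f"power factor: {power_factor:.3f}")   # 0.625 for this (hypothetical) case
```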

Incidentally if it is a power factor issue, then you only need to worry about the watts reading. A simplified way of stating it is that with alternating current, there can be some amount of current that doesn’t get used by the load. It merely flows back and forth. In most, possibly all utility systems you only get charged for the real power dissipated in your house, not for this extra power. The utility company still has to have the capacity to deliver it, even if it’s temporarily returned to them each cycle.
Of almost insignificant concern to you, but of great concern to the utility, is that there are real losses due to the current flow since the wires aren’t perfect; extra current means extra loss. So you would use a tiny bit more power if you have a lot of loads with poor power factors.
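
A rough illustration of that last point, with an invented line resistance just to show the scaling: same real power delivered, but more current and therefore more I²R loss at the lower power factor.

```python
# Illustrative only: line losses grow with the square of the current,
# so the same real power at a lower power factor costs the utility more.
real_power_w = 37.5           # real power delivered to the load
volts = 120.0
line_resistance_ohms = 0.05   # invented wiring resistance, for illustration

for pf in (1.0, 0.625):
    current = real_power_w / (volts * pf)
    loss_w = current ** 2 * line_resistance_ohms
    print(f"PF {pf}: current {current:.3f} A, line loss {loss_w * 1000:.1f} mW")
```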

Very interesting. I lack the fundamental understanding of electricity to really follow the power factor stuff. But I’ll spend a few hours on HowStuffWorks and try to sort it out.

The meter does have a VA measurement, which seems to read a bit higher than the watts; if I understand correctly, that’s consistent with the power factor issue.

True since you specified “house”, but for a big industrial operation like a factory, or possibly even just a large office building, the utility company will charge extra for it. So it’s to a business’s advantage to buy and install devices to compensate for the power factor, so they won’t have to pay the power company for that “extra power”. Basically, if your load is overall inductive (as is the typical case, since motors are inductive), you stick some big capacitors in the circuit, and vice versa if your load is overall capacitive.
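
For the curious, here’s roughly how that capacitor sizing works out, as a sketch with invented load numbers (a hypothetical 10 kW motor load on a 480 V feed); a real installation is more involved than this:

```python
import math

# Rough sizing of a power-factor-correction capacitor for an inductive load.
# All numbers here are invented for illustration only.
real_power_w = 10_000.0   # hypothetical 10 kW motor load
pf_old = 0.70             # existing (lagging) power factor
pf_new = 0.95             # target power factor
v_rms = 480.0             # hypothetical industrial supply voltage
freq_hz = 60.0

# Reactive power the capacitor bank has to cancel:
q_needed_var = real_power_w * (math.tan(math.acos(pf_old)) - math.tan(math.acos(pf_new)))

# Capacitance that supplies that reactive power at this voltage and frequency:
capacitance_f = q_needed_var / (2 * math.pi * freq_hz * v_rms ** 2)

print(f"reactive power to cancel: {q_needed_var:.0f} VAR")
print(f"capacitance needed:       {capacitance_f * 1e6:.0f} uF")
```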

Also, older (ten years or more) switch-mode power supplies that power a lot of electronic devices had pretty poor power factor in the capacitive direction, since the mains were fed directly into a rectifier and capacitor circuit. Not really an issue on a house-by-house basis, but add them all up and it becomes a pain in the ass for the power company. Regulations were put in place about ten years ago that dictated a minimum allowable power factor.

It would be interesting to learn how old the OP’s receiver is.

Did you measure the house voltage? In real life, nominal 120 V house voltage can vary from about 110 to 125 V. My mom’s house consistently measures 125 V, and mine reads 115 V.

Is your watt meter measuring watts or watt hours? Watt hours are equivalent to how much water is in the bucket the hose runs into after an hour.

Your fridge may draw 5 amps in defrost mode, 3 amps in cooling, and none at all between cycles. The popular Kill-a-Watt measures the watt hours over the time the device is plugged into it.
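
A small sketch of the watts-versus-watt-hours distinction using those fridge figures; the hours spent in each mode are invented for illustration, and power factor is ignored:

```python
# Illustrative: watts vs. watt hours for a load that cycles between modes.
# The amp figures come from the fridge example above; hours per day in each
# mode are invented, and power factor is ignored to keep it simple.
volts = 120.0
modes = {
    "defrost": (5.0, 0.5),   # (amps, hours per day)
    "cooling": (3.0, 8.0),
    "idle":    (0.0, 15.5),
}

watt_hours_per_day = sum(volts * amps * hours for amps, hours in modes.values())
print(f"energy per day: {watt_hours_per_day:.0f} Wh "
      f"({watt_hours_per_day / 1000:.2f} kWh)")
```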

Yeah, the real voltage is like 120.5.

It measures both. For the purposes of this thread, I’ve been talking about watts.