How to calculate energy use...precisely?

Thanks; wasn’t aware of that.

I wonder why things like refrigerators, dryers, and water heaters don’t incorporate a simple power analyzer into the design. It’s done on vehicles (mpg readouts), and I wouldn’t think it would add an appreciable percentage to the price.

The Energy Guide label obviates the need for a meter.

The average consumer understands “miles per gallon” on their car and knows how much they are paying per gallon. However, most do not understand volts, amps, joules, watts, or kilowatt-hours, and few even know what their utility company charges per kWh. So a meter might add $10 to the cost of the appliance, and would provide little value. The Energy Guide sticker makes a few reasonable assumptions and expresses the energy usage in dollars per year, a number pretty much every consumer can understand. The assumptions (usage, temperature settings, electricity cost) may not exactly match each consumer’s situation, but they will allow a comparison between appliances when shopping.
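To put numbers on it, the sticker’s dollar figure is just assumed annual kWh times an assumed average electricity rate. A minimal sketch (the 450 kWh/yr and the rates below are made-up stand-ins, not the label’s actual assumptions):

```python
# Energy Guide-style estimate: assumed yearly usage times an assumed rate.
# Both numbers here are invented for illustration.
annual_kwh = 450.0
assumed_rate = 0.12    # $/kWh, stand-in for a national-average figure

print(f"Estimated yearly cost: ${annual_kwh * assumed_rate:.0f}")   # $54

# Your own rate may differ, but the ranking between two fridges holds:
my_rate = 0.26
print(f"At a $0.26/kWh rate:   ${annual_kwh * my_rate:.0f}")        # $117
```

The absolute number may be off for any given household, but since every label uses the same assumptions, the comparison between models stays valid.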

This was something that surprised me when checking all my power usage with a Kill-A-Watt. My fridge used less than 60W on average, far less than I had been led to believe.

It’s also surprising how little modern DC power supplies use: the chargers for your phones and other portable devices. While the old transformer-style supplies sucked up watts even when not in use, the current variety of switching supplies apparently uses less than a watt when idle.

Presumably, the OP is interested in this information because she wants to minimize her electric bill. That means she doesn’t actually need a perfect measurement; she just needs a measurement that’s as good as the meter on the side of her house that the electric company uses. So the “correct” way to deal with things like phase angles and power factors is to deal with them the same way the power company does, which for a residential user is probably pretty simple: the meter bills real power (kWh) and, for residential service, generally ignores power factor.
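For the sinusoidal case, the difference between what a naive volt-amp reading suggests and what the meter actually bills looks like this. A toy sketch (the 120 V / 2 A / 40° values are invented):

```python
import math

# Real vs. apparent power for a sinusoidal load. Residential meters bill
# real power (kWh); apparent power (VA) is what naive V*A would suggest.
v_rms = 120.0       # volts RMS
i_rms = 2.0         # amps RMS
phase_deg = 40.0    # current lags voltage by this angle (made up)

apparent_va = v_rms * i_rms
real_w = apparent_va * math.cos(math.radians(phase_deg))

print(f"Apparent power: {apparent_va:.0f} VA")
print(f"Real power:     {real_w:.0f} W  (power factor {real_w/apparent_va:.2f})")
```

So a cheap plug-in meter that tracks real power is already measuring the same quantity the utility charges for.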

I wonder if it would backfire as much as it would help.

Person A likes to stand at the fridge with the door open for long periods of time and spends $20 on electricity.

Person B has two kids, but they eat out all the time. They only spend $15.

Which fridge is more efficient? We really have no idea. It could be that A has the better fridge and he’s just using it like a moron.

Good points.

I can imagine “in the future” incorporating some smarts into a home’s circuit-breaker panel. The panel would have a built-in power analyzer that measures and digitizes the current on each circuit, along with the 120/240 V voltage. It would compute average power on each circuit and send it to a computer via Wi-Fi or something. The most power-hungry appliances are usually on dedicated circuits, so such a system would let the homeowner easily keep track of the power used by each of them.
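The per-circuit math itself is simple: sample voltage and current together and average the instantaneous product. A hypothetical sketch (the sample count, amperage, and phase angle are invented; a real panel would read ADCs rather than synthesize sine waves):

```python
import math

def average_power(v_samples, i_samples):
    """Real (average) power in watts from simultaneous voltage/current samples."""
    return sum(v * i for v, i in zip(v_samples, i_samples)) / len(v_samples)

# Fake one 60 Hz cycle of a 120 V RMS circuit drawing 5 A RMS at a 30-degree lag.
n = 200
v = [120 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [5 * math.sqrt(2) * math.sin(2 * math.pi * k / n - math.radians(30))
     for k in range(n)]

print(f"{average_power(v, i):.0f} W")   # ~520 W = 120 * 5 * cos(30 deg)
```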

There are already such devices out there (look for things like the TED “The Energy Detective” or “whole house energy meter”). Google even ran its PowerMeter experiment, which would integrate that data into your account, before giving it up as ahead of its time. No one really cares. Electricity is so cheap it’s hardly noticeable unless you’re running electric heaters or old-fashioned incandescent bulbs a lot. All the other stuff (TVs, fridges, cell phones, computers) combined usually adds up to $20 a month or less.

Whole-house monitoring doesn’t always have the specificity you need for residential behavior change; it just tells you in near real time what your utility tells you at the end of the month. Many areas already have smart meters, and you can look at the meter’s readout (or interface with it using infrared transmitters and such) to achieve the same thing, but nobody really cares.

It’s an expensive solution in need of a problem, IMO :frowning: Though it is pretty cool!

Uh huh. My cable box alone, when off, draws 80 watts. That’s 58 kW-h per month, or $15 at my marginal rate of $0.26/kW-h.

Wow, sorry to hear that. Both your cable box and your electricity price suck :frowning:

My computer doesn’t even use that much power when it’s running at full tilt. How did you measure that?

My computer is about 100 watts when on, and <1 W in sleep mode, so that at least isn’t too bad.

I used a Seasonic PowerAngel, basically a clone of the Kill-A-Watt mentioned above. Just plug it into the wall, plug the device into it, and read the numbers. In this case I could just watch the instantaneous power, though as others said, it has an integration mode for accurate measurement of devices with a duty cycle (refrigerators, etc.).
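To see why the integration mode matters for duty-cycled loads, here’s a toy simulation (the 180 W running / 2 W idle / 30% duty-cycle figures are invented, not measurements of any real fridge):

```python
import random

# Integration mode, roughly: accumulate watt-hours over a long window,
# then divide by elapsed time to get the true average power.
interval_h = 1 / 60           # one sample per minute
energy_wh = 0.0
hours = 0.0

for _ in range(24 * 60):      # simulate one day
    compressor_on = random.random() < 0.30        # invented 30% duty cycle
    power_w = 180.0 if compressor_on else 2.0     # invented wattages
    energy_wh += power_w * interval_h
    hours += interval_h

print(f"Average power: {energy_wh / hours:.0f} W")   # ~55 W, not 180 W
```

An instantaneous reading would show either 180 W or 2 W depending on when you looked; the integrated average is what your bill actually reflects.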

My receiver was also pretty bad in standby mode (50 W, IIRC), but it at least has an auto-shutoff that puts it into a deep sleep with negligible draw. Still, I wouldn’t have known if I hadn’t gone around the house with the meter, measuring everything I could.

My pricing information is a tad out of date, actually. I switched from a straight tiered plan to a time-of-day plan last month and haven’t worked out the details yet. The top rate is now $0.33/kW-h for peak hours and only $0.11 for off-peak. 88% of my usage is off-peak so the plan has saved money so far. But devices with high standby power kinda screw me over because they use power at all times of day. So I’ve really been looking around for the various power vampires.
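Here’s the rough arithmetic on why always-on loads hurt under time-of-use (the $0.33/$0.11 rates and my 88% off-peak split are from above; the 6-hour peak window is a guess):

```python
peak_rate, offpeak_rate = 0.33, 0.11    # $/kWh, from my plan

# My overall mix: 88% off-peak.
overall = 0.12 * peak_rate + 0.88 * offpeak_rate
print(f"Blended rate, my usage:  ${overall:.3f}/kWh")    # ~$0.136

# An always-on vampire runs through every peak window, every day.
peak_hours = 6                          # guessed length of the peak window
peak_frac = peak_hours / 24
vampire = peak_frac * peak_rate + (1 - peak_frac) * offpeak_rate
print(f"Blended rate, always-on: ${vampire:.3f}/kWh")    # ~$0.165
```

So standby power effectively costs me about 20% more per kWh than the rest of my usage.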

At any rate, the lesson is really that you need to actually measure these things if you want to be sure you aren’t wasting money. There’s a lot of really badly-designed garbage out there.

Surge protectors slay energy vampires. Plug into the surge protector instead of the wall socket. One switch turns everything off. Simple and efficient.

We put up with our satellite receiver box eating power. Killing power means a boot-up, and that takes almost 60 seconds; the device isn’t designed to be totally powered off daily. So we just turn it off with the remote.

Related: cable TV boxes use a crazy amount of power for no apparent reason.

When I see the calorie listings for foods, I often wonder just how accurate they really are. I imagine they go by some average rating by weight, but it seems to me that if a grape can range from very sour to very sweet, there must be a lot of variation in its sugar content. So how precise are the calorie counts on labels, actually?

I can’t read the article right now, but I did find this rebuttal by the NCTA, in which they claim set-top boxes average just 17 watts over the course of a year. I can’t think of a reason it should use any more than that, unless it’s a DVR with an always-spinning hard drive, or an ancient thing that employs a transformer-based power supply (instead of a more modern SMPS).

I don’t have the data to back this up, but I’d guess that the difference between two similar-sized grapefruits is trivial compared to, say, whether you have that soda with your lunch.

In non-processed foods, differences in recipes (sauces, oils, serving sizes, etc.) probably outweigh the difference between Broccoli A and Broccoli B. In processed foods, the additives (like sugar) probably outweigh the intrinsic caloric properties of whatever food you’re wondering about. This article says the FDA requires the actual calorie count to be within +/- 20% of the stated amount, but I don’t know how they verify or enforce that.
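For scale, here’s what a +/- 20% tolerance allows (the 200 kcal label value is made up):

```python
# A label claiming 200 kcal could legally contain anywhere in this range.
stated_kcal = 200
low, high = stated_kcal * 0.8, stated_kcal * 1.2
print(f"{low:.0f}-{high:.0f} kcal")   # 160-240 kcal
```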

For calorie counters, I think the awareness of, and limits on, what they’re eating, plus recognizing the need to exercise to burn more calories, probably matter a lot more than the 50-100 or so extra kcal you’d get from a twice-as-sugary grapefruit.

There’s no reason it must be that way (and even 17 watts is absurd), but that doesn’t mean it isn’t. The rebuttal doesn’t break down the kinds of boxes, but it could easily include a large portion of basic converters (no DVR or guide features or anything, just basic channel decoding). These could bias the figures so as to dilute the inefficiency of the full-featured boxes.
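A toy illustration of how that dilution could produce a 17 W fleet average (the mix and wattages below are invented, not from the NCTA rebuttal):

```python
# Share of fleet and average watts per box type: all numbers hypothetical.
boxes = [
    ("basic converter",   0.70,  8.0),
    ("full-featured DVR", 0.30, 38.0),
]

fleet_avg = sum(share * watts for _, share, watts in boxes)
print(f"Fleet average: {fleet_avg:.0f} W")   # 17 W, despite 38 W DVRs
```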

Their point that the “500 watt” rating on the back is irrelevant is a valid one, though.