What information and formulas do I use to calculate a particular item’s actual energy use? Watts X ???
Energy is measured in joules, which you can calculate by [del]dividing[/del] multiplying the wattage by the time that the device is used. But I strongly suspect that this is not the information you’re after. What are you actually trying to do?
ETA: wrong arithmetic
Figuring that the OP is used to US units, I think you are looking for:
1kW = 3412 BTU/hr
or 1 W = 3.4 BTU/hr
If it’s electricity you’re wondering about, get a watt-meter.
Energy can be measured in many units: watt-hours, joules, calories, BTU, etc. They’re all interconvertible. Different fields/industries use different units based on tradition, ease of use, and such. Home electricity is typically measured in kilowatt-hours per month for total household use, or watts per appliance if you want to know how much your TV uses, for example. Natural gas is usually measured in therms. Heater output is usually measured in BTU for whole-house or gas heaters, or watts for electric space heaters.
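Since these units all interconvert, here is a small sketch of the conversions mentioned above (the factors are the standard ones; the function names are just for illustration):

```python
# Standard energy-unit conversion factors.
J_PER_KWH = 3.6e6      # 1 kWh = 3.6 million joules
J_PER_BTU = 1055.06    # 1 BTU is about 1055 joules

def kwh_to_btu(kwh):
    """Convert kilowatt-hours to BTU."""
    return kwh * J_PER_KWH / J_PER_BTU

def watts_to_btu_per_hr(watts):
    """Convert a power in watts to BTU per hour."""
    return watts * 3600 / J_PER_BTU

print(round(watts_to_btu_per_hr(1000)))  # 3412, matching the 1 kW = 3412 BTU/hr figure above
```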
I’m trying to figure out the kilowatt hour use, ultimately. I’m trying to determine from available information about my items what they are contributing to my kilowatt hour use.
Okay, I found a formula, but it must not be complete, at least for everything.
My fridge is 115 volts, 8.3 amps - multiplying one by the other gives the watts: 954.5, call it 954.
So my fridge is 954 watts.
I run it 24 hours daily.
24 hours × 954 watts, divided by 1000 = daily kilowatt-hours; ×30 = monthly.
Which in this case means my fridge is slurping up 686 kilowatt-hours per month? Is that likely? Another page stated that the biggest energy suck in an average home is the water heater, at 192 kWh, followed by the dryer at 80-something, and then the fridge at 54 kWh.
Clearly there is a piece of all this I am missing in order to determine what MY fridge is using, or at least approximately what it is using…
I see another page referring to surge vs. continuous…
What does it all mean??
The power figure on the nameplate is most likely the maximum, what you’d draw with the temperature dials set all the way down to the coldest setting. Long story short: your fridge is not running 24/7, only a small fraction of each day, like after you open the door to take things out, or when you put in things that are at room temperature. The rest of the time the compressor only comes on occasionally to hold the set point, because room-temperature air warms up the insides a slight amount.
Sorry, to answer the OP: you would need to buy an ammeter designed to measure the current going into the unit and calculate from there.
That’s the apparent power which for a motor load (the motor driving your fridge’s compressor) is unlikely to be the real power.
Also, the real power is only consumed when the compressor is running.
Actual energy usage will depend on a lot of factors (motor and compressor efficiency by design, degradation of compressor efficiency from loss of refrigerant, quality of insulation, how often you open the fridge’s door).
Practically speaking, you cannot calculate your fridge’s energy usage; you need to measure it by putting a power meter, such as this one, into the circuit.
I have a power meter, similar to Mops’s above. Mine is attached to the mains input so it shows, via a radio link, every time an appliance is switched on, or when the fridge kicks in.
It is interesting and educational to see where the energy goes, for a while at least. I had some halogen spot lamps that used a lot of power, for example; they are now LEDs. Not much you can do about things like the electric kettle except not boiling more than is needed.
I could also see if anyone had left lights on upstairs which was handy and might be really useful for anyone with teenagers in the house.
Your amps figure was measured at peak (surge? operating?) usage, and not the average…
Your refrigerator has a thermostat, so it runs its compressor some of the time (when the inside is warmer than the maximum allowed) and does nothing, consuming essentially zero power, a lot of the time (when the thermostat has decided that the contents are below the minimum). The range built into the thermostat is there to ensure that the compressor is not turned on and off too fast, e.g. on for a second and off again for three seconds, as that would wear out the parts far faster. (Each start/stop causes more wear than many minutes of simply turning at constant velocity; you’d need the Mythbusters to be sure of how many.)
Watts are Joules per second.
So a kilowatt-hour is 3.6 million joules (1000 J/s × 3600 s).
The only way to know what the fridge uses is to measure energy (as joules or as watt-hours) over a long time…
And even if you had all of the relevant information for the fridge, or maybe even one that the guys at the factory experimented with, you still couldn’t calculate it, because it’d depend on difficult-to-control things like how many times per day it’s opened, and for how long each time.
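The watts-and-joules arithmetic above, spelled out, along with the measure-over-a-long-time idea (the 1.8 kWh reading below is a made-up example, not a measurement):

```python
# 1 watt is 1 joule per second, so:
joules_per_kwh = 1000 * 3600   # 1 kW sustained for 1 hour = 3,600,000 J
print(joules_per_kwh)

# Measuring total energy over a long stretch gives the true average power.
# Hypothetical reading: a plug-in meter logs 1.8 kWh over 24 hours.
measured_kwh = 1.8
hours = 24
avg_watts = measured_kwh * 1000 / hours
print(avg_watts)   # 75.0 W average, far below the 954 VA nameplate figure
```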
You can also check the Energy Star listings for your fridge to see if the gov already estimated its power draw:
Reply’s link gives a range of 150-855 kWh/year, depending on size and efficiency. If your fridge is less than 20 years old and you’re not getting a number in that ballpark, then you’re probably doing it wrong.
Power consumption meters like the Kill A Watt are really nice. The least expensive are under $20, and you would need to use a heavy appliance extension cord (to see the display) for something like a fridge. You can run it for a day and it will give good average-use results.
Rough rule of thumb - heaters use the most, big motors next (fridge compressors, etc.), then lights (not CFL or LED), then entertainment appliances.
This reminds me of some calculations I’d do for my computer. The math said that my laptop computer was using something like 54 W according to volts/amps/etc. But my laptop has an actual energy monitor in there. For basic web browsing, I’m only using about 20-25 W. I’ve never seen actual energy consumption top 45 W. Really, this isn’t surprising, since few devices would be designed to run at 100% nonstop. You want to build it to use something less than the maximum.
So an actual meter is the only way you’re going to get useful information about what appliances are doing. Math will only get you some theoretical maximum.
To recap,
- Multiplying the voltage and current listed on the nameplate is almost never a good method for calculating power, as you’re calculating the apparent power (VA) not the real power. It works O.K. for a light bulb or heater (and only when they’re on, obviously). But it doesn’t work well for anything electronic or with a motor in it. And it doesn’t work well for anything that is powered all the time and has a duty cycle less than 100%, such as a refrigerator. Having said that, it is certainly possible to estimate the average real power by doing some additional math that includes duty cycle and an estimation of power factor.
Other than that, you have to make some real-world measurements.
- You could measure the voltage and current using $20 handheld DMMs and multiply the two numbers, but you’ll probably get the wrong answer, for two reasons: 1) you’re assuming the phase angle between the voltage and current is 0 degrees, and 2) the waveforms may not be pretty-looking sine waves, and your el cheapo DMMs assume they’re sine waves. (Note that this is more of an issue for the current waveform, as the voltage waveform does usually look like a nice sine wave.)
- You could measure the voltage and current using $80 handheld DMMs that measure the “true RMS,” and multiply the two numbers. But you’re still assuming the phase angle between the voltage and current is 0 degrees.
- You could measure the voltage and current using $80 handheld DMMs that measure “true RMS,” and multiply the two numbers. And then measure the phase angle using an old oscilloscope. This will give you a fairly accurate answer, but is not very elegant.
Also note that the maximum current a handheld DMM can measure is not all that high. Most can only directly measure up to 2 A or 3 A. The highest I’ve seen is 10 A. You may also need to use a shunt or a clamp-around ammeter…
So unless you have access to a scope, using handheld DMMs pretty much sucks if you’re trying to measure real power with any degree of accuracy. So…
- You could use a fancy new Tektronix digital oscilloscope and a clamp-around current probe to simultaneously measure voltage and current. The scope will multiply the two waveforms and give you power. Pretty awesome, but I hope you have deep pockets.
- You could use a power analyzer. This is the “ultimate” approach. These units use high-speed ADCs to directly and simultaneously measure voltage and current, multiply them, and calculate average power. (We have one at work.) For Joe Homeowner, the Kill A Watt[sup]TM[/sup] (as linked to by Reply) will suffice nicely. But as others have noted, you still have the duty cycle issue to contend with. And as noted by Chronos, there may be quite a bit of day-to-day variability.
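As a rough illustration of the recap’s first bullet, here is what estimating average real power from the nameplate rating might look like. The phase angle and duty cycle below are illustrative guesses, not measurements:

```python
import math

volts = 115
amps = 8.3                        # nameplate current (compressor running)
apparent_va = volts * amps        # 954.5 VA, the nameplate product

phase_deg = 40.0                  # assumed phase angle between voltage and current
power_factor = math.cos(math.radians(phase_deg))   # ~0.77 for this guess

duty_cycle = 0.30                 # assumed: compressor runs ~30% of the time

running_watts = apparent_va * power_factor    # real power while the compressor runs
avg_watts = running_watts * duty_cycle        # long-term average real power
kwh_per_month = avg_watts * 24 * 30 / 1000
print(round(running_watts), round(kwh_per_month))   # ~731 W running, ~158 kWh/month
```

Change the two assumed numbers and the monthly figure moves a lot, which is exactly why the posts above recommend measuring instead.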
The Kill-A-Watt will totalize power consumption.
Just run it for a few days, and it will do all the work for you.
The Kill-A-Watt works this way as well. The ADC doesn’t have to be particularly fast–a kilosample/sec or so is totally sufficient. Well within the range of cheap ADCs.
I own the Seasonic PowerAngel, which uses the same guts as the Kill-A-Watt, and they sent me technical information verifying that they integrate V(t)*I(t). They can (and do) infer the power factor from this as well.
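The integrate-V(t)*I(t) approach described above can be sketched with synthetic waveforms. Here the amplitudes and the 40° current lag are assumptions for the demo; the 1 kS/s rate matches the sampling-rate remark above:

```python
import math

fs = 1000                      # samples per second (a kilosample/sec, as noted above)
f = 60                         # line frequency, Hz
phase = math.radians(40)       # assumed lag of current behind voltage
v_peak, i_peak = 170.0, 3.0    # assumed waveform amplitudes

n = fs                         # one second of samples = 60 full cycles
energy = 0.0
for k in range(n):
    t = k / fs
    v = v_peak * math.sin(2 * math.pi * f * t)
    i = i_peak * math.sin(2 * math.pi * f * t - phase)
    energy += v * i / fs       # accumulate joules, sample by sample

avg_power = energy / (n / fs)  # average real power, in watts
v_rms = v_peak / math.sqrt(2)
i_rms = i_peak / math.sqrt(2)
pf = avg_power / (v_rms * i_rms)   # power factor inferred from the integration
print(round(avg_power, 1), round(pf, 3))   # ~195.3 W, pf ~0.766 (= cos 40 degrees)
```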
Also, ask around (maybe at your city hall) about energy rebates/incentives. Where I live, for example, there’s a government-funded group that will come to your house, measure your energy usage across various appliances, and even give you a whole new refrigerator for free if yours is too old and energy-inefficient. (This is California, where PG&E has an energy efficiency mandate of some sort, which ends up with small local groups being paid by the gov and PG&E to help homeowners and renters achieve higher energy efficiency.)