Wanted: Free physics lesson

I learned this once, but for the life o’ me I can’t remember…

How do you measure the cost of electrical power that is consumed by a household appliance?

I know that you need to know the $/kWh and the ohm rating (or is that amperage?), but what do you do with them? There was also something about potential and resistance (that’s voltage, right?).

Thanks for your help…


“Anything is peaceful from one thousand, three hundred and fifty-three feet.”

It’s really just a matter of plugging the right values into the equation. You need to know the $/kWh the power company charges, the wattage of the appliance, and the time it runs.

Say you are charged $0.06/kWh. If you run a 100-watt light bulb for one hour, you’re using 0.1 kWh of energy. Multiply that by $0.06 and you get less than a cent.
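
If it helps to see that arithmetic written out, here’s a minimal sketch in Python (the 100 watts, 1 hour, and $0.06/kWh are just the example numbers from above):

# Cost of running an appliance, given its wattage, run time, and the
# utility's rate in dollars per kWh.
def cost_dollars(watts, hours, rate_per_kwh):
    kwh = (watts / 1000.0) * hours   # watts -> kilowatts, then kWh of energy
    return kwh * rate_per_kwh

print(cost_dollars(100, 1, 0.06))    # 0.006, i.e. six tenths of a cent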

Makes sense… thanks

I guess the next question is, how do I figure the wattage of a coffee maker? That might be where the ohm rating comes into play, but I don’t remember how.


“Anything is peaceful from one thousand, three hundred and fifty-three feet.”

If you know the resistance of a circuit (in ohms) and want to figure out its power draw (in watts), here’s how. First, since you asked for a lesson, here are the definitions.

One watt is the power produced by a current of one ampere across a potential difference of one volt.

One ohm is the resistance of a circuit in which a potential difference of one volt causes a current of one ampere.

Potential, resistance, and current are related in the equation: V=IR.
V = potential (or voltage)
I = current
R = resistance (or impedance)
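
As a quick check on those definitions, here’s a small sketch in Python (the 120 volts and 144 ohms are just made-up example values, roughly what a 100-watt bulb looks like while it’s lit):

# Ohm's law: V = I * R, so I = V / R
voltage = 120.0       # volts (typical US household supply)
resistance = 144.0    # ohms (example value)
current = voltage / resistance
print(current)              # about 0.83 amps
print(current * voltage)    # P = I * V, about 100 watts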

So to find the wattage of your appliance, the procedure is simple:

Check the label on the back. =b^)

You turn off everything else electrical in your house and only run that one appliance for an entire month.

When you get your electric bill, you will have determined the cost of electrical power for that appliance.

[Ow! Ow! Hey, stop hitting me!]

In case you haven’t gotten it yet, here it is spelled out for you.

P = IV
P = power, or wattage (watts)
I = current (amps)
V = voltage (volts)

V = IR
V = voltage (volts)
I = current (amps)
R = resistance (ohms)

P = V^2 / R
You know the supply voltage (120 V in the US), and the resistance is usually on the appliance.

To figure out the wattage exactly you’d need an oscilloscope, but these formulas give a close approximation. Now to find the cost, just multiply the wattage (in kilowatts) by the time it runs (in hours), and then by the $/kWh rate. Hope this is complete enough.
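
Putting the pieces together, here’s a rough sketch in Python of the whole calculation (the 14.4-ohm element, half-hour run time, and $0.06/kWh rate are only placeholder numbers, not real coffee-maker specs):

# Wattage from supply voltage and element resistance: P = V^2 / R
voltage = 120.0          # volts (US household supply)
resistance = 14.4        # ohms (placeholder; works out to a ~1000 W element)
watts = voltage ** 2 / resistance

# Cost: kilowatts * hours * ($/kWh)
hours = 0.5              # placeholder run time
rate_per_kwh = 0.06      # same example rate as above
cost = (watts / 1000.0) * hours * rate_per_kwh
print(watts, cost)       # 1000.0 watts, $0.03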


The facts expressed here belong to everybody, the opinions to me. The distinction is
yours to draw…

Omniscient; BAG