On the latest round of advertising slides shown before the movie in my local theater, they’ve added “ECO-FACTS!” Apparently, they could only dig up one Eco-Fact, however, and it was this:
If all Americans unplugged their television sets when they weren’t in use, it would save 9 billion kilowatt hours of electricity per year.
Is this true? What is drawing all the power, just the IR remote sensor? The clock? What about TVs with that “Energy Star” designation? Are TVs the biggest offending appliances, or is it just that they’re so ubiquitous?
Or is this all a smear campaign designed to get people to unplug their TVs and spend more time at the movies?
One source I read said there are 800 TV sets/1000 population.
Rounding 300 million people * 800 sets/1000 people = 240 million sets.
9 billion kWh / 240 million sets = 37.5 kWh = 37,500 watt-hours per set per year.
365.25 days/year * 24 hours/day = 8766 hours/year.
37,500 watt-hours / 8766 hours = (approx) 4.3 watts, about the same as a standard night-light bulb.
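For anyone who wants to check the arithmetic, here it is as a short Python sketch. The population and sets-per-capita figures are the rounded estimates from above, not official statistics:

```python
# Back-of-the-envelope check of the Eco-Fact arithmetic.
US_POPULATION = 300e6         # rounded US population
SETS_PER_1000 = 800           # TV sets per 1000 people (the cited source)
ANNUAL_KWH_SAVED = 9e9        # the Eco-Fact's claimed savings, kWh/year
HOURS_PER_YEAR = 365.25 * 24  # = 8766

sets = US_POPULATION * SETS_PER_1000 / 1000          # 240 million sets
kwh_per_set = ANNUAL_KWH_SAVED / sets                # kWh per set per year
standby_watts = kwh_per_set * 1000 / HOURS_PER_YEAR  # watt-hours / hours

print(f"{sets / 1e6:.0f} million sets")       # 240 million sets
print(f"{kwh_per_set:.1f} kWh/set/year")      # 37.5 kWh/set/year
print(f"{standby_watts:.1f} W standby draw")  # 4.3 W standby draw
```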
Does a TV pull 4 watts when it’s turned off? Hopefully an engineer type can answer that.
(And hopefully a math type will go over my math.)
From my monitor’s manual (iiyama MM904UT, Energy Star compliant):
Keep in mind that your monitor (probably) doesn’t have an IR sensor, internal clock, and channel setting memory.
IIRC, my 5-year-old monitor, which was also Energy Star compliant (a Philips 17"; I don’t remember the model), had a Power Management consumption of 5 W.
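If that 5 W figure is at all representative of TV standby draw (an assumption; monitors and TVs aren't identical), you can run the calculation the other way and see whether the Eco-Fact's 9 billion kWh is in the right ballpark:

```python
# Reverse check: what would 5 W of standby draw across all US sets
# add up to in a year? Uses the 240-million-set estimate from above.
STANDBY_W = 5                  # assumed standby draw per set, watts
HOURS_PER_YEAR = 365.25 * 24   # = 8766
SETS = 240e6                   # estimated US TV sets

kwh_per_set = STANDBY_W * HOURS_PER_YEAR / 1000  # ~43.8 kWh/set/year
total_kwh = kwh_per_set * SETS

print(f"{total_kwh / 1e9:.1f} billion kWh/year")  # 10.5 billion kWh/year
```

That lands within about 15% of the claimed 9 billion kWh, so the Eco-Fact is at least internally plausible.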
Also, I believe that the channel memory in most TVs is non-volatile, so it doesn’t require any power to stay alive.