Like many of you, I spent Christmas with family, and my grandmother (and by grandmother I mean my grandfather’s latest wife) accused my mother of wasting energy by leaving our LCD TV plugged in at all times. Apparently, most LCD and plasma TVs draw a small amount of power continuously, and this allows the TV to turn on quickly, rather than having a lengthy boot process. I don’t dispute this; the TV clearly takes longer to start up when it has just been plugged in than when it has been plugged in for a minute or so, and I’ve heard something like this before.
However, Grandma believes that the TV, when off, draws more power than our refrigerator. I believe this to be crap. My refrigerator has to be drawing at least five hundred watts, while it seems to me that the TV is unlikely to draw much more than that even when on. Can anyone help me out here? How much less power does the TV draw when it’s off than when it’s on?
According to the specs for a 37" Panasonic LCD, which can be viewed here, there’s 220W consumption while it’s on and 0.2W consumption in standby (plugged in but off). I would expect that to be fairly typical, though older models might well use somewhat more in standby.
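For what it’s worth, here’s a quick back-of-the-envelope comparison using those spec-sheet numbers (220 W on, 0.2 W in standby). The viewing hours and the $0.12/kWh electricity rate are just assumptions I plugged in, so adjust them for your own habits and bill:

```python
# Rough comparison of "on" vs. standby energy for a TV, using the
# 220 W / 0.2 W figures from the spec sheet above. The viewing hours
# and the electricity rate are assumptions, not measurements.
ON_WATTS = 220.0
STANDBY_WATTS = 0.2
VIEWING_HOURS_PER_DAY = 4      # assumed
RATE_PER_KWH = 0.12            # assumed rate in $/kWh; check your bill

on_hours = 365 * VIEWING_HOURS_PER_DAY
standby_hours = 365 * 24 - on_hours

on_kwh = ON_WATTS * on_hours / 1000
standby_kwh = STANDBY_WATTS * standby_hours / 1000

print(f"On:      {on_kwh:6.1f} kWh/yr  ~${on_kwh * RATE_PER_KWH:.2f}")
print(f"Standby: {standby_kwh:6.1f} kWh/yr  ~${standby_kwh * RATE_PER_KWH:.2f}")
print(f"Draw while on is {ON_WATTS / STANDBY_WATTS:.0f}x the standby draw")
```

Even if a 42" set’s numbers differ a bit, the standby draw works out to pennies a year, nowhere near what a refrigerator uses.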
The TV is only about a year old, and I assume it was a fairly new model at the time. And it is a 42" JVC, so I imagine it’s comparable, or at least within a couple of watts of those figures, which pretty much answers my question. Thanks!
One thing that isn’t detailed in the graph is the cable box. If it’s a DVR box, put your hand on it when it’s off. Be careful not to burn your hand! Those things have to be a huge drain.
I’ve seen that claim about plasmas before, and it’s ridiculous. My 5-year-old 50" plasma consumes 0.9 W in standby, and a new one is 0.2 W. All I can figure is that the people who wrote that article turned off the cable box but left the TV on with a blank screen, and measured the power consumption then.
Plus you could point out to Granny that the ‘wasted’ electricity is dissipated as heat. Assuming your TV is inside a house, that heat is not ‘wasted’ at this time of year, but contributes to keeping the house warm.
My new Toshiba LCD has an option to set the standby power consumption depending on whether you want the TV to turn on right away or are willing to wait 10 or 15 seconds. I choose to wait.
My owner’s manual recommends not regularly cutting power to the TV. In other news, a few weeks after we got it and did the kill-the-power-when-off thing, a circuit board fried and had to be replaced, whereupon the repairman recommended that we give it power at all times. These may not be related, but if the drain is only 0.2 watts, I won’t take the risk.
However, TVs that let you trade a slower power-on for lower standby consumption seem like a wonderful solution to the problem.
Mine certainly does put out heat. You can practically make toast in front of it when it has been on a while.
When it isn’t on, it consumes nothing, as it is OFF. We have to switch it off because it responds to the remote controls for the stereo, which is no fun at all.
Unless by “off” you mean no power going to the unit, it’s never ‘consuming nothing’. At the very least, it needs to power the IR receiver (and related circuitry) so that it can receive a power-on command from a remote.
The “vampire” electrical usage is hype. I believe in conserving energy, but the claims of 40% of total usage are absurd. 40% is what an Oprah guru claimed.
If one has 20 watts of “vampire” load (which is a lot!), 24 hours a day, for 30 days, the cost is nowhere near as high as some claim.
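To put a number on that, here’s a quick sketch; the $0.12/kWh rate is just an assumption, so scale it to whatever your utility actually charges:

```python
# Monthly cost of a constant "vampire" load. The electricity rate is an
# assumed figure; substitute whatever your utility actually charges.
RATE_PER_KWH = 0.12  # assumed, $/kWh

def monthly_cost(watts, days=30, rate=RATE_PER_KWH):
    """Cost of drawing `watts` continuously for `days` days."""
    kwh = watts * 24 * days / 1000
    return kwh * rate

for watts in (20, 1, 0.2):
    print(f"{watts:>4} W around the clock -> ${monthly_cost(watts):.2f}/month")
```

Twenty watts around the clock is 14.4 kWh a month, a dollar or two at typical rates, and the sub-watt standby figures people are quoting for TVs work out to pennies.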
My 42" Samsung plasma, made 2 or 3 years ago, only draws a few watts while off. I expected it to be 10 or 20.
I went around with a wattmeter and measured many different things around my house. The biggest surprise was a halogen floor lamp, sort of like this one; it used about 260 watts at its brightest setting.
Plus, the $2.11 is probably 20 times what it really costs: the calculation was done for 20 watts, and people in this thread are talking about less than a watt of standby power for TVs.