I heard that you should turn the TV off by the remote and leave it on standby. It was said that turning on and off at the mains continually was damaging the system on the latest models.
Why do you want to turn the TV on and off via the mains?
Nonsense. If by “the mains” you mean to disconnect the power source, that’s the same as pulling the plug or flipping the switch on a power distribution box. TVs are no different from any other device, and they are designed to be switched off. Here, I’ll do it to my compu…
Using the remote control or the button on the TV puts the TV in a standby mode instead of turning it off. The standby mode still consumes a few watts of electricity continuously. Standby mode allows the TV to start faster and it powers the sensors to detect when you’ve pressed the power button on your remote in order to turn the TV back on.
The same is true of most modern electronics.
I can remember when the standard advice was that you should never leave a TV plugged in overnight, because it was a fire risk.
Of course, TVs are different now, and the fire risk is probably much less (though not non-existent), but by leaving it on standby you are definitely wasting electricity, which is bad for both the planet and your own finances. I have never heard that it damages the TV to turn it off at the mains. I have heard of campaigns to persuade people that they should turn it (and other similar things) off. The term “vampire appliances” has been bandied about.
So is a table lamp, an electric razor, a coffee pot or a battery charger. Do you unplug all of those before beddie-bye each night? Why is a TV any different?
By “mains” do you mean:
[ol][li]The switch on the TV itself?[/li][li]The switch on the surge protector that you hopefully have your television plugged into?[/li][li]The circuit breaker controlling the outlet where you have your television plugged in?[/li][/ol]
The only one that I could conceivably see possibly “harming” your television would be turning it off by using the circuit breaker. There’s the possibility that when you flip the breaker back on, there could be a surge. However, there would be a slim chance of this occurring (your chance of being injured by an arc flash from the breaker would be greater).
I think British outlets usually have individual switches.
Flipping a circuit breaker halts the flow of electricity into the “Hot” (black) wire.
Flipping a light switch off halts the flow of electricity into the “Hot” (black) wire.
Using the C/B as a switch is no different (electrically) than using a light switch. However, breakers are not designed for repeated cycles, so don’t make a habit of it.
If you plug your TV into the outlet controlled by a wall switch (a common feature in cheap houses, where the builder wants a “light switch”, but doesn’t want to pay for wiring and installing a fixture), that switch will do exactly the same as the C/B.
Electricity is NOT like water - it only goes where it is pulled.
Actually unplugging it will disconnect both sides of the circuit. Still no difference electrically.
Because an old-fashioned CRT television used much higher internal voltages, presumably, as well as running much hotter when on than most other appliances (especially the older ones with valve electronics). So yes, despite your snark, old-fashioned TVs were different, and more dangerous, than most other appliances, then and now, and in Britain we used to get PSAs telling us to not only turn them off but to physically unplug them. The danger may have been a bit overblown (and many people may not have bothered, in practice), but that was the message from the authorities. Modern TVs, like most other appliances, do not use such high voltages, so presumably that danger is much less. On the other hand, if they are left in standby mode they are wasting power that an old-style switched-off CRT TV would not be wasting. (Of course, the CRT TV would consume a lot more power while operating than a modern flat screen does.)
And yes, as Lord Feldon points out, British electrical outlets (which, remember, run at twice the voltage of American ones), normally have individual switches on them. However, the PSA advice was to not rely on that switch, but to physically unplug.
The Australian government issued one of these gadgets a couple of years ago (for free) and now I heard, on the radio, that there are complaints from people who blame the breakdown of their TV on these gadgets, saying the constant turning on/off damages the set. I’ve had one on my TV for about two years without any problems, but wondered if there was any truth in the story.
BTW, when I said turn off at the mains, I should have said the power point (the wall outlet).
But for modern TVs, the standby usage is very low. My giant-ass Panasonic plasma uses 0.2 W in standby, so a full year of standby would be 0.2 * 24 * 365 / 1000 = 1.75 kWh, or a whopping 35 cents’ worth of electricity (at roughly 20 cents per kWh).
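For anyone who wants to redo that arithmetic for their own set, here’s the calculation as a few lines of Python. The 0.2 W draw matches the post above; the 20-cents-per-kWh tariff is an assumption, so substitute your own rate:

```python
# Rough annual cost of leaving a TV in standby.
standby_watts = 0.2      # standby draw from the post above
rate_per_kwh = 0.20      # assumed electricity price in dollars per kWh

hours_per_year = 24 * 365
kwh_per_year = standby_watts * hours_per_year / 1000  # watt-hours -> kWh
annual_cost = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:.2f} kWh/year, ${annual_cost:.2f}/year")
# -> 1.75 kWh/year, $0.35/year
```

Even a lazier set drawing 1 W in standby only comes to about $1.75 a year at that rate.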
Well, I don’t think there’s any chance of ‘cumulative’ damage from doing this nor is it an issue of electricity usage or safety. However modern TVs like most electronics don’t just ‘power on’, they ‘boot up’. IOW many (or most) have an actual operating system in their firmware, albeit a much simpler one than a computer or smartphone. Still, if you merely cut the power to them over and over rather than shut them down normally it increases the chance of causing a ‘glitch’ in their software, at least more so than normal.
Suffice it to say there is definitely no advantage to doing this, and there can only be a potential disadvantage.
Thanks, it seems I’d better just use the standby mode. It’s also a PITA when I want to record something and the TV turns off after a couple of hours.
The power-off consequences will vary by brand and model.
There is the boot-up to go through, and the set-up. The set may or may not retain all setup parameters and channel scans in its memory, and it may take a few minutes to restore things. I’ve seen different sets retain different parameters. Even if the set will retain data like a channel scan, it can still occasionally lose it.
This is purely anecdotal, but one time I started using a similar device to cut off power to peripherals when the computer was off. This meant the power to the LCD monitor was cut whenever the computer went into sleep mode. The monitor (a nice ASUS 24", only a few months old) died about a month later. This may be a complete coincidence, but I stopped using the power cutoff device just to be safe.
It’s not as if the device worked very well anyway. Modern computers consume very little power when idle. It was difficult to set the power sense threshold to distinguish between idle and sleep.
The real issue here is that the power supply of a piece of electronic equipment undergoes much more stress when it is switched on than when running continuously. This is partly due to the heavy currents drawn to charge up capacitors, though big transformers can also take a gulp. A marginally designed power supply might well fail if subjected to frequent on-off cycles. Rectifiers may die, internal fuses may pop.
The turning-off process causes no problems.
I work with complicated and expensive electronic test gear, and the invariable rule is to turn it on and off as little as possible. If that means leaving it fully powered for a week (there is no standby mode), so be it.
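To put a rough number on that capacitor-charging surge: in a capacitor-input supply, the worst case is the switch closing at the mains peak with the bulk capacitor fully discharged, so the inrush is limited only by the series resistance in the path. All the component values below (mains voltage, series resistance, capacitor size) are illustrative assumptions for the sketch, not measurements from any real TV:

```python
import math

# Illustrative inrush estimate for a capacitor-input power supply.
# Every value here is an assumption chosen for the sketch.
v_mains_rms = 230.0   # UK/AU mains; use 120 for North America
v_peak = v_mains_rms * math.sqrt(2)   # ~325 V peak
r_series = 5.0        # ohms: wiring + rectifier + cap ESR (assumed)
c_bulk = 470e-6       # farads: bulk smoothing capacitor (assumed)

# Worst case: switch-on at the mains peak, capacitor fully discharged.
i_inrush_peak = v_peak / r_series
# The spike decays with the RC time constant of the charging path.
tau = r_series * c_bulk

print(f"peak inrush ~ {i_inrush_peak:.0f} A, decaying over ~ {tau * 1000:.1f} ms")
```

With these made-up but plausible numbers you get a spike on the order of 65 A lasting a few milliseconds, versus perhaps an amp or two in steady operation, which is why inrush limiters (NTC thermistors) are common and why marginal supplies fail at switch-on rather than mid-programme.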
I will agree here. I’ve seen power supplies work for years until the one day you cycle power to them. Modern TVs have internal power supplies that produce the voltages actually used by the display and the electronics. Cycling power to them is not doing them any favors.
With this said, you save money by cutting power at the plug, and TVs are very cheap these days. In the long run, it’s probably about a wash.
I have a manual power bar that controls power to my PC monitor, an LCD TV, my PC speakers and my desk light.
Been using it for years - I’m on my 3rd monitor, just because they’ve gotten bigger over the years (current one is LCD) and the TV was first added Xmas '12.
Would be very bad with a rear-projection or other TV with a bulb, where the fan continues to run for a while after you turn it off to cool the bulb down.