It’ll depend on the TV technology. For LCDs, definitely not: you’re just adjusting how far a crystal twists and how much of the already-present light gets let through. Same for a projection TV. In both cases, “saturation” is set by the display’s processing; you’re just faking it with brightness.
For a CRT, the phosphors are stimulated by an electron beam, and more saturated colors (or just greater overall brightness) mean a more intense beam. The phosphors do, in fact, have a limited life (and can “burn in” if the same image is shown for long enough, which used to happen all the time), but I can’t imagine even a 1970s-era TV lasting long enough for phosphor decay to set in. In any case, those same phosphors are used (in combination) on most TVs to make black and white anyway, so saturation doesn’t seem like it would have any effect.
Yeah, burn-in happens on CRTs, which is why screen savers were invented. It doesn’t happen on LCDs, which is why screen savers no longer serve any real function (unless you’re one of the few who still use a CRT display with your computer), and many people no longer bother with them. For those who do, they’re essentially just toys now.
Yeah, but that’s only an issue for a single static image left on screen for a long time, not the moving images a TV is always showing. Having the TV going is like a permanent screen saver: constantly moving and shifting images that don’t stay in one place long enough to cause any damage.
They pretty much fixed the burn-in issue for the last decade or so of the CRT’s lifespan as a technology anyway. From the mid-’90s on, you weren’t likely to get burn-in unless you deliberately tried.
As for color saturation, I keep it turned down on my TVs, but not to make them last longer; it just looks better lower than where a surprising number of people set it. It doesn’t help that nearly all TVs are calibrated so the default color setting looks good under a store’s lighting, not the lighting most people have at home.
Turning down the saturation never made sense as a way to make the set last longer. As others have said, burn-in used to be a problem for CRTs and plasmas, though it’s not much of one anymore. And phosphor lifetime does scale with the brightness of the image, so it’s not complete nonsense to reduce the brightness.
But saturation just boosts some channels at the expense of others; there’s very little effect on overall brightness. A white (i.e., completely unsaturated) image will wear the phosphors faster than a highly saturated color picture, because white drives all three phosphors to their peak at once, while a saturated color leans mostly on one.
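To put a rough number on that, here’s a small Python sketch (my own illustration, not any TV’s actual signal path) that models the saturation control the way it’s commonly described: scaling the chroma components in YCbCr, using the standard BT.601 coefficients. The function names and sample values are made up for the example; the takeaway is that turning the “knob” trades the R/G/B channels against each other while luma, which is roughly what beam current and phosphor wear track, doesn’t move.

```python
# Sketch only: model the saturation control as a chroma scale in YCbCr
# (BT.601 coefficients). Luma (Y) is left untouched by construction.

def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB (0-255) -> (Y, Cb, Cr)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse of rgb_to_ycbcr."""
    r = y + 1.402    * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772    * (cb - 128)
    return r, g, b

def adjust_saturation(rgb, factor):
    """Scale the two chroma components around neutral; leave luma alone."""
    y, cb, cr = rgb_to_ycbcr(*rgb)
    cb = 128 + factor * (cb - 128)
    cr = 128 + factor * (cr - 128)
    return ycbcr_to_rgb(y, cb, cr)

before = (180, 120, 100)                 # a muted reddish tone
after  = adjust_saturation(before, 1.5)  # saturation cranked up 50%

print(before, "luma = %.1f" % rgb_to_ycbcr(*before)[0])
print(tuple(round(c) for c in after), "luma = %.1f" % rgb_to_ycbcr(*after)[0])
# -> (180, 120, 100) luma = 135.7
# -> (202, 112, 82)  luma = 135.7
# Red went up, green and blue came down, and luma is identical, which is why
# the saturation knob has almost no effect on overall brightness or wear.
# Compare full white, (255, 255, 255): zero saturation, all three guns at peak.
```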