Do black and white shows take up less space on a DVR?

I ask for three reasons: First, if I desaturate a color image in Photoshop, the file size goes down. Second, when I save HD programs on my DVR they take up a lot more space than regular TV. And third, I just saved something like ten hours of a Twilight Zone marathon and it barely made a dent in my DVR’s hard drive. I think I may be imagining things, though.

Depends on how smart your DVR is. It’s true that B&W video can be compressed more than full-color video without losing quality, but your DVR may be using constant (or average) bitrate encoding, in which case a B&W show would simply come out at better quality than a color one while using the same amount of space. If it uses constant-quality encoding, then B&W shows will take up less space, and so will shows with very little motion.
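
A back-of-the-envelope illustration of the difference (the bitrates here are made up for the example, not real DVR settings):

```python
# Rough sketch: how the encoding mode affects recording size.
# All bitrate figures are illustrative, not actual DVR numbers.

def cbr_size_mb(minutes, bitrate_mbps):
    """Constant bitrate: size depends only on duration."""
    return bitrate_mbps * 60 * minutes / 8  # megabits -> megabytes

def constant_quality_size_mb(minutes, effective_mbps):
    """Constant quality: the encoder spends only what the picture needs,
    so low-motion B&W footage averages a lower effective bitrate."""
    return effective_mbps * 60 * minutes / 8

half_hour = 30
print(cbr_size_mb(half_hour, 4.0))               # color or B&W at CBR: ~900 MB either way
print(constant_quality_size_mb(half_hour, 4.0))  # busy color show: ~900 MB
print(constant_quality_size_mb(half_hour, 1.5))  # static B&W show: ~340 MB
```

With constant bitrate, duration is the only thing that matters; with constant quality, the complexity of the picture matters too.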

I have a Dish DVR and record “What’s My Line” every night. It’s a 30-minute show and it takes up 30 minutes of space. How would the DVR know to compensate for color commercials?

We’d need to know which way this particular recorder is recording. The Dish DVR only understands time, at least in any user-facing sense, but the OP’s apparently understands file size. (Are you using a commercial unit, or is this a home-grown setup using something like MythTV?)

Different channels are encoded at different bitrates. Premium movie channels will get the highest available rate, but something like the Weather Channel may only get enough bitrate to yield a tolerable, relatively low-res image. The MP3 analogy would be “good enough for spoken voice” vs. “better than CD quality.”
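
To put some rough numbers on that MP3 analogy (the bitrates below are typical examples, not anything a provider publishes):

```python
# Illustrative only: how audio bitrate translates to file size per minute.
def mb_per_minute(bitrate_kbps):
    return bitrate_kbps * 60 / 8 / 1024  # kilobits/s -> MB per minute

print(mb_per_minute(64))   # "good enough for spoken voice": ~0.5 MB/min
print(mb_per_minute(320))  # high-quality music MP3:         ~2.3 MB/min
```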

My TiFaux measures by percentage of total memory. A half-hour network TV show, with commercials, takes up 3% to 4% of my disk space. A 3-hour Technicolor musical on TCM takes up 1%. It has to be the commercials, right?

Commercials? To the disk, they’re just more bits.

Good luck getting your cable or satellite provider to say what the bitrates are, but the network programming is probably right up at the highest rate they use for standard-def programming. TCM, not so high, apparently. They can probably just push up the compression factor, and you’d hardly notice it on an old movie. I couldn’t find hard cites, but I did see an anecdotal item or two implying that premium movie channels and local broadcast channels get the full possible bitrate (around 4 megabits/second for one standard-def video signal) and “basic cable” channels get half or even a third of that.
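
For a sense of scale, here’s what those anecdotal rates work out to per hour of recording (the same ~4 Mbit/s figure as above; the half and third rates are just that guess divided down):

```python
# Back-of-the-envelope storage per hour of standard-def video at various bitrates.
def gb_per_hour(bitrate_mbps):
    return bitrate_mbps * 3600 / 8 / 1024  # megabits/s -> GB per hour

for label, rate in [("full rate (premium/local)", 4.0),
                    ("half rate (basic cable?)", 2.0),
                    ("third rate (basic cable?)", 4.0 / 3)]:
    print(f"{label}: {gb_per_hour(rate):.2f} GB/hour")
```

Per hour of recording, a channel at a third of the bitrate takes a third of the space, whatever units the DVR happens to report in.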