Video quality: why do the same TV shows look different on different networks?

Actually, a two-part question.

The first: turn on a TV in the U.S.

Flip the channels on the networks (ABC, NBC, CBS, FOX) while original programming is on (soaps work best, sitcoms next, dramas last).

If you watch TV long enough, you can usually instantly tell what network you’re watching by the image quality. The soaps, the sitcoms, the dramas all have a certain “look” to them shared by most of the other programming on the network, distinct from the shows on other channels.

I asked someone who ought to have known about this (Hollywood TV guy), and his answer was that each network uses their own production houses, regular equipment, lighting guys, etc., and thus their shows all have a similar “look.”

I didn’t buy it then, and I still don’t buy it now, because of the SECOND part of my question:

The other day, I saw an episode of House (a FOX show) running on basic cable, forget which channel, and it looked nothing like it does on FOX. Not as sharp, somewhat muted. Could this just be old tape? Or could it be that there’s something more to the “network differences”?

My vote is “something more”-- I think it’s got to have something to do with the broadcast they transmit, a difference effected by frequencies or somesuch thing.

So, Dopers-- any learning here?

I know that for sports in HD, CBS used to look a lot worse than Fox, ABC, or NBC. I figured it was based on the equipment they used. Of course, your local station also impacts the quality of a network show.

The actual feed sent to the cable server can be adjusted by the station. The Food Network, for example, insists on stretching out 4:3 content on its 16:9 HD feed. Presumably other networks have their own settings for, say, color adjustment.

Also, the cable feed you get is compressed, sometimes highly so. There’s only so much bandwidth. I could certainly imagine the cable company compressing a basic cable station much more than Fox.
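
To put rough numbers on that bandwidth crunch (these bitrates are just illustrative assumptions, not any cable operator’s actual settings), here’s the back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope budget for one digital cable channel.
# Every number here is an illustrative assumption, not an operator's real setting.

QAM256_PAYLOAD_MBPS = 38.8   # rough usable payload of one 256-QAM cable carrier
HD_STREAM_MBPS = 12.0        # a generously encoded MPEG-2 HD stream
SD_STREAM_MBPS = 3.0         # a typical MPEG-2 SD stream
SD_SQUEEZED_MBPS = 1.8       # an aggressively compressed SD stream

print(int(QAM256_PAYLOAD_MBPS // HD_STREAM_MBPS))    # 3 HD streams fit
print(int(QAM256_PAYLOAD_MBPS // SD_STREAM_MBPS))    # 12 SD streams fit
print(int(QAM256_PAYLOAD_MBPS // SD_SQUEEZED_MBPS))  # 21 if you squeeze harder
```

The more channels they cram into one carrier, the softer and blockier each one gets.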

The basic cable company might not be playing the show off the very top-end equipment, either.

Okay-- forget I ever mentioned cable, this effect is outside of cable compression (and predates it, in fact). Standard old over-the-air signals for network TV work the same way. A show that ran on one network looks different when it reruns on another (JAG on NBC looked different than JAG on CBS).

Or am I insane for noticing this?

I grew up watching the New York City flagship stations of the Big Three networks, as well as their Connecticut affiliates. And I’m fairly sure that each network (whether on the NYC or the Connecticut station) had a particular “look”; ABC was brighter, NBC more diffuse and CBS was different somehow. I think that this difference may have come up in a previous thread here.

There are technical standards every network is supposed to adhere to, but the reality is that not all of them interpret the standards in the same way. If ABC, NBC and CBS broadcast the SMPTE color bars pattern, they should all match. But, as other people have mentioned, cameras differ. One network may prefer Sony, one Panasonic, another Ikegami. The camera operator adjusts any of a dozen controls to capture the image, but there is usually a “house standard” for adjustment. Then there is a “shading” person who adjusts the iris and other controls while the camera operator is framing and focusing the shot. Each camera has “proc amp” controls as well.
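
For a toy sense of how those “house standard” adjustments add up, here’s a sketch of a proc-amp-style tweak; the gain, black-lift, and chroma numbers are invented for illustration, not any network’s real settings:

```python
# Toy "proc amp": apply a per-house gain, black lift and chroma gain to a pixel.
# The two "house" settings below are invented for illustration only.

def proc_amp(y, c, gain=1.0, black_lift=0.0, chroma_gain=1.0):
    """y = luma (0..1), c = chroma/saturation (0..1); both clamped to range."""
    y_out = min(max(y * gain + black_lift, 0.0), 1.0)
    c_out = min(max(c * chroma_gain, 0.0), 1.0)
    return y_out, c_out

pixel = (0.50, 0.50)  # a mid-gray, mid-saturation pixel
print(proc_amp(*pixel, gain=1.05, black_lift=0.00, chroma_gain=1.10))  # ~(0.525, 0.55) - brighter, punchier
print(proc_amp(*pixel, gain=0.95, black_lift=0.03, chroma_gain=0.90))  # ~(0.505, 0.45) - lifted blacks, duller color
```

Stack a few stages like that (camera, shading, master control) and each facility ends up with its own consistent look without anyone consciously choosing it.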

Once it goes out onto the network the fun really begins. There are a dozen places where the quality can be impacted before it even gets to the cable company where they can fuck it up royally by over-compressing it.

I agree that there seem to be visual differences between the networks, though I can’t put my finger on it, either.

I see it mostly in sporting events. A daytime NFL game on CBS just has a different look to it than a game on Fox; a Sunday night game on NBC looks different from one on ESPN (or ABC, before that).

If you’re watching in HD, ABC and ESPN are 720p/60 while CBS and NBC are 1080i/30.
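
The raw pixel rates are closer than the numbers suggest, which is part of why neither format is flatly “better”; a quick check, counting 1080i as 60 fields per second, i.e. 30 full frames:

```python
# Pixels per second for the two ATSC HD formats mentioned above.
p720_rate = 1280 * 720 * 60    # 720p/60: 60 full frames per second
i1080_rate = 1920 * 1080 * 30  # 1080i: 60 fields/s = 30 full frames per second
print(p720_rate)   # 55,296,000
print(i1080_rate)  # 62,208,000
```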

There’s a reason they call it “Never The Same Color” - two engineers may set up their equipment to NTSC specs, but their pictures can still look quite different.

Oh, and thank you for bringing up nightmares of camera shading. We used to have Panasonic cams with roughly 25 possible adjustments under a panel on the camera, plus several more internal adjustments that needed to be done with test cards and a scope.

It wasn’t just on the East Coast. In fact, I always wanted to ask this question myself, but never got around to it. In the Los Angeles market, NBC and ABC looked bright, but CBS seemed to have an almost greyish haze on all the shows. It was as if you were watching an old show despite it being new.

Interesting…but I’m actually not watching in HD. :slight_smile:

Yeah, but it’s shot in HD and down-converted to NTSC (480i/30). The original source matters.
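
Just to put a number on how much detail that down-conversion throws away (assuming a 1080-line master and a 720x480 digital broadcast raster):

```python
# How much of the HD master survives a down-conversion to a 480-line raster?
hd_pixels = 1920 * 1080  # 2,073,600 pixels per frame in the HD master
sd_pixels = 720 * 480    #   345,600 pixels per frame at NTSC resolution
print(hd_pixels / sd_pixels)  # 6.0 -> only about 1/6 of the original detail remains
```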

Again-- this isn’t an HD thing, it predates (and survives into) the HD era.

During the analog era, there were many more variables affecting the signal, from the preferences of the chief engineer to how much wire there was in the production environment. For instance, there is a part of the NTSC signal called “setup”. It’s the difference between absolute black (0% signal) and the black that is actually transmitted. Some old analog sets couldn’t handle 0% black and would lose sync. So they added “setup” and re-defined 7.5% of the full monochrome video signal as “black”.
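
Put as a formula, that’s the standard black-at-7.5-IRE mapping described above; a quick sketch:

```python
# Map picture luminance (0.0 = black, 1.0 = white) onto IRE units,
# with and without the 7.5 IRE NTSC "setup" described above.

def to_ire(luma, setup=7.5):
    """Black sits at `setup` IRE; white stays at 100 IRE."""
    return setup + (100.0 - setup) * luma

print(to_ire(0.0))           # 7.5   -> "black" as actually transmitted with setup
print(to_ire(0.0, setup=0))  # 0.0   -> absolute black, no setup
print(to_ire(1.0))           # 100.0 -> white is unaffected
```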

Look at the SMPTE chart I posted earlier. In the lower right corner there are five stripes that are different shades of gray. The leftmost and rightmost ones are “black” in NTSC. The three in the middle (from left to right) are 0%, 7.5% and 11%. It’s intended as a quick way of setting up a monitor and adjusting the playback of a television source. Ideally, you should adjust the brightness (or “black level” if you are a TV engineer) so you can see the difference between the 7.5% and 11% stripes, but not the difference between the 0% and 7.5% ones.

Monitors drift with age. And so does human vision. And sometimes, as with the “grayish haze” mentioned by Jormungandr, the human in control may be aging and losing visual sensitivity, and may adjust the proc amp controls to compensate. Or they set the monitor controls incorrectly and over-compensate for that.

Also, the “setup” was supposed to be added to the signal only once. If it got added twice, the black level ended up nearly doubled and the whole picture lost contrast.
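
Putting numbers on the double-setup case (same 7.5 IRE figure as above, toy sketch only):

```python
# What happens if the 7.5 IRE setup gets applied twice somewhere in the chain.
def add_setup(ire, setup=7.5):
    """Rescale a 0-100 IRE signal so its black floor rises by `setup`."""
    return setup + (100.0 - setup) / 100.0 * ire

once = add_setup(0.0)    # 7.5 IRE   -> correct black level
twice = add_setup(once)  # ~14.4 IRE -> washed-out blacks, visibly less contrast
print(once, twice)
```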

I agree with you wholeheartedly. WNEW, WOR, & WPIX had distinct looks as well.

While visiting (and now living in) South Florida, the variance wasn’t as apparent, and has blurred to the point that I recently couldn’t win a bet with my wife that I could tell what major station was on without looking… :mad:

The differences in the way the broadcast networks looked and sounded are something I’ve noticed since I started watching TV as a toddler back in the late ’60s. It didn’t matter what affiliate I watched, the differences were always there. For example, CBS always sounded “crisper” (i.e., the sound seemed to have a higher treble level) than either ABC or NBC.

Anyway, I had a thread on this subject years ago and, as I recall, it all boiled down to each network’s different technical settings. I wish I could be more precise, but I’m not a television engineer, so I’m unfamiliar with most of the terminology.

I’ve wondered about this, only with the comparison being between the same show on American and Canadian television stations. Canadian broadcasts seem less saturated and colorful compared to US broadcasts, as if a television’s chroma or color level control is set at 40% rather than 50%. I’ve seen it during the Olympics, when the same sport was being broadcast on both CBC and one of the NBC cable stations at the same time.
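
If it really is just the chroma level, the arithmetic is simple (the 40%-vs-50% figures are only my eyeball guess):

```python
# Rough effect of running the chroma control at 40% instead of 50%:
# every color keeps its hue but drops to 40/50 = 80% of its original saturation.
chroma_scale = 0.40 / 0.50
saturation = 0.90                 # an arbitrary, fairly saturated value
print(saturation * chroma_scale)  # ~0.72 -> same hue, noticeably duller
```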