Why does "live" television look "live"?

We all know what “live” television looks like, and video tape of “live” television still looks “live”. But what happens in the production of television programs that causes the visual difference between, say, “The Tonight Show” (live-to-tape) and programs such as “CSI” or “JAG”? Surely these types of programs are also “shot” with video cameras (to tape) and not on photographic film, whose post-production must by now be prohibitive in both cost and time.

Where/when in the process does this “style of recording” cease to have the “live” look, and why?

No, CSI and JAG are shot on 35mm film. In fact, most prime time dramas are shot on 35mm film.

Let me ratchet that up. All current prime time dramas are shot on 35mm film, and the majority of sitcoms are shot on 35mm film too.

Ah well, so much for my assumption. Thanks for the enlightenment.

Yes, you really can tell the difference between the videotaped and filmed shows. Film has a grainier but warmer look. Only the cheap sitcoms on the minor networks come to mind as shot on tape. That’s one of the reasons they look kind of cheap.

When I was growing up, most of the sitcoms seemed to be shot on tape. ALF, Head of the Class, Full House, Growing Pains, Saved by the Bell, Cosby Show, Married With Children, etc. This seemed to change in the '90s with shows like Seinfeld and Friends. I can’t think of any sitcom on the networks that’s shot on tape these days – the last one I can think of is Home Improvement. Are there any?

There are specific scenes in network dramas/sitcoms that are shot on tape, and they stick out like a sore thumb. Most recently, the final episode of Buffy the Vampire Slayer had several exterior shots (mostly involving the school bus hauling it out of Sunnydale) that were obviously shot on tape. The perspective flattened out and the color was grainier.

Some years ago on Law & Order, there was an episode featuring Robert Klein as a Jerry Springer/Jenny Jones-type tabloid TV host; compare the scenes of the show-within-a-show to the regular L&O shots and you can see exactly the difference between video and film stock.

Or, look at TV ads. Lush-looking ads by national advertisers are always shot on 35 mm film, while local ads for auto dealers and contingency lawyers are always shot on videotape. The visual difference is very apparent.

Monty Python’s Flying Circus was shot using both, and it’s clear what was film (exteriors) and what was tape (interiors). They made mention of the fact when they switched from an interior shot to an exterior shot of a doorway and said, “My God – we’re surrounded by film!”

Lighting is also important.

In a studio, you can set your spots where you need them, & use gel filters to achieve effects, like enhancing the “mood” of a scene.

Can’t do that very well “live”.

I’d like to add “all daytime soap operas” to the list. Anyone second the motion?

The OP knows that the two media are different, the question is why are they different?

I would think that film with a finer resolution (grain level) should look “more live” than video (pixel resolution); yet the opposite seems to be true.

Isn’t a grain of film smaller than a pixel of typical resolution used in news video?

BwanaBob, I can’t put my finger on a source for it, but my understanding is that it’s an artifact of precisely that lower resolution of your TV. Transferring film to TV in pre-HDTV formats means converting the 35mm (or 16mm – tons of footage was produced on 16mm, and some programs, usually documentaries rather than network TV, still are) analog photo-optical image, with its virtually continuous color range, down to a much, much lower-resolution NTSC or PAL raster scan. That conversion loses information, hence the “softer” look. “Shot-on-video”, by contrast, already starts off at the native resolution and color scale of the TV receiver, or pretty close to it, hence the “sharp edge” look.

Or rather, it traditionally has – nowadays, with digital formats, you can trick things out dramatically and get a “film look” on digital video. (I also know there’s a resolution difference between PAL, NTSC, and SECAM, but relative to film they’re close enough.)
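
To make the resolution gap concrete, here’s a rough Python sketch of just the raster-resolution part of a film-to-video transfer: a 35mm frame scanned at a common “2K” size gets squeezed down to a standard-definition NTSC raster. The file names and scan size are hypothetical, and a real telecine also handles color, gamma, and frame-rate conversion, none of which is modeled here.

```python
# Hypothetical sketch: downscale a high-resolution film scan to SD video size.
# Only the raster-resolution part of a telecine transfer is modeled here.
from PIL import Image

NTSC_SIZE = (720, 480)  # active picture size for standard-definition NTSC
PAL_SIZE = (720, 576)   # PAL is a bit taller, but in the same ballpark

def downscale_film_scan(scan_path: str, out_path: str, target=NTSC_SIZE) -> None:
    """Resize a film-frame scan (e.g. ~2048x1556 for a "2K" 35mm scan) to SD."""
    frame = Image.open(scan_path)
    sd_frame = frame.resize(target, Image.LANCZOS)  # most of the detail is discarded
    sd_frame.save(out_path)

if __name__ == "__main__":
    # File names are placeholders for illustration only.
    downscale_film_scan("film_frame_2k.png", "film_frame_ntsc.png")
```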

IIRC there were a few episodes of The Twilight Zone shot on video tape. Back then I guess it was brand new. Catch a Twilight Zone marathon sometime and the video episodes look awful.

The only one I can remember was the episode where the guy was in the hospital and the doctors/nurses were commenting on how ugly he was. In the end he was “normal” and they looked like fish people.

I believe you can thank Bob Newhart for the change. If you watch the first few episodes of Newhart (the one that was set in the Vermont inn) you’ll see very clearly that it was shot on videotape. I guess they didn’t like it, because soon after, it was switched to film and remained on film for the rest of the series run.

Later episodes of Friends also make the difference very clear. Joey is a soap opera star, and they occasionally showed footage of his soap, which was shot on tape and looked very different from the actual sitcom footage, which was shot on film.

An interesting side note was one of last year’s MTV awards shows. They somehow managed to shoot a live show with the look of film. Not sure how they did it, though.

Pash would like to call into evidence The Larry Sanders Show.

Unfortunately, that is all.

Pash

Remember Land of the Lost? Sure, its effects were crappy, but they really did a lot with a shoestring budget. It was the first show (and the only one that I know of) that merged video tape and film to complete a special effect.

The stop-motion dinosaurs were shot on film, and then the actors that interacted with them were shot on video tape. Although the result is pretty awful because of the contrast in visual quality, it was a technical feat: tape and film run at different frame rates, and they had to invent a way to merge them seamlessly.

I was going to say, I was under the impression that film was usually 24 or 25 frames per second and tape was 30 frames per second. Considering that most people can detect flicker in a strobe light flashing up to 80 times per second, it probably makes a difference how “real” the picture looks.

Lumpy, the actual flicker rate is double that (60 Hz for TV because of interlace, and 48 Hz for film because of a double-bladed projector shutter), but this should make no difference when we’re viewing both from the same television set.
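
For anyone curious how 24 fps film gets fitted into NTSC video’s roughly 60 interlaced fields per second, the usual trick is “3:2 pulldown”: alternating film frames are held for two and then three video fields. Here’s a small Python sketch of just that mapping – it illustrates the pattern, not any particular transfer system and certainly not what the Land of the Lost crew actually did.

```python
# Sketch of 3:2 pulldown: map 24 fps film frames onto ~60 interlaced
# video fields per second by holding frames for 2 fields, then 3, then 2, ...
def three_two_pulldown(num_film_frames: int):
    """Yield (video_field_index, film_frame_index) pairs."""
    field = 0
    for frame in range(num_film_frames):
        hold = 2 if frame % 2 == 0 else 3  # alternate 2-field and 3-field holds
        for _ in range(hold):
            yield field, frame
            field += 1

if __name__ == "__main__":
    # Four film frames fill ten video fields, so 24 frames fill 60 fields.
    for field, frame in three_two_pulldown(4):
        print(f"video field {field:2d} shows film frame {frame}")
```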

I have always been of the firm opinion that the sensitivity of the film to light is truly what makes the difference, not the grain.

I posit that the difference is in the camera and not the medium. If you record one of the previously-mentioned hybrid shows (such as Monty Python’s Flying Circus) onto videotape, you would still be able to easily spot the film sections and the tape sections.

In addition to differences in light capture, I can say for a fact that CCDs have a funny sensitivity range. You can see this by pointing an infrared remote control at your video camera: the infrared light appears white on the video. I don’t have the background to know the exact details of the sensitivity of modern CCDs across a range of frequencies, but the IR remote trick proves that what you see is not exactly what you get. I think the greatest weakness of present-day consumer digital photography is that colors don’t always get captured correctly: I photographed my nephew playing with some brightly-colored plastic beads, and in the photo they appeared to be neon, likely due to that extended range of sensitivity.
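
One rough way to see that “neon” effect in your own photos is to check how many pixels have a color channel driven to its maximum – once a channel clips, the recorded color can no longer match the scene. This is just an illustrative Python sketch; the file name is a placeholder and the threshold is an arbitrary choice.

```python
# Illustrative sketch: count pixels with a near-clipped color channel,
# a rough proxy for the oversaturated "neon" look described above.
import numpy as np
from PIL import Image

def clipped_fraction(photo_path: str, threshold: int = 254) -> float:
    """Return the fraction of pixels with at least one channel at or above threshold."""
    rgb = np.asarray(Image.open(photo_path).convert("RGB"))
    clipped = (rgb >= threshold).any(axis=-1)
    return float(clipped.mean())

if __name__ == "__main__":
    # "beads_photo.jpg" is a placeholder file name for illustration.
    print(f"{clipped_fraction('beads_photo.jpg'):.1%} of pixels have a clipped channel")
```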

I’m sure that 35mm film has its own light sensitivity issues, but for some reason we find its behavior more lifelike, warm, and pleasing. In the same way that tubes sound better than transistors for certain audio uses, film looks better than tape.

For the record, here’s a picture I just took of an infrared remote using a digital camera. The bluish light is the infrared LED.