Film vs. video

A few questions I’ve always wondered about:

What is it about programs typically classified as “daytime TV” (especially soap operas) that gives them their distinctive look (a look also characteristic of Dr. Who and Masterpiece Theater)? I think it has something to do with using videotape instead of film, but why should this make such a difference?

And that being said, what is the difference between videotape and film, anyway?

Video lacks the contrast of film. Film has higher resolution. Film also has “grain”.

I read a quote from a filmmaker somewhere, to the effect of: “I could have a $10,000 video camera, and I could still make better pictures with a $100 Bolex.”

Actually (from my own website):

I don’t remember where I got that.

Lighting makes a big difference. TV in general is “flat”, without much appearance of depth. This is why you’d be profoundly disappointed if you ever actually saw a real TV studio. They’re small and two-dimensional; the TV camera merely gives the appearance of actual depth.

Because of this, lighting for TV is different from lighting for cinema. The typical TV lighting director (particularly on programs like talk shows, soap operas and other things that have to be produced quickly and cheaply) concentrates on getting the actors lit reasonably well, and then just floods light on the rest of the set. Film cinematographers love to play with light and shadow.

For an instructional moment, look at a window in a typical TV show vs. a window in a movie. Odds are the light “outside” the window in the TV show will be about the same intensity as the light in the room. A filmmaker will go to more trouble to make his “outside” look more like actual daylight or evening.

And the “grain” in film helps, although newer recording techniques have reduced the advantage somewhat.

The last “film” I worked on was shot on video but lit as if it were being shot on film. There is still a definite difference. But yeah, good lighting makes for a good film – or video!

I had a link that compared the resolution of super-16 film with digital video. The film still has higher resolution. Also, consider that video is comprised of horizontal lines. Film records its image on randomly arranged grains of emulsion.
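For a rough sense of scale, here’s a back-of-envelope pixel-count comparison. The figures are commonly quoted values I’m assuming, not numbers from the link above: NTSC video has about 480 visible lines at 4:3, and super-16 is often scanned at “2K”:

```python
# Back-of-envelope resolution comparison. These are assumed, commonly
# quoted figures, not measurements: ~640x480 visible for NTSC video,
# ~2048x1234 for a 2K scan of super-16.

ntsc = 640 * 480
super16_2k = 2048 * 1234

print(f"NTSC video:    {ntsc:,} pixels")
print(f"super-16 @ 2K: {super16_2k:,} pixels")
print(f"ratio: {super16_2k / ntsc:.1f}x")
```

And even this understates the difference, since film grain is randomly arranged rather than locked to a fixed grid of scan lines.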

Regarding “outside” and “inside” light, tungsten lights and sunlight are different “colour temperatures”. The colour temperature of sunlight is 5,500 K, while tungsten is 3,200 K or 3,400 K. If you use tungsten-balanced film outside, your film will be blue-ish unless you use an orange filter (at the cost of 2/3 stop). If you use daylight-balanced film under tungsten light, it will appear brownish unless you use a blue filter (at the cost of 2 stops).
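Those stop costs translate directly into light loss, since each stop halves the light reaching the film. A quick sketch (only the 2/3-stop and 2-stop figures come from the post above):

```python
# Each "stop" halves the light, so a filter costing s stops passes
# 1 / 2**s of the light. The 2/3-stop (orange conversion filter) and
# 2-stop (blue conversion filter) costs are the figures quoted above.

def light_transmitted(stops: float) -> float:
    """Fraction of light that passes a filter costing `stops` stops."""
    return 1 / 2 ** stops

print(f"orange filter (2/3 stop): {light_transmitted(2/3):.1%} transmitted")
print(f"blue filter (2 stops):    {light_transmitted(2):.1%} transmitted")
```

So the blue filter eats three quarters of your light, which is part of why shooting tungsten-balanced and correcting for daylight is the usual direction.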

One technique for balancing indoor lights and sunlight (e.g., when you have sunlight coming in through a window) is to put a #85 orange gel over the window. This can be done for video as well as film, as the white balance on video is analogous to the colour balance of different films. We were shooting a scene on the last project where the camera was inside looking out of an open door. We could not use a gel because A) we didn’t have one big enough; B) we couldn’t afford one; and C) it would have been seen in the shot. So we put blue filters over the lights and balanced the video for daylight. It worked.
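Since white balance on video is analogous to a film stock’s colour balance, here’s a minimal sketch of the idea: scale the colour channels so something known to be white comes out neutral. The sample values are invented for illustration:

```python
# Minimal sketch of what a camera's white balance does: compute per-
# channel gains so that a patch known to be white maps to equal R, G, B.
# The tungsten "white" reading below is made up for illustration.

def white_balance(pixel, reference_white):
    """Scale RGB so `reference_white` comes out neutral."""
    target = max(reference_white)
    gains = [target / c for c in reference_white]
    return [p * g for p, g in zip(pixel, gains)]

# Tungsten light is orange-ish: a white card reads strong in red,
# weak in blue.
tungsten_white = [255, 200, 140]
print(white_balance(tungsten_white, tungsten_white))  # becomes neutral
```

Balancing the video for daylight and putting blue filters over the tungsten lights, as described above, makes both sources land near the same reference white.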

Film vs. Videotape thread #1
Film vs. Videotape thread #2
Film vs. Videotape thread #3
Film vs. Videotape thread #4

Of interest: the BBC used to shoot indoor scenes on videotape and outdoor scenes on film (not sure why, tho’). There’s a great Monty Python sketch where they mock the practice.

Things that contribute to the different look of video and film:

The aperture plate (the rectangle in which the image is caught) is much smaller inside a television camera than in a 35mm film camera. The smaller the aperture area, the more “depth of field” the image has. Depth of field is the range of distances in the picture that appears in focus.
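The format-size effect can be sketched with the standard thin-lens approximation: for the same framing, a smaller imaging area needs a proportionally shorter lens, and depth of field grows. The lens and circle-of-confusion numbers below are illustrative assumptions, not specs from any camera mentioned here:

```python
# Rough thin-lens depth-of-field sketch. Valid when the subject is much
# closer than the hyperfocal distance. All numbers are illustrative.

def total_dof(focal_mm, f_number, subject_m, coc_mm):
    """Approximate total depth of field in metres."""
    s = subject_m * 1000  # work in millimetres
    dof_mm = 2 * f_number * coc_mm * s ** 2 / focal_mm ** 2
    return dof_mm / 1000

# 35mm film: ~50 mm lens, circle of confusion ~0.03 mm. A television
# pickup roughly 4x smaller needs ~12.5 mm for the same framing, with
# a proportionally smaller circle of confusion (~0.0075 mm).
print(total_dof(50, 4, 3, 0.03))      # film: shallower focus
print(total_dof(12.5, 4, 3, 0.0075))  # video: much deeper focus
```

The focal length enters squared but the circle of confusion only linearly, so the smaller format ends up with roughly four times the depth of field at the same f-number and framing. That’s why almost everything in a TV image is in focus at once, which flattens the picture.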

Television cameras in the U.S. record 30 pictures per second (60 interlaced fields per second); film cameras record 24 pictures per second. Thus, movement captured on video looks smoother than on film because it is broken down into more discrete samples, with less blur.
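Related to that 24-vs-30 mismatch: when film is transferred to U.S. video, the standard “3:2 pulldown” alternately holds each film frame for three fields, then two, so four film frames fill ten fields (five interlaced video frames). A sketch of the pattern:

```python
# 3:2 pulldown sketch: fit 24 film frames/sec into 60 video fields/sec
# by holding film frames for 3 fields and 2 fields alternately.
# Every 4 film frames -> 10 fields -> 5 interlaced video frames.

def pulldown_fields(film_frames):
    """List which film frame fills each video field under 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2
        fields.extend([frame] * hold)
    return fields

fields = pulldown_fields(["A", "B", "C", "D"])
print(fields)       # A A A B B C C C D D
print(len(fields))  # 10 fields = 5 video frames
```

The uneven 3/2 cadence is part of why film broadcast on TV still keeps its slightly juddery 24-frame motion instead of looking like native video.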

I hope you don’t mind a hijack, but this seems like a close enough topic.

Why do deleted scenes on DVDs look so atrocious? What in the world do they do to the picture to make it go from that to film/DVD quality?

“Deleted scenes” might be a bit of a misnomer. They’re more likely scenes that were never in the film in the first place: footage shot in case it might be used, then dropped at some point in the editing process.

Look in the credits, and you’ll see someone called the “colour timer”. When a film is in the editing stage, the filmmakers might use a “one-light pass”. That is, the workprint is not colour corrected or adjusted for different lighting situations. They’re only using the footage for editing, and it’s cheaper that way. (Low-budget filmmakers sometimes just get a B&W workprint.) Once the film is edited, the “negative cutter” conforms the negative to match the workprint. Then a master print is made from the negative, using more care than for the workprint. When “deleted scenes” are included on a DVD, they are often made from the workprint.