Visual difference -- video vs. film

I’m sitting here watching the Twilight Zone marathon. When the episode “Long-Distance Call” came on, I immediately said, “That episode is on video, not film.” Turns out it’s one of only six episodes in the whole series that were shot on video.

Anyhoo, the difference between video and film is very recognizable. But what exactly is it that I am perceiving that tells me it’s one and not the other?

Contrast, depth of field, grain.

Video is interlaced at 60 fields per second (30 full frames per second). Film is 24 fps, non-interlaced.

Film also has underlying random “grain” in the negative. If you were to inspect just a single frame of film with a magnifying glass, that grain would look like tiny dirt or “noise,” but when you speed it up to 24 fps motion, the individual grains disappear and the grain “warms up” the image.

Film also has a different color and saturation curve than video, but this looks to be a minor difference compared to the two factors above.

When film pans, it can judder a little, sometimes.

When video pans, it is usually very smooth, perhaps excessively so, and sometimes it feels a little sweepy and floaty. Any lights in shot will leave a subtle trail.

Even if something is filmed, it still needs to be converted to a format which can be broadcast. After all, they cannot shine the projector into the antenna. :slight_smile: Film at 24 fps doesn’t go evenly onto a 30 fps videotape, so before you even get to the broadcast stage, you’ve had to convert the original film to a video-friendly format. It is then broadcast and displayed at 60 fields per second on your screen.

Someone like Johnny L.A. or Cartooniverse can explain better, but I believe video and film also have different lighting requirements; what is sufficient light for video might be too dark for film. In order to get a true notion of how they differ you’d have to compare side-by-side film and video of the same source.

I’ll have to come back to this. Trying to finish a file before the long weekend.

OK, let me try to answer…

First, film has grain, as Ruminator pointed out and as I mentioned in my one-line reply. Film is a strip of clear material with an emulsion on it. The emulsion is sensitive to light. When it is exposed, it contains a ‘latent image’. Upon processing, the latent image becomes visible. The emulsion may produce colour or black-and-white images, depending on its composition. The thing about the emulsion is that it’s made up of small particles that are uniformly but randomly dispersed on the base. That is, a particle may exist at coordinates X, Y on one frame, but an identical particle isn’t in exactly the same position on the next frame. Multiply this millions of times (or however many times) across the frame, and the grains will seem to ‘swim around’ a bit. Larger grains are more sensitive to light than smaller ones, so ‘fast’ film will appear more grainy than ‘slow’ film and there will be more apparent movement or ‘noise’ (to use a video term).

Video pixels are arranged in a fixed matrix. Since there is no ‘grain’ in the film sense, you don’t get the ‘swimming’ effect on the picture. This, I think, gives video a ‘harder’ look than film.
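If a toy example helps, here’s a rough sketch in Python/NumPy of the distinction (a flat grey test ‘image’ and made-up noise levels, so purely illustrative): film gets a fresh random grain pattern on every frame, while anything sitting on video’s fixed pixel matrix stays put from frame to frame.

```python
import numpy as np

rng = np.random.default_rng(0)

def film_frame(image, grain_strength=0.05):
    """Film: a fresh random grain pattern on every frame,
    so the noise 'swims' from frame to frame."""
    grain = rng.normal(0.0, grain_strength, size=image.shape)
    return np.clip(image + grain, 0.0, 1.0)

def video_frame(image, fixed_pattern):
    """Video: anything tied to the sensor's fixed matrix sits
    in the same place on every frame."""
    return np.clip(image + fixed_pattern, 0.0, 1.0)

image = np.full((480, 640), 0.5)                  # flat mid-grey test frame
fixed = rng.normal(0.0, 0.05, size=image.shape)   # generated once, reused

film_clip = [film_frame(image) for _ in range(24)]
video_clip = [video_frame(image, fixed) for _ in range(24)]

# Averaging the film frames washes the grain out; the eye does
# something similar at 24 fps, which is why a single frame looks
# dirtier under a magnifying glass than the moving picture does.
print(np.std(film_clip[0]))                 # grainy single frame
print(np.std(np.mean(film_clip, axis=0)))   # much cleaner over time
```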

‘Latitude’ is the degree to which film can be over- or under-exposed and still get a good image. Film has greater latitude than video, so you can get richer contrasts with film.

For any given lens, there is only one plane at which the subject is precisely in focus on the recording medium (film or electronic device). However, focus falls off gradually, so there is a range within which the image appears to be in focus. This is called Depth of Field. When making an image, light is controlled to produce a properly-exposed image. Given a recording medium with a set sensitivity (e.g., film at a certain ASA/EI rating), light can be controlled by opening or closing the aperture, increasing or decreasing the exposure time by varying the shutter speed, varying the shutter angle, or increasing or decreasing the amount of light through the addition or subtraction of filters and/or lights. It’s the aperture that controls Depth of Field. A small aperture provides a deeper DoF, and a large aperture provides a shallower one.

In my opinion lenses for video cameras tend to be ‘not as good’ as film lenses – at least for consumer and prosumer cameras. A typical video zoom lens might cost a few hundred to a few thousand dollars. A typical zoom lens for a 16mm camera might cost ten times that. A set of used Zeiss Superspeed prime lenses (i.e., say, five non-zoom lenses) will set you back about $20,000 or more. There’s a company called Redrock Micro that makes a unit that allows prosumer-level (and pro-level) videographers to use 35mm cinema lenses on today’s small production video cameras. While the aperture controls DoF for a given lens, the choice of lens also affects DoF. Basically, the longer the focal length, the shallower the DoF; and the larger the recording medium, the shallower the DoF. Since the ‘chips’ in a video camera are smaller than a frame of 35mm film, video cameras tend to have a deeper DoF. Many videographers are now starting to use the Redrock unit (or a similar one) to capture images with a shallower, more ‘film-like’ DoF. But not everyone does.
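If anyone wants to play with the numbers, here’s a back-of-the-envelope Python sketch using the standard hyperfocal approximation. The circle-of-confusion figures and the ‘roughly equivalent’ 12mm small-chip lens are rule-of-thumb assumptions of mine, not measured values:

```python
def depth_of_field(f_mm, f_stop, subject_m, coc_mm):
    """Near/far limits of acceptable focus, via the standard
    hyperfocal approximation. f_mm: focal length; f_stop: aperture;
    subject_m: focus distance; coc_mm: circle of confusion."""
    s = subject_m * 1000.0                       # work in mm
    H = f_mm ** 2 / (f_stop * coc_mm) + f_mm     # hyperfocal distance
    near = H * s / (H + (s - f_mm))
    far = H * s / (H - (s - f_mm)) if s < H else float("inf")
    return near / 1000.0, far / 1000.0           # back to metres

# Rule-of-thumb circles of confusion (assumed): ~0.03 mm for a 35mm
# frame, ~0.008 mm for a small video chip. Subject at 3 m throughout.
print(depth_of_field(50, 2.8, 3, 0.03))    # 35mm, 50mm lens: ~2.7-3.3 m
print(depth_of_field(50, 8.0, 3, 0.03))    # stop down -> deeper DoF
print(depth_of_field(100, 2.8, 3, 0.03))   # longer lens -> shallower DoF
print(depth_of_field(12, 2.8, 3, 0.008))   # small chip, short lens for
                                           # similar framing -> much deeper
```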

So the differences I (personally) see are that film has a ‘live’ quality to it due to the grain of the emulsion, film has greater latitude (and contrast) than video, and DoF seems to be more controllable on film than on video.

YMMV.

Thank you, Johnny L.A.! That was exactly the kind of detailed explanation I hoped I’d get.

Also, you need to remember that video was in its infancy when TZ was filmed. The differences were much more apparent than they are now - you can spot black “solarization” overexposure all through those episodes.

I mean no disrespect to Johnny L.A., especially because everything he wrote was absolutely correct. But the points he describes apply more to the differences between seeing film projected in a theater and looking at a video monitor than to explaining how the OP could tell that episodes he saw on TV were shot on video rather than film. For instance, the greater dynamic range of film is essentially lost when it is converted to video. Likewise, it is relatively rare (IME) for film grain to be noticeably visible when a film is broadcast.

The main difference, IMHO, that makes original video look different from film (when shown on TV) is the way video highlights the edges of objects. It is this effect (plus the 30 fps frame rate) that makes people feel video is more “real” than film.
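My understanding (an assumption on my part, not something established in this thread) is that this edge highlighting comes from the ‘detail’ or aperture-correction circuitry in video cameras, which behaves like an unsharp mask. Here’s a toy Python/NumPy sketch of what that does to one scan line, with made-up amount and radius values:

```python
import numpy as np

def enhance_edges(scanline, amount=0.8, radius=2):
    """Crude unsharp mask on one scan line: subtract a local average
    (the blur) from the signal and add the difference back. The result
    over- and undershoots at edges -- the outlined 'video look'."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.convolve(scanline, kernel, mode="same")
    return scanline + amount * (scanline - blurred)

# A simple dark-to-light step (ignore the zero-padded array ends):
line = np.array([0.2] * 10 + [0.8] * 10)
print(enhance_edges(line).round(2))  # dips to ~0.01 and peaks at ~0.99
                                     # on either side of the step
```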

Another factor is that video cameras in the early 1960s needed much more light than the B&W film stocks of the day, so TV studio lighting was typically much flatter than lighting for film. As a result, video images were lower contrast than film. (This is related to, but not quite the same as, the dynamic range issue Johnny L.A. mentioned.)

I don’t recall the eps the OP mentions, but a few other factors may have subtly tipped the fact that it was video: if it was shot multi-camera, the look and pacing of the editing would have been quite different from a single-camera film shoot. Also, sets for video were often notably cheaper and less realistic than film sets. Finally, at the time, video was rarely shot outside. (Think of the Monty Python bit: “Gentlemen: this room is surrounded by film!”)

So if the shows were obviously studio bound, that would provide a nearly subliminal hint that they were shot on video, entirely apart from the appearance of the image itself.

Hey, no worries. In some ways it’s comparing apples to oranges. Also, I shoot 16mm, which does not have the resolution of 35mm, and changes to digital video are coming so rapidly it’s hard to keep up. It’s true that grain is usually not noticeable when transferred to video and shown on a TV screen, although it often is with faster film. I think the ‘highlighting of edges’ may go to the random distribution of emulsion particles vs. the fixed matrix of video. Although I must say I’ve seen some very good video that in isolation looks for all intents and purposes like film. (You can do a lot in post nowadays.)

Although the OP was referring to 45-year-old productions, I was speaking more generally on film vs. video. Indeed, much light was needed back then. Makeup was ‘interesting’ too. Some of it was like clown makeup when seen on film instead of video. The contrast between film and video was much more pronounced back then, especially when it came to ‘hot spots’.

Re: Monty Python. I’ve seen footage of them using Arriflex 16BL cameras. At under 20 pounds, they were much more portable than the huge video cameras of the day. That particular sketch was playing on the habit of studios to shoot video in a studio and film on location.

About film-to-video transfer:

The film is projected directly into a camera (intervening lenses take care of whatever needs to be taken care of). The 24- to 30-frame conversion is accomplished by projecting every fourth film frame twice (i.e. the video camera sees film frames 1 - 2 - 3 - 4 - 4 - 5 - 6 - 7 - 8 - 8 etc.). This would account for the jerkiness in film pans that someone mentioned.

(Of course NTSC color video operates at 29.97 frames per second. I presume they occasionally skip a repeated film frame to keep everything in sync.)
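If it helps, here’s a toy Python sketch of the frame mapping just described (ignoring the 29.97 wrinkle, and ignoring that real telecine interleaves interlaced fields in a 2-3 pattern rather than repeating whole frames):

```python
def pulldown(film_frames):
    """Repeat every fourth film frame: 4 film frames become
    5 video frames, so 24 fps comes out as 30 fps."""
    video = []
    for i, frame in enumerate(film_frames, start=1):
        video.append(frame)
        if i % 4 == 0:
            video.append(frame)  # the doubled frame
    return video

print(pulldown(list(range(1, 9))))
# [1, 2, 3, 4, 4, 5, 6, 7, 8, 8] -- the sequence given above.
# Those doubled frames arrive at regular intervals, which is the
# slight hitch you can see in film pans on TV.
```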

Those video TZ episodes sound different too. Sound recorded directly on video tape in a studio sounds different than film sound.

To me, the videotaped episodes look more ‘real’. I know it’s psychological, but video just makes me feel more like I’m looking at the real thing than film, even though the video episodes look terrible compared to the filmed ones.

I visited my friend who had recently purchased one of these mega-resolution TVs, and noticed a movie that happened to be playing.

At first I thought that it must have been some older movie that I didn’t recognize which had been taped rather than filmed - because it gave me the same impression of video vs film as mentioned by the OP.

Imagine my surprise to find that it was “Iron Man” in maximum HD resolution from a Blu-ray disc.

Is this normal, or was it perhaps some adaptation to the TV’s resolution, or something else?

The biggest difference between “film look” and “video look” is in the frame rate. As mentioned before, film generally is shot and projected at 24 frames per second. Video is essentially 60 frames per second. Video cameras shooting “progressive” mode can emulate the film look by shooting 24 frames. Close but not exactly a film replacement.

Depth of field, and in this case, narrower depth of field, can be achieved in either film or video production so that’s not really a sole characteristic of film. I think maybe some equate narrow depth of field with narrative filmmaking but that’s just style and not so much look.

Latitude, or the ability to expose an image ranging from bright portions of the frame to very dark in the same frame, really is just a characteristic of film that exceeds the current ability of video. But it doesn’t really give film a “film look”. Take for instance a composition which only spans 3 stops of exposure. In other words the bright and dark sections of the image aren’t wide ranging. Shot on film, the image doesn’t stop looking like film just because it doesn’t cover 9 stops of latitude.
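To put numbers on that: each stop is a doubling of light, so the brightness range grows as a power of two. Trivial Python, just to make the arithmetic concrete:

```python
def contrast_ratio(stops):
    """Each stop doubles the light, so n stops of range span
    a 2**n : 1 brightness ratio."""
    return 2 ** stops

print(contrast_ratio(3))  # 8   -- the narrow 3-stop composition above
print(contrast_ratio(9))  # 512 -- wide-latitude film territory
```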

Grainwise, when I shoot on film, I (and others as well) generally try to achieve invisible grain. Or as close as possible.

Yep, definitely the frame rate. Years ago I was fiddling with some video footage run through a device which could freeze frames and alter the speed of playback. I got it to skip every other frame (actually field, which is a frame broken into halves of odd and even scan lines) and voilà: “film look”. Although it was at 30 frames rather than 24, it was still pretty close. Everyone I showed it to thought it was 16mm film.
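For anyone who wants the trick spelled out, here’s a toy Python sketch. The field representation is my own invention for illustration, not how that device actually stored anything:

```python
def film_look(interlaced_frames):
    """interlaced_frames: list of (odd_field, even_field) pairs,
    i.e. 60 fields per second. Keep only the odd field of each frame
    and reuse it for both, throwing away the 60 Hz motion updates:
    playback becomes 30 whole pictures per second -- close to film."""
    return [(odd, odd) for odd, _even in interlaced_frames]

# Toy clip: fields labelled by capture time in 60ths of a second.
frames = [(0, 1), (2, 3), (4, 5)]
print(film_look(frames))  # [(0, 0), (2, 2), (4, 4)]
```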

What format and camera do you use, and what film do you like?

Aaton for 16 and mostly Arri BLs for 35. Liking the Vision2 250D. Have you tried the Vision3, Johnny? But mostly shooting HD nowadays. Love the Varicam.

I agree with this. What I always notice in video is that small movements of people’s bodies appear much more pronounced and somehow “faster”. That has to be the frame rate.

No, I haven’t used the Kodak; except for super-8, I like Fuji 125T for the no-budget stuff. Pretty versatile. We used Fuji 250T for one indoor scene in a film (I wasn’t on the camera for that one) and it was a bit grainy. You can tell the difference. Anyway, I like the saturation with the Fuji.

For B&W I like Pan-X. Nice and clean. I have some Plus-X, but it’s a little on the grainy side.

I’ve shot with a Bolex H16-M5 and Rex 1, Krasnogorsk-3, Beaulieu R16, Eclair NPR, and Arri 16ST. I have an Aaton LTR-54 super-16, but haven’t had the opportunity to shoot with it yet. It’s two feet in front of me right now. I have a single 100-foot roll of single-perf Fuji 125T I can use in it, and lots of double-perf Fuji, Plus-X and Pan-X I can’t. I had an Arri 35BL that I was going to use for a little project, but the project I was working on at the time stalled and the studio closed. Suddenly I had a camera and nothing to use it on and no one to use it with. I traded it for the Aaton and an Arri 2B, then sold the 2B back to the guy.

I’ve been using my Panasonic AG-DVX100A. Pretty good for a non-HD camera. I think 24p footage looks like 16mm on the monitor. I’m writing a project now that I want to shoot on the Aaton in B&W.

I’ve got a K-3. Love that little camera. I’ve shot a few projects with the DVX. That’s a really impressive camera.