What is the difference between videotape and film?

Not the physical difference, but the “philosophical” difference.

Soaps are typically shot on videotape, as have been some comedy programs over the years.

What is the “artistic” reason (or whatever other reason) for choosing videotape as opposed to film for the medium?

Film seems to look a bit “cleaner” in the finished product.

Well, price is a major concern. Videotape is extremely cheap compared to film. Of course, there have been a lot of sitcoms shot on film (MASH, Cheers, Newhart, The Bob Newhart Show, etc.), but I don’t think there’s ever been a one-hour drama shot on video. Film is a LOT more flattering than video and there’s more flexibility in editing (excluding the newer trend of using computerized video editors).

Film requires developing, which is expensive in its own right but also has the big disadvantage that if a scene doesn’t look right on screen, or, worse, some technical problem damages the film, it has to be re-shot the next day. With videotape, the raw footage can be examined immediately after the shot, and a re-shoot can be done right away if necessary, while the actors are still in costume and makeup and the set is still standing. Videotape is also easy to copy, whereas film requires specialized equipment.

Another reason to shoot on film is that it is intrinsically a very high resolution medium. You always want your originals to be the highest quality possible.

Videotape has a cleaner picture on television. This should be no surprise, since VT was meant for the medium.
In North America our TVs flash along electronically at 30 frames per second. Videotape records and plays back the pictures that way – 30 frames.
In North America most motion picture film moves at 24 frames per second. This makes for a mismatch when projecting films on TV. To compensate for the different speeds, projectors at television stations would hold something like the third or fourth frame for an extra electronic scan (this was in a textbook at college, do you really want to know?) to even the action out.
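
What that textbook was describing is the standard “3:2 pulldown” cadence used when film is telecined to video. Here’s a minimal sketch (Python, purely illustrative – the frame labels are made up) of how alternating film frames are held for two and three video fields so that four film frames fill exactly ten fields:

```python
# Minimal sketch of 3:2 pulldown: spreading 24 fps film frames
# across ~60 interlaced video fields per second.

def pulldown_32(film_frames):
    """Yield (frame, field) pairs for a run of film frames."""
    cadence = [2, 3]  # fields per film frame, alternating
    for i, frame in enumerate(film_frames):
        for field in range(cadence[i % 2]):
            yield (frame, field)

# Four film frames A-D become ten fields (five video frames),
# turning 24 frames of film into 30 frames of video per second.
print([f for f, _ in pulldown_32("ABCD")])
# -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```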

*The untrained eye can tell the difference between videotape and film – this is important for what comes up next.*

After videotape equipment was introduced by the Ampex Corporation (I seem to recall the first public use of videotape was to rebroadcast Douglas Edwards and the News to the West Coast time zone), the soap operas jumped onto it. Videotape was custom-made for TV, was cheap to use, and looked as good as live broadcasts.
On the other hand, motion picture film may have had a slightly degraded picture as compared to videotape, but it was still a FILM. You could see great plots and great actors and actresses on NBC’s “Saturday Night at the Movies.”
It didn’t take long for a lot of the general public to associate the “look” of videotape (cleaner though it was) with low-budget material.
That perception might be a part of your philosophical question.

I was watching “Escape from Mars” on cable and thought it looked very odd; very cheesy. I couldn’t put my finger on it, but IMDb said it was shot on videotape…

What exactly would have led me to notice the difference?

Possibly the answer might be to do with the fact that when it was first introduced, video did not have as good a resolution as film but was much cheaper.

With film having an established history and tradition, I guess video was looked upon as a poor producer’s medium, especially since it was used in ‘low-brow’ productions such as soap operas and TV advertising, where low cost is important.

It is human nature to rank things hierarchically – some might say snobbishly – and I think in this case there is an element of that.

It would be possible to manufacture video systems with very much higher resolution. U-Matic is taken as a professional standard, and that has been around a good 15 to 20 years; things have moved on enormously in that time. We could do way better, but it would not be economic.

It will be interesting to see what happens if HDTV ever takes off.

Possibly the answer might be to do with the fact that when it was first introduced, video did not have as good a resolution as film but was much cheaper.

Actually, film has always had a much, much higher resolution than broadcast video. NTSC video has 525 lines of resolution (486 usable) vertically. The scanlines are typically thought of as being 720 pixels wide (again, this is NTSC; PAL/SECAM has slightly different [slightly better] resolution).

Film, being a chemical rather than an electronic medium, doesn’t have resolution per se, but compositing for digital effects in movies is typically done at resolutions from 1000×1000 up to 4000×4000 pixels, depending on how finicky the shot is. 2000×2000 pixels is a reasonable definition for ‘film res’.
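
To put those numbers side by side, here’s a quick back-of-envelope comparison (Python, using just the figures quoted above):

```python
# Rough pixel-count comparison using the numbers quoted above:
# NTSC's usable 720x486 raster versus a nominal 2000x2000 film scan.
ntsc = 720 * 486        # ~350,000 pixels per frame
film = 2000 * 2000      # 4,000,000 pixels per frame
print(f"NTSC: {ntsc:,} px  film: {film:,} px  ratio: {film / ntsc:.0f}x")
# -> NTSC: 349,920 px  film: 4,000,000 px  ratio: 11x
```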

Other differences between film and video:

  • Film has a much better contrast ratio than video, i.e. the number of gradations between full-black and full-white is much greater, and blacks are indeed black, where on video they’re a deep charcoal grey. This makes film look more ‘vivid’, even when film is transferred to video. This is why most (non-soap-opera) television programs are shot on film, then transferred (telecine’d) to video for broadcast.

  • Video has much better motion reproduction than film. Film is shot at 24 frames per second, whereas NTSC video is 29.97 (not 30) fps. But the video fps number is misleading: since video is interlaced, not full-frame, the actual scan rate for video is 59.94 fields per second, so motion in video is redrawn nearly 2.5 times faster than on film.

Some of these differences will be reduced when HD is popular. The 1080p HD format approaches film resolution, although film will probably always have a better contrast ratio than videotape. HD can also optionally run at 24 fps or at video frame rates.

The OP said “What is the ‘artistic’ reason” – he doesn’t want the physical reasons…

e.g., with film there is more control over the dramatic and emotional impact.

Interesting response, but this non-techie finds film so much more pleasing to the eye–the color saturation, the sheer quality of the image beats videotape any day.

That said, I’ve seen old Super Bowl games that I’m guessing were shot on film. They didn’t look good compared with today’s live broadcasts. Watching an old Steelers Super Bowl game looks like something from the Paleolithic era.

I was going to mention contrast, but squeegee beat me to it.

About video frame rate (and I’ll use 30 fps since it’s easier to use than 29.97): A television screen in the U.S. operates on 60 Hz alternating current. The frame rate should be 60 fps, but only half of the lines are scanned at a time. That is, the tube “draws” lines 1, 3, 5, 7 and so on, then goes back and draws lines 2, 4, 6, 8 and so on. That makes a frame rate of 30 fps, not 60. European televisions operate at 50 Hz and the standard film frame rate there is 25 fps, so it’s convenient for them that 25 goes evenly into 50, as opposed to 24 going into 30.
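
A toy illustration of that odd/even field order (Python; the line count is shrunk to 10 just for display – NTSC actually uses 525):

```python
# Toy illustration of interlaced scanning: the first field draws the
# odd-numbered lines, the second field fills in the even ones.

def interlaced_fields(num_lines):
    odd = list(range(1, num_lines + 1, 2))   # field 1: lines 1, 3, 5...
    even = list(range(2, num_lines + 1, 2))  # field 2: lines 2, 4, 6...
    return odd, even

f1, f2 = interlaced_fields(10)
print("field 1:", f1)  # [1, 3, 5, 7, 9]
print("field 2:", f2)  # [2, 4, 6, 8, 10]
# 60 fields per second, but only 30 complete frames per second.
```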

As other posters mentioned, video seems “flat”. Film has “grain” which is formed by the chemicals making up the emulsion. There are “fast” films (which need less light to make an image) and “slow” films (which need more light to make an image). Fast films have more grain than slow films, which can change the “feel” of a film. If you want a “gritty” look, use fast film.

Video can be used effectively. Lars von Trier has been using digital video with much success. Some films are shot on video and then transferred to film for distribution. This gives you the inexpensive production costs of video with the grain of film. I’ve seen footage shot on DV that was transferred to 35mm film and it looks great! But it’s expensive to get the best product. The savings gained by using video can be lost in the transfer process.

That being said, the cost of film stock is usually not that great compared to the cost of the overall production. Of course if you have more money, you’re more likely to shoot more takes to make sure you have several to choose from. And there is the cost of developing, duplication etc. But when you have a $20 million budget, the film stock is not as great a concern as when you have a $40,000 budget.

It’s likely that the film was 16 mm; you could tell the difference in the film stock used at that time.
Modern film stock apparently is better. “Dr. Quinn, Medicine Woman” was filmed on 16 mm (as are many commercials). The studio said that film allowed for more flexible editing and that the appearance was good enough for a standard television picture. They did admit that with the advent of HDTV, the use of 16 mm film would become too apparent to the average viewer.

First, to answer the O.P.: part of it is pure budgeting, part is aesthetics. It is actually easier and faster to shoot on location with film. Instead of doing an entire remote set-up each time you get to a location, your gear is confined to the hand carts and the dolly/crane/jib/Steadicam being used for that set-up. Live remote video jobs are by and large a huge pain in the ass. As for costs, it’s not just the cost of 4 minutes of 35mm film (400 feet) as opposed to 11 minutes of 16mm film (400 feet) as opposed to 4 OR 11 minutes of videotape stock. It’s the support systems needed for each choice. You need more electronic and maintenance support for a large video production than you do for a film shoot.
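
Those run times check out against the standard frames-per-foot figures (35mm runs 16 frames per foot; 16mm runs 40). A quick sanity check in Python:

```python
# Back-of-envelope check on the 400-foot run times quoted above,
# using standard frames-per-foot figures at 24 fps sound speed.

def runtime_minutes(feet, frames_per_foot, fps=24):
    return feet * frames_per_foot / fps / 60

print(f"400 ft of 35mm: {runtime_minutes(400, 16):.1f} min")  # ~4.4 min
print(f"400 ft of 16mm: {runtime_minutes(400, 40):.1f} min")  # ~11.1 min
```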

I’ve done Sex and the City. It’s a fairly large, very well-funded show. It’s a hit. They shoot it in 16mm. The cameras are lighter, and you don’t have to reload as often (see above). You can move faster and do more work per day with a film unit, IMHO. While live television may generate a few hours of material straight in the case of a sports event or live awards show, MOST of the time the set-up and tear-down time is incredible compared to the wrap time on a set, even a large TV or feature set.

The negative is transferred directly into a computer editing system. Work prints are not struck; those went the way of the dodo bird at least ten years ago. (Let’s not get into student films or documentaries here. I am well aware that one can still cut on film.) In fact, what Mr Blue Sky said here – that film offers more flexibility in editing – is exactly the opposite of the truth. Virtually all professional jobs are cut on computer these days. Flexibility in film editing? Steenbeck and KEM and Moviola are about it for film. Every Tom, Dick and Harry makes a computer-based editor now.

Now we get into the debate over artistic merits. It’s a tougher debate now that High Definition shows are being produced on at least a semi-regular basis. I’m more fond of the aspect ratio of Hi-Def than I am of the look. The depth of field is funky, and it’s merciless on focus. I prefer film on a personal level because it is, beginning to end, an organic process.

You shoot a living thing (forget dreck like “Shrek” for the moment, mkay?) in front of lights, and capture the images on a light-sensitive material. Use chemicals, make a medium through which you pass light in order to view those same images. It’s satisfying on many levels. The resolving capabilities of film are still daunting. It also just has a different taste to it. I’ve used a lot of different filtration, adjustments, skin-level tweaking, etc. – all to try to get the film look. There is a device that “emulsifies” videotape shots; it adds in the grain patterning inherent in film. Or, at least, inherent in older film. Now the film stock is a lot sharper than it used to be.

Videotape, or digital storage and recording, removes the organics. To me, it’s MUCH harder to light someone for videotape and make it look like something other than soap opera shit than it is for film. Videotape doesn’t have a sense of depth the way film does (subjectively speaking again). While it is true that one can shoot DigiBeta or another digital medium, cut on computer, master it to a terabyte disk, and video-project it without EVER LOSING A GENERATION from the original shoot day, the overall quality is still lower than that of film.

I don’t mean to be a “film snob” here; it’s just how I feel. Lighting Directors and D.P.s who can do wonders with video shows have my utmost respect – those shows are murder. Adding just a little taste of a light here and there is a skill, just as operating a shot well is a skill.

( Shit, catering so that the jalapeno poppers are fresh when we break is a skill… )

Johnny LA is right; tape-to-film costs are brutal as hell. It’s an interesting look, though… and for some, a choice made for purely aesthetic and not budgetary reasons.

Cartooniverse

[hijack]
Doug Bowe:

OK, question: you know those OLD films from back in the silent era? How, when we see clips from them broadcast on TV or embedded as footage within a modern film, everything is always sped up, so adults walk along with the rapid, jerky motion of toddlers and so on?

It was explained to me at one point that, no, it wasn’t the fad back then to film things in such a silly way; it was because those old, old films were shot at fewer frames per second, so when they show them (or clips from them) on modern equipment, you see them sped up as described.

And I thought, at the time, “That’s stupid, why don’t they adjust the frame rate so it looks normal when being shown on modern equipment?” …but then I figured maybe that was more difficult than I realized.

Now, in light of what you say they do when showing film on a television station, I have to ask what I originally thought: Why the bloody hell do we always see those old old films being shown at 4/3 or 3/2 the speed at which they were originally filmed?

[/hijack]

AHunter 3:

Cecil addressed the speeded-up silent movie question in

http://www.straightdope.com/classics/a4_067.html

The main reason most old silents and newsreels are shown at the wrong speed is that people don’t have the proper equipment to show them at the correct speed, and (as Cecil explains) adding extra frames to the film so it can be shown correctly on standard equipment is an expensive PITA.

Of course, this may change in the digital era. Adding the extra frames needed could now be done at the touch of a button, if you were creating a new digital master.
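
For instance, here’s a minimal sketch (Python, with made-up frame labels) of that kind of digital retiming – repeating source frames so that footage shot at, say, 16 fps plays at its original speed on 24 fps equipment:

```python
# Minimal sketch of digital frame-rate conversion by duplication:
# stretching silent-era footage so the action plays at natural speed.

def stretch(frames, src_fps=16, dst_fps=24):
    out = []
    for i in range(len(frames) * dst_fps // src_fps):
        # pick the source frame nearest this output instant
        out.append(frames[i * src_fps // dst_fps])
    return out

# Every pair of source frames becomes three output frames.
print(stretch("ABCD"))  # -> ['A', 'A', 'B', 'C', 'C', 'D']
```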

On the other hand, by now most people believe that old movies are supposed to look jerky and sped up. So don’t expect a rush to fix the problem.

Cartooniverse

The ‘depth’ of a moving visual image is related to the range of frequencies available to carry all the visual information.

It’s the equivalent, in audio terms, of dynamic range – or, in simpler terms, of comparing AM with FM broadcasts.

The reason the range of information on electronic video is restricted is to reduce the transmission bandwidth, since video was primarily aimed at TV.

You could make an electronic system with much greater bandwidth that would be suitable for video mastering, but it would be very specialised and so cost a fortune.
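
To give a feel for that bandwidth limit, here’s a rough back-of-envelope calculation (Python, using nominal NTSC figures – about 4.2 MHz of luminance bandwidth and about 52.6 µs of active line time; an approximation, not a spec):

```python
# Rough sketch of how channel bandwidth caps horizontal detail in NTSC.

luma_bandwidth_hz = 4.2e6   # nominal NTSC luminance bandwidth
active_line_s = 52.6e-6     # visible portion of one scanline

# One cycle of the highest frequency can carry two picture elements
# (one light, one dark) along the line.
elements_per_line = 2 * luma_bandwidth_hz * active_line_s
print(f"~{elements_per_line:.0f} picture elements per line")  # ~442
# Divided by the 4:3 aspect ratio, that's roughly the ~330 'TV lines
# of horizontal resolution' usually quoted for NTSC.
print(f"~{elements_per_line / (4 / 3):.0f} TV lines per picture height")
```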

Johnny LA:

About video frame rate (and I’ll use 30 fps since it’s easier to use than 29.97): A television screen in the U.S. operates on 60 Hz alternating current. The frame rate should be 60 fps, but only half of the lines are scanned at a time. That is, the tube “draws” lines 1, 3, 5, 7 and so on, then goes back and draws lines 2, 4, 6, 8 and so on. That makes a frame rate of 30 fps, not 60.

We’re both saying almost the same thing.

Yes, it’s true that you can think of the video frame rate as 30 [or 29.97] fps, but remember: half the scanlines are offset in time by 1/2 of a frame time or 1/59.94th of a second.

So, while the theoretical frame rate for detail is 30fps, the amount of motion seen is effectively 60fps [or 59.94].

To contrast this with film: there are no scanlines in film and no fields – the entire frame appears at once on the screen. So a film frame rate of 24 fps is really 24 fps.
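
A tiny timing sketch (Python; the numbers fall straight out of the rates discussed above) makes the difference concrete:

```python
# When does the viewer get a new motion sample? Interlaced video
# delivers a new field every 1/59.94 s; film delivers a whole new
# frame only every 1/24 s.

video_field_s = 1 / 59.94   # ~16.7 ms between fields
film_frame_s = 1 / 24       # ~41.7 ms between frames

print(f"video: new sample every {video_field_s * 1000:.1f} ms")
print(f"film:  new sample every {film_frame_s * 1000:.1f} ms")
print(f"video samples motion {film_frame_s / video_field_s:.1f}x as often")
# -> roughly 2.5x, the figure squeegee quotes above.
```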

I remember specifically that “Cheers” was shot on film to “give it a warmer mood”. The grainy quality conveyed a bar interior better than a more glossy tape image would.
Videotape also ages poorly. A taped show from 1971 looks very “dated”, while a high-quality film such as “The Godfather” still looks “modern” nearly 30 years later.

On the less technical, more visual side, squeegee pointed out the two main things that make video and film look different to the human eye: contrast and motion.

Because video has a much smaller contrast ratio than film, it makes things look more stark. Colors all look very primary. This actually adds to the ‘real world’ look that video has compared to film, by taking away a visual aesthetic and simplifying the picture.

The different frame rates make a huge difference. Because video is faster, all motion has a much less smooth, strobing kind of look to it. Again, this makes video look more ‘real’, because motion blur isn’t as prevalent in real life as it is on film.

Regardless of what some may claim, until HDTV arrives it is an indisputable fact that film provides a more professional (and more expensive) look than video.

A good way to judge the differences between them is to watch the (very few) shows that used both. Monty Python’s Flying Circus and Fawlty Towers both shot indoor scenes on video and outdoor ones on film, the reason being that it rains a lot in Britain and, back in the 70s, portable video equipment was very intolerant of moisture.

Another good example was The Larry Sanders Show on HBO. When we were supposed to be watching the fictional Larry Sanders talk show, we saw it through the point of view of the studio cameras (on video). When it was the show about the show, it was film.

Another thing to mention is a process called ‘Filmlook’, which makes video look more like film. Filmlook is actually only one of several companies that provide this service. Sometimes it’s very good (the kids’ show Beakman’s World looked perfect) and sometimes it’s not (The John Larroquette Show looked horrible).