The Straight Dope

  #1  
Old 06-16-2001, 09:38 PM
Earl Snake-Hips Tucker
Charter Member
 
Join Date: Jul 1999
Location: South Carolina
Posts: 12,611
Not the physical difference, but the "philosophical" difference.

Soaps are typically shot on videotape. Some comedy programs over the years have been as well.

What is the "artistic" reason (or whatever other reason) for choosing videotape as opposed to film for the medium?
  #2  
Old 06-16-2001, 10:09 PM
Gunslinger
Guest
 
Join Date: Jun 2000
Film seems to look a bit "cleaner" in the finished product.
  #3  
Old 06-16-2001, 10:09 PM
Mr. Blue Sky
Guest
 
Join Date: May 1999
Well, price is a major concern. Videotape is extremely cheap compared to film. Of course, there have been a lot of sitcoms shot on film (MASH, Cheers, Newhart, The Bob Newhart Show, etc.), but I don't think there's ever been a one-hour drama shot on video. Film is a LOT more flattering than video, and there's more flexibility in editing (excluding the newer trend of using computerized video editors).
  #4  
Old 06-16-2001, 10:36 PM
aramis
Guest
 
Join Date: May 2000
Film requires developing, which is expensive in its own right, but it also has the big disadvantage that if a scene doesn't look right on screen, or worse, some technical problem damages the film, it has to be re-shot the next day. With videotape, the raw footage can be examined immediately after the shot, and a re-shoot can be done right away if necessary, while the actors are still in costume and makeup and the set is still standing. Videotape is also easy to copy, whereas film requires specialized equipment.
  #5  
Old 06-16-2001, 10:42 PM
SmackFu
Guest
 
Join Date: Sep 2000
Another reason to shoot on film is that it is intrinsically a very high resolution medium. You always want your originals to be the highest quality possible.
  #6  
Old 06-16-2001, 11:56 PM
Doug Bowe
Charter Member
 
Join Date: Mar 1999
Location: El Paso, TX, USA
Posts: 2,745
Videotape has a cleaner picture on television. This should be no surprise, since videotape was meant for the medium.
In North America our TVs flash along electronically at 30 frames per second. Videotape records and plays back the pictures that way: 30 frames.
In North America most motion picture film moves at 24 frames per second. This makes for a mismatch when projecting films on TV. To compensate for the different speeds, projectors at television stations would hold something like the third or fourth frame for an extra electronic scan (this was in a textbook at college; do you really want to know?) to even the action out.
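The frame-holding trick described above is what's now called 3:2 pulldown: alternate film frames are held for two and three video fields each, so 24 film frames fill 60 fields (30 interlaced frames). A minimal Python sketch of the cadence (the function name is made up for illustration):

```python
def pulldown_fields(film_frames):
    """Expand film frames into interlaced video fields using the
    repeating 2-3 cadence: frame A gets 2 fields, frame B gets 3."""
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 2 if i % 2 == 0 else 3   # cadence 2,3,2,3,...
        fields.extend([frame] * copies)
    return fields

# One second of film (24 frames) becomes 60 fields = 30 video frames.
fields = pulldown_fields(list(range(24)))
print(len(fields))   # 60
```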

The untrained eye can tell the difference between videotape and film -- this is important for what comes up next.

After videotape equipment was introduced by the AMPEX corporation (I seem to recall the first public use of videotape was to rebroadcast Douglas Edwards and the News to the West Coast time zone) the soap operas jumped onto it. Videotape was custom made for TV, was cheap to use and looked as good as live broadcasts.
On the other hand motion picture film may have had a slightly degraded picture as compared to videotape, but it was still a FILM. You could see great plots and great actors and actresses on NBC's "Saturday Night At The Movies."
It didn't take long for a lot of the general public to associate the "look" of videotape (cleaner though it was) with low-budget material.
That perception might be a part of your philosophical question.
  #7  
Old 06-17-2001, 01:34 AM
MattTheCroc
Guest
 
Join Date: Jun 2000
I was watching "Escape from Mars" on cable and thought it looked very odd; very cheesy. I couldn't put my finger on it, but IMDB said it was shot on video tape...

What exactly would have led me to notice the difference?
  #8  
Old 06-17-2001, 02:18 AM
casdave
Member
 
Join Date: Mar 2000
Posts: 7,550
Possibly the answer has to do with the fact that when it was first introduced, video did not have as good a resolution as film but was much cheaper.

With film having an established history and tradition, I guess video was looked upon as a poor producer's medium, especially since it was used in 'low brow' productions such as soap operas and TV advertising, where low cost is important.

It is human nature to stack things hierarchically. Some might say snobbishly, but I think in this case there is an element of that.

It would be possible to manufacture video systems with a very much higher resolution. U-Matic is taken as a professional standard, and that has been around a good 15 to 20 years; in that time things have moved on enormously. We could do way better, but it would not be economic.

It will be interesting to see what happens if HDTV ever takes off.
  #9  
Old 06-17-2001, 09:05 AM
squeegee
Member
 
Join Date: Dec 2000
Location: Gilroy CA
Posts: 8,261
Quote:
Originally posted by casdave
Possibly the answer might be to do with the fact that when it was first introduced video did not have as good a resolution as film but was much cheaper

Actually, film has always had a much, much higher resolution than broadcast video. NTSC video has 525 lines of resolution (486 usable) vertically. The scanlines are typically thought of as being 720 pixels wide (again, this is NTSC; PAL/SECAM has slightly different [slightly better] resolution).

Film, being a chemical rather than an electronic medium, doesn't have resolution per se, but compositing for digital effects in movies is typically done at resolutions from 1000x1000 up to 4000x4000 pixels, depending on how finicky the shot is. 2000^2 pixels is a reasonable definition of 'film res'.

Other differences between film and video:

- Film has a much better contrast ratio than video, i.e., the number of gradations between full-black and full-white is much greater, and blacks are indeed black, where on video they're a deep charcoal grey. This makes film look more 'vivid', even when film is transferred to video. This is why most (non-soap-opera) television programs are shot on film, then transferred (telecine'd) to video for broadcast.

- Video has much better motion reproduction than film. Film is shot at 24 frames per second, where NTSC video is 29.97 (not 30) fps. But the video fps number is misleading: since video is actually interlaced, not full-frame, the actual field rate for video is 59.94 per second, so motion in video is redrawn nearly 2.5 times faster than on film.

Some of these differences will be reduced when HD is popular. The 1080p HD format approaches film resolution, although film will probably always have a better contrast ratio than videotape. HD can also optionally run at 24fps or at video frame rates.
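Plugging squeegee's own figures in gives a feel for the gap; a back-of-the-envelope Python check (using his 2000x2000 working definition of 'film res'):

```python
ntsc_pixels = 720 * 486      # usable NTSC raster, per the figures above
film_pixels = 2000 * 2000    # rough 'film res' used for effects work

print(ntsc_pixels)                       # 349920
print(round(film_pixels / ntsc_pixels))  # film carries ~11x the detail

print(round(59.94 / 24, 2))  # field rate vs film rate: ~2.5x the motion updates
```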
  #10  
Old 06-17-2001, 09:18 AM
handy
BANNED
 
Join Date: Mar 1999
Location: Pacific Grove, Calif
Posts: 17,493
The OP asked for the "artistic" reason; he doesn't want the physical reasons....

e.g. using film there is more control over the dramatic emotional impact.
  #11  
Old 06-17-2001, 11:59 AM
Carnac the Magnificent!
Guest
 
Join Date: Jan 2001
Interesting response, but this non-techie finds film so much more pleasing to the eye: the color saturation, the sheer quality of the image beats videotape any day.

That said, I've seen old Super Bowl games that I'm guessing were shot on film. They didn't look good compared with today's live broadcasts. Watching an old Steelers Super Bowl game looks like something from the Paleolithic era.
  #12  
Old 06-17-2001, 12:23 PM
Johnny L.A.
Charter Member
 
Join Date: Jan 2000
Location: NoWA
Posts: 49,049
I was going to mention contrast, but squeegee beat me to it.

About video frame rate (and I'll use 30 fps since it's easier to use than 29.97): A television screen in the U.S. operates on 60 Hz alternating current. The frame rate could be 60 fps, but only half of the lines are scanned at a time. That is, the tube "draws" lines 1, 3, 5, 7 and so on, then goes back and draws lines 2, 4, 6, 8 and so on. That makes a frame rate of 30 fps, not 60. European televisions operate at 50 Hz and the standard film frame rate there is 25 fps; so it's convenient for them that 25 goes evenly into 50, as opposed to 24 going into 60.

As other posters mentioned, video seems "flat". Film has "grain" which is formed by the chemicals making up the emulsion. There are "fast" films (which need less light to make an image) and "slow" films (which need more light to make an image). Fast films have more grain than slow films, which can change the "feel" of a film. If you want a "gritty" look, use fast film.

Video can be used effectively. Lars von Trier has been using digital video with much success. Some films are shot on video and then transferred to film for distribution. This gives you the inexpensive production costs of video with the grain of film. I've seen footage shot on DV that was transferred to 35mm film and it looks great! But it's expensive to get the best product. The savings gained by using video can be lost in the transfer process.

That being said, the cost of film stock is usually not that great compared to the cost of the overall production. Of course if you have more money, you're more likely to shoot more takes to make sure you have several to choose from. And there is the cost of developing, duplication etc. But when you have a $20 million budget, the film stock is not as great a concern as when you have a $40,000 budget.
__________________
'Never say "no" to adventure. Always say "yes". Otherwise you'll lead a very dull life.' -- Commander Caractacus Pott, R.N. (Retired)

'Do not act incautiously when confronting a little bald wrinkly smiling man.' -- Lu-Tze
  #13  
Old 06-17-2001, 01:08 PM
Doug Bowe
Charter Member
 
Join Date: Mar 1999
Location: El Paso, TX, USA
Posts: 2,745
Quote:
Originally posted by Country Squire
...I've seen old Super Bowl games that I'm guessing were shot on film. Didn't look good compared with today's live broadcasts. Watching an old Steelers Super Bowl game looks like something from the Paleolithic era.

--It's likely that the film was 16 mm. You could tell the difference in the film stock used at that time.
Modern film apparently is better. "Dr. Quinn, Medicine Woman" was filmed on 16 mm stock (as are many commercials). The studio said that film allowed for more flexible editing and the appearance was good enough for a standard television picture. They did admit that with the advent of HDTV the use of 16 mm film would become too apparent to the average viewer.
  #14  
Old 06-17-2001, 01:37 PM
Cartooniverse
Charter Member
 
Join Date: Oct 1999
Location: Betwixt My Ears
Posts: 11,271
Quote:
Originally posted by Mjollnir
Not the physical difference, but the "philosophical" difference.
Soaps are typically shot on video tape. Some comedy programs over the years also.
What it the "artistic" reason (or whatever other reason) for choosing videotape as opposed to film for the medium.
First, to answer the O.P.: part of it is pure budgeting, part is aesthetics. It is actually easier and faster to shoot on location with film. Instead of doing an entire remote set-up each time you get to a location, your gear is confined to the hand carts and dolly/crane/jib/Steadicam being used for that set-up. Live remote video jobs are by and large a huge pain in the ass. As for costs, it's not just the cost of 4 minutes of 35mm film (400 feet) as opposed to 11 minutes of 16mm film (400 feet) as opposed to 4 OR 11 minutes of videotape stock. It's the support systems needed for each choice. You need more electronic and maintenance support for a large video production than you do for a film shoot.
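Those 4-and-11-minute figures for a 400-foot load check out against the standard frame counts per foot of stock (16 frames/ft for 35mm, 40 frames/ft for 16mm, at 24 fps sound speed); a quick Python sanity check:

```python
def minutes_per_load(feet, frames_per_foot, fps=24):
    """Running time, in minutes, of a roll of film at sound speed."""
    return feet * frames_per_foot / fps / 60

print(round(minutes_per_load(400, 16), 1))   # 35mm: ~4.4 minutes
print(round(minutes_per_load(400, 40), 1))   # 16mm: ~11.1 minutes
```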

I've done Sex and the City. It's a fairly large, very well funded show. It's a hit. They shoot it in 16mm. The cameras are lighter, and you don't have to reload as often (see above). You can move faster and do more work per day with a film unit, IMHO. While live television may generate a few hours of material straight in the case of a sports event or live awards show, MOST of the time the set-up and tear-down time is incredible compared to the wrap time on a set, even a large TV or feature set.

The negative is transferred directly into a computer editing system. Work prints are not struck; those went the way of the dodo bird at least ten years ago. (Let's not get into student films or documentaries here. I am well aware that one can still cut on film.) In fact, what Mr Blue Sky said here
Quote:
there's more flexibilty in editing (excluding the newer trend of using computerized video editors).
is exactly the opposite of the truth. Virtually all professional jobs are cut on computer these days. Flexibility in film editing? Steenbeck and KEM and Moviola are about it for film. Every Tom, Dick and Harry makes a computer-based editor now.

Now we get into the debate of artistic merits. It's a tougher debate now that High Definition shows are being produced on at least a semi-regular basis. I'm more fond of the aspect ratio of Hi-Def than I am of the look. The depth of field is funky, and it's merciless on focus. I prefer film on a personal level because it is- beginning to end- an organic process.

You shoot a living thing (forget dreck like "Shrek" for the moment, mkay?) in front of lights, and capture the images on a light-sensitive material. Use chemicals, make a medium through which you pass light in order to view those same images. It's satisfying on many levels. The resolving capabilities of film are still daunting. It also just has a different taste to it. I've used a lot of different filtration, adjustments, skin-level tweaking, etc., all to try to get the film look. There is a device that "emulsifies" videotape shots; it adds in the grain patterning inherent in film. Or, at least, inherent in older film. Now the film stock is a lot sharper than it used to be.

Videotape, or digital storage and recording, removes the organics. To me, it's MUCH harder to light someone for videotape and make it look like something other than soap opera shit than it is for film. Videotape doesn't have a sense of depth the way film does (subjectively speaking again). While it is true that one can shoot DigiBeta, or another digital medium, then cut on computer, then master it to a terabyte disk and video-project it and NEVER LOSE A GENERATION from the original shoot day, the overall quality is still lower than that of film.

I don't mean to be a "film snob" here, it's just how I feel. Lighting Directors and D.P.s who can do wonders with video shows have my utmost respect. Those shows are murder. Adding just a little taste of a light here and there is a skill, just as operating a shot well is a skill.

( Shit, catering so that the jalapeno poppers are fresh when we break is a skill........ )

Johnny LA is right, tape-to-film costs are brutal as hell. It's an interesting look, though... and for some, a choice made for purely aesthetic and not budgetary reasons.

Cartooniverse
__________________
If you want to kiss the sky you'd better learn how to kneel.
  #15  
Old 06-17-2001, 01:54 PM
AHunter3
Charter Member
 
Join Date: Mar 1999
Location: NY (Manhattan) NY USA
Posts: 16,298
[hijack]
Doug Bowe:

Quote:
In North America our TV's flash along electronically at 30 frames per second. Video tape records and plays back the pictures that way--30 frames. In North America most motion picture film moves at 24 frames per second. This makes for a mismatch when projecting films on TV. To compensate for the different speeds, projectors at television stations would hold something like the third or fourth frame for an extra electronic scan
OK, question: you know those OLD films from back in the silent era? How, when we see clips from them broadcast on TV or embedded as footage within modern film, everything is always speeded up so adults walk along with the rapid jerky motion of toddlers and so on?

It was explained to me at one point that, no, it wasn't the fad back then to film things in such a silly way, it was because those old old films were shot at fewer frames per second, so when they show them (or clips from them) on modern equipment, you see it speeded up as described.

And I thought, at the time, "That's stupid, why don't they adjust the frame rate so it looks normal when being shown on modern equipment?" ...but then I figured maybe that was more difficult than I realized.

Now, in light of what you say they do when showing film on a television station, I have to ask what I originally thought: Why the bloody hell do we always see those old old films being shown at 4/3 or 3/2 the speed at which they were originally filmed?

[/hijack]
__________________
Disable Similes in this Post
  #16  
Old 06-17-2001, 02:09 PM
Wumpus
Guest
 
Join Date: Jun 1999
AHunter 3:

Cecil addressed the speeded-up silent movie question in

http://www.straightdope.com/classics/a4_067.html

The main reason most old silents and newsreels are shown at the wrong speed is that people don't have the proper equipment to show them at the correct speed, and (as Cecil explains) adding extra frames to the film so it can be shown correctly on standard equipment is an expensive PITA.

Of course, this may change in the digital era. Adding the extra frames needed could now be done at the touch of a button, if you were creating a new digital master.

On the other hand, by now most people believe that old movies are *supposed* to look jerky and sped up. So don't expect a rush to fix the problem.
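The 4/3 and 3/2 speed-ups AHunter3 asks about fall straight out of the arithmetic: silent cameras were commonly cranked at around 16 to 18 fps, and sound projectors run at 24. A short Python illustration:

```python
from fractions import Fraction

PROJECTOR_FPS = 24
for camera_fps in (16, 18):
    speedup = Fraction(PROJECTOR_FPS, camera_fps)
    print(f"shot at {camera_fps} fps -> plays back {speedup}x too fast")
# shot at 16 fps -> plays back 3/2x too fast
# shot at 18 fps -> plays back 4/3x too fast
```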
  #17  
Old 06-17-2001, 02:15 PM
casdave
Member
 
Join Date: Mar 2000
Posts: 7,550
Cartooniverse

The 'depth' of a moving visual image is related to the range of frequencies available to carry all the visual information.

It's the equivalent in audio terms of dynamic range, or maybe in simpler terms like comparing AM with FM broadcasts.

The reason that the range of information on electronic video is restricted is to reduce the transmission bandwidth, since video was primarily aimed at TV.

You could make an electronic system that would be suitable for video mastering, with a much greater bandwidth, but it would be very specialised and so cost a fortune.
  #18  
Old 06-17-2001, 02:49 PM
squeegee
Member
 
Join Date: Dec 2000
Location: Gilroy CA
Posts: 8,261
Johnny LA:

About video frame rate (and I'll use 30fps since it's easier to use than 29.97): A television screen in the U.S. operates on 60hz alternating current. The frame rate should be 60fps. but only half of the lines are scanned at a time. That is, the tube "draws" lines 1, 3, 5, 7 and so on, then goes back and draws lines 2, 4, 6, 8 and so on. That makes a frame rate of 30fps, not 60.

We're both saying almost the same thing.

Yes, it's true that you can think of the video frame rate as 30 [or 29.97] fps, but remember: half the scanlines are offset in time by 1/2 of a frame time or 1/59.94th of a second.

So, while the theoretical frame rate for detail is 30fps, the amount of motion seen is effectively 60fps [or 59.94].

To contrast this with film: there are no scanlines in film and no fields; the entire frame appears at once on the screen. So a film frame rate of 24fps is really 24fps.
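One way to see the difference is to list when each new picture hits the screen over the first few hundredths of a second; a small Python sketch, assuming the exact NTSC field rate of 60000/1001 (~59.94):

```python
FILM_FPS = 24
FIELD_RATE = 60000 / 1001   # exact NTSC field rate, ~59.94 per second

# Times (in milliseconds) at which new pictures appear on screen.
film_ms = [round(n / FILM_FPS * 1000, 1) for n in range(3)]
field_ms = [round(n / FIELD_RATE * 1000, 1) for n in range(6)]

print(film_ms)    # a new film frame only every ~41.7 ms
print(field_ms)   # a new video field every ~16.7 ms
```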
  #19  
Old 06-17-2001, 03:49 PM
jaimest
Guest
 
Join Date: Dec 2000
I remember specifically that "Cheers" was shot on film to "give it a warmer mood". The grainy quality conveyed a bar interior better than a glossier tape image.
Videotape also ages poorly. A taped show from 1971 looks very "dated", while a high-quality film such as "The Godfather" still looks "modern" nearly 30 years later.
  #20  
Old 06-17-2001, 03:50 PM
Hail Ants
Charter Member
 
Join Date: Jan 2000
Location: NY USA
Posts: 5,474
On the less technical, more visual side squeegee pointed out the two main things which make video and film look different to the human eye: contrast and motion.

Because video has such a smaller contrast ratio than film, it makes things look more stark. Colors all look very primary. This actually adds to the 'real world' look that video has compared to film, by taking away a visual aesthetic and simplifying the picture.

The different frame rates make a huge difference. Because video is faster, all motion has a much less smooth, more strobing kind of look to it. Again, this makes video look more 'real', because motion blur isn't as prevalent in real life as it is on film.

Regardless of what some may claim, until HDTV arrives it is an indisputable fact that film provides a more professional (and more expensive) look than video.

A good way to judge the differences between them is to watch the (very few) shows that used both. Monty Python's Flying Circus and Fawlty Towers both always shot indoor scenes on video and outdoor ones on film, the reason being that it rains a lot in Britain and back in the 70s portable video equipment was very intolerant of moisture.

Another good example was The Larry Sanders Show on HBO. When we were supposed to be watching the fictional Larry Sanders talk show we saw it through the point of view of the studio cameras (on video). When it was the show about the show it was film.

Another thing to mention is a process called 'Filmlook' which makes video look more like film. Filmlook is actually only one of several companies that provide this service. Sometimes it's very good (the kids' show Beakman's World looked perfect) and sometimes it's not (The John Larroquette Show looked horrible).
  #21  
Old 06-17-2001, 03:57 PM
Hail Ants
Charter Member
 
Join Date: Jan 2000
Location: NY USA
Posts: 5,474
Quote:
Videotape also ages poorly. A taped show from 1971 looks very "dated", while a high quality film such as the "Godfather" still looks "modern" nearly 30 years later
It's not that the physical tape itself has deteriorated, but that video recording technology was much worse in the past.
  #22  
Old 06-17-2001, 04:12 PM
Johnny L.A.
Charter Member
 
Join Date: Jan 2000
Location: NoWA
Posts: 49,049
Quote:
The reason being that it rains alot in Britain and back in the 70s portable video equipment was very intolerant of moisture.
I've often wondered why they did that. I was thinking of posting it as a GQ.
  #23  
Old 06-17-2001, 04:18 PM
KneadToKnow
Voodoo Adult (Slight Return)
Charter Member
 
Join Date: Jul 2000
Location: Charlotte, NC, USA
Posts: 24,087
Quote:
Originally posted by Hail Ants
Another good example was The Larry Sanders Show on HBO. When we were supposed to be watching the fictional Larry Sanders talk show we saw it through the point of view of the studio cameras (on video). When it was the show about the show it was film.
I can't verify the technical accuracy of this, but I can definitely say that whatever they did worked. You could always tell in an instant when the "scene" shifted from "on air" to "off air."

God, I loved that show.
__________________
Did you see that ludicrous display last night?
  #24  
Old 06-17-2001, 07:02 PM
Doug Bowe
Charter Member
 
Join Date: Mar 1999
Location: El Paso, TX, USA
Posts: 2,745
Hail Ants...Help me fight (my) ignorance.
I was told that the shifts from video to film on Monty Python were less noticeable when viewed as they were meant to be viewed, on British TV.
Reason?
European TV runs at 25 fps (something to do with electricity there being at 50 cycles vs. North America's 60 cycles) and sound film in Europe was also shot at 25 fps.
Is there any truth to this assertion?
  #25  
Old 06-17-2001, 08:05 PM
beagledave
Guest
 
Join Date: May 1999
I vaguely recall hearing that Hill Street Blues used to "grind" the film between plates of glass to give it an even "grainier" or grittier look.

Anybody know if this is true or not?
  #26  
Old 06-17-2001, 09:08 PM
wolfman
Guest
 
Join Date: Mar 2000
The difference in quality is quickly shrinking. A lot of the difference in feel between film and video is that the people who chose video chose it precisely for the qualities other people disliked: they liked the harsh sharpness of the picture. Other people saw that and assumed that was the nature of video (and up until recently it probably was). But when Lucas shot a scene for Episode I in HD, and decided it was what he wanted to shoot all of Episode II in, people started to take HD seriously. 24p eliminates the harshness and naturally gives a feel much closer to film. As cinematographers learn to master the medium with mist and FX filters and lighting in general to perfect the feel they want, and as the technology continues to advance, I think we are in the last stages of film as the dominant medium for commercial moviemaking. Especially as corporate bean counters start to realize that a $90,000 HD camera is comparable to a $600,000 35mm camera. Not to mention the production cost differences others have mentioned.
__________________
Please, gentlemen. We must put an end to the bloodshed. We have all seen too many bodybags and ballsacks.
~~~Head of Henry Kissenger
  #27  
Old 06-18-2001, 12:31 AM
casdave
Member
 
Join Date: Mar 2000
Posts: 7,550
Doug Bowe

Our mains frequency is 50 Hz, and so our TV frame rate is correspondingly 25 Hz.

One day we may get HDTV with its much higher refresh rates (it has to be digital to get the bandwidth, similar to a monitor), but I ain't holding my breath.
  #28  
Old 06-19-2001, 07:13 PM
Hail Ants
Charter Member
 
Join Date: Jan 2000
Location: NY USA
Posts: 5,474
KneadToKnow:
Quote:
I can't verify the technical accuracy of this, but I can definitely say that whatever they did worked. You could always tell in an instant when the "scene" shifted from "on air" to "off air."
To be honest I can't verify it either. I'm not in the industry or anything, I'm just an enthusiast who watched the show. But I would bet my life on it.

It doesn't seem that easy to most people, but for me, distinguishing between video and film is almost as easy as between B&W and color. I look at the footage and it's just blatantly obvious (barring any technical fiddling like FilmLook).

Another example of a 'dual-shot' show is Bob Newhart's Newhart, the 80s series set in a bed & breakfast. The first season of the show was shot on video and looks amazingly cheap compared to the filmed version.

Also the UK show Dr. Who (the Tom Baker one anyway) used the video indoors/film outdoors method.
  #29  
Old 06-19-2001, 09:04 PM
jab1
Guest
 
Join Date: Sep 1999
So did Hitchhiker's Guide to the Galaxy, though they didn't do much outdoor work.
  #30  
Old 06-20-2001, 03:05 AM
DVous Means
Guest
 
Join Date: Mar 2000
Isn't the demarcation between film and video extremely blurred these days? Especially with all the computerised special effects that are inserted into many movies now. These effects require film shots to be digitised, manipulated, edited, and streamed back onto film as analogue data.

In a way, it's like the debate that occurred amongst audiophiles when CDs were first launched onto the popular market. There were many "golden ears" who claimed that they could hear the difference, but what they forgot was that many of their analogue sound sources, such as LPs, were in fact recorded and produced in digital form before being converted to analogue for publication.

What I am suggesting is a similar argument re the OP. How would you know just what bits of a blockbuster like "Pearl Harbor" were shot on film, compared to what has been digitally enhanced along the way?
__________________
Knock softly but firmly, 'cause I like soft firm knockers...
  #31  
Old 06-21-2001, 01:33 PM
jab1
Guest
 
Join Date: Sep 1999
Well, when people ask "Film or videotape?" they want to know what was the original medium, not the final product.

As for not being able to tell if you're looking at original film or digital when at the theater, that's kind of the whole point. They don't want you to notice. It's only when you see something that simply could not have been done live that you can confidently say "Hey, that was CGI!" like the blowing up of the Arizona.

One of the best examples of the subtle use of CGI is in Erin Brockovich. The film opens with Erin's car wreck. They show Julia Roberts get into her car and drive off and hit another car in a single, uncut take. Unless Julia's learned stunt driving on her days off, it must've been some kind of CGI effect, wherein they substitute her car for a digital replica just before the crash. Or they digitally splice together two strips of film, one with Julia driving the car and one with stunt drivers doing the wreck. And there is a credit for Visual Effects at the end of the film.

BTW: A brand-new series that uses both video and film is the ABC series The Beast, a drama about an MSNBC-like news channel. (The conceit is that the behind-the-camera scenes are on live 24/7, either on cable or on the Web.) All behind-the-camera scenes are shot on film, all on-camera stuff is done on video.
  #32  
Old 06-21-2001, 05:25 PM
bughunter
Guest
 
Join Date: Apr 2001
Many of the things that have been said already about "texture" and "depth" and "dynamic range" and the general look and feel of videotape vs. film are true because of fundamental differences in the way that the two recording media respond to photons.

Light interacts much differently with film emulsion than with the silicon chips inside video cameras. I will attempt to give the details in everyday language without corrupting the truth... but I'm sure if I screw up someone will correct me.

I work with these silicon chips every day. They're called Charge Coupled Devices, or CCDs for short. Professional video cameras all use three of these, one each for Red, Green and Blue. Each one is an array of electron buckets that collect electrons generated by the light from the scene as it is absorbed in the silicon CCD. After waiting to collect enough electrons (typically 1/60th of a second for a video field) each bucket is read out serially to form a string of numbers that can be arranged in order again to reconstruct the scene, one image for each color. These monochrome images are then combined to make a full-color image.

In film, a photosensitive chemical that has been deposited in microscopic "grains" on a plastic substrate is exposed to the scene. As the light strikes these grains, the chemical properties of each grain are changed to reflect the intensity of the color of light to which it is sensitive at the place on the scene where the grain sits. Color film uses three different chemicals, each sensitive to a different color of light. Once exposed, the film is developed by bathing it in a solution that causes the grains that have been altered to change color, and then "fixed" in another solution that renders the emulsion insensitive.

These are fundamentally different processes at the point where the information from the scene stops being light and starts being something else... a recording. This results in differences in their responses to detail, color and intensity.

Intensity first; it's the easiest. Silicon CCDs respond linearly to light intensity. This means that in a properly lit scene, each photon that comes in creates the same amount of charge, and so is counted equally. But emulsions don't work that way. Their response is logarithmic - the first few photons that hit a grain cause much more of a change in the emulsion than the last few.

As a result, the average effect of a single photon on a dimly-lit grain is much greater than the average effect of a photon on a well-lit grain, which also makes it a bit harder to saturate a grain. Thus, film possesses a much, much better dynamic range -- the ability to capture detail in both shadows and bright highlights. A part of a scene that would look completely dark or washed out on video will still contain detail on film. This gives directors far more artistic options and control over how a scene looks, and makes many more sources of natural lighting feasible. Using this dynamic range, a director can give an impression of depth to a scene, or communicate mood.
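Here's a toy model of the two responses in Python. The constants are made up and real emulsion curves are more complicated, but the shape of the difference comes through:

```python
import math

def ccd_response(photons, full_well=10_000):
    """Linear: every photon adds the same charge until the bucket saturates."""
    return min(photons, full_well) / full_well

def film_response(photons, speed=0.001):
    """Toy saturating curve: early photons matter far more than late ones."""
    return 1.0 - math.exp(-speed * photons)

# Deep shadow, 100 photons: film has already recorded something visible,
# while the CCD pixel is still nearly empty.
print(ccd_response(100))   # 0.01
print(film_response(100))  # ~0.095

# Bright highlight, 20,000 photons: the CCD bucket has clipped hard at 1.0,
# but the film grain is still (barely) short of full saturation.
print(ccd_response(20_000))   # 1.0
print(film_response(20_000))  # just under 1.0
```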

Cartooniverse touched on this when he described how difficult video shoots are to light. Since details in bright areas and dark areas are lost, scenes must be more uniformly lit when shot on video. And this frequently makes a scene look "flat," or "cold." Mood lighting on video is tough, and effects like backlighting and lens flares are difficult, if not impossible, to pull off aesthetically.

Color: In emulsions, the color you see is a property of the emulsion. The emulsion is sensitive to a range of colors, but those are not necessarily the exact colors the emulsion will take on when developed. A good example of this is the old specialty cinema stocks of the Sixties and Seventies (the names elude me now... Technicolor, etc. ???). These had some not-so-subtle departures in color registration that made for interesting aesthetic effects, like enhancing a starlet's blue eyes or red lipstick. These emulsion tricks are still used today, although much more subtly. Film stocks are chosen for the kind of lighting and the mood the director wishes to evoke. Go take a look at Three Kings and notice how different the hues and saturation are from something like Saving Private Ryan, for instance.

Now in video cameras, the light is filtered before it reaches the CCD. There's one CCD with an appropriate filter for blue light, and one each for Red and Green, also. But once the light is past the filter, the CCD really doesn't care what color it is. And once it becomes digitized, that information is just a number - it has no inherent color. The editor can make that blue channel represent any color he wants on screen. (Of course, an editor could do that with film after it's digitized, too, but the emulsion has already done its job "interpreting" color. Digitized film still carries the influence of the emulsion.)
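The "it's just numbers" point in a few lines of Python, with a hypothetical two-pixel image:

```python
# A pixel is just three numbers (R, G, B). Nothing about the blue
# channel's data says "blue" -- only the display assignment does.
pixels = [(10, 20, 200), (0, 0, 255)]  # two bluish pixels

# The editor can decide the third channel drives the red gun instead:
remapped = [(b, g, r) for (r, g, b) in pixels]
print(remapped)  # [(200, 20, 10), (255, 0, 0)] -- the same data, now reddish
```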

Another important difference in color is that videotape only uses three numbers to represent color. This limits the number of different colors that can be represented. On a CIE color map, which is sort of a rounded triangle shape, RGB or any other three-number color scheme cannot cover the entire map. Some of the extreme colors, like deep reddish and purplish blacks, just cannot be represented. They cannot even be represented with digitized film. The only way to reproduce these colors is to use an analog medium from recording to screening. (Or to change the entire video industry to a four-color format!)
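A crude sketch of why three numbers have hard edges. Real gamut mapping is far more sophisticated than a simple clamp, but the information loss is the same in spirit:

```python
def clamp_to_gamut(rgb, lo=0.0, hi=1.0):
    """Colors outside the representable cube collapse onto its surface."""
    return tuple(max(lo, min(hi, c)) for c in rgb)

# Two different out-of-gamut colors become the same in-gamut color:
a = clamp_to_gamut((1.3, -0.1, 0.5))
b = clamp_to_gamut((1.9, -0.4, 0.5))
print(a == b)  # True -- the distinction between them is unrecoverable
```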

Finally: detail. The standard NTSC broadcast format is equivalent to 460x480 pixels. VHS, DVD, and other formats have slightly different resolutions, but essentially the same. This makes for a total of about 0.25 million pixels. On film, where the number of grains on a frame varies according to their size (which determines the speed of the film, i.e., its sensitivity to light), there can be as many as 10 million individual grains per frame, or more. (This assumes a 10 micron grain size on a 32 mm square frame. I couldn't find any manufacturer numbers on grain counts.)
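The back-of-the-envelope arithmetic, using my assumed grain size and frame size from above (these are my rough figures, not manufacturer specs):

```python
# NTSC-ish video frame:
video_pixels = 460 * 480          # about a quarter million pixels

# Film frame: assumed 10-micron grains on an assumed 32 mm square frame:
grains_per_side = 32_000 // 10    # 32 mm = 32,000 microns
film_grains = grains_per_side ** 2

print(video_pixels)                 # 220800
print(film_grains)                  # 10240000 -- about 10 million
print(film_grains // video_pixels)  # 46 -- "40 times," give or take
```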

So, there's 40 times as much information recorded on film? No, not exactly. Film is a lot noisier: grains are randomly shaped, randomly sized (within a range) and randomly arranged. This creates a lot of random noise that requires oversampling to eliminate. But sometimes this random noise is actually desired and used for effect -- it's called "graininess" -- and a much faster film with larger grains is selected to get it.

But video isn't immune from noise, no indeed. In fact, video is susceptible to worse noise problems than film. Film's noise is random, and therefore easier to hide and more tolerable. Video noise is usually filled with lots of contrast and jagged edges that seldom have any aesthetic appeal, and can be impossible to hide.

Have you ever seen someone wearing a small-print plaid jacket on the news? Notice how the pattern goes wild on your screen? And gyrates crazily when the person moves or the camera pans or zooms? That's called aliasing. It happens because the pixels on a CCD camera are arranged in a perfectly rectangular array. If you attempt to use such an array to image another rectangular array, the recording you get will not necessarily be what you expected. The mathematics are a bit beyond the scope of my article here... just take it on faith. I can say that the same math is responsible for making wagon wheels look like they're spinning backwards, except in that case, it's because you're sampling regular intervals of time, not space.
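The wagon-wheel version of that math is easy to simulate: sample a rotation at regular frame intervals, and any real motion beyond half a turn per frame gets folded back and appears to run the other way.

```python
def apparent_step(deg_per_frame):
    """Angle change the eye infers per frame, folded into [-180, 180)."""
    return (deg_per_frame + 180) % 360 - 180

# Wheel actually turning 350 degrees between frames:
print(apparent_step(350))  # -10 -- it looks like it's rotating backwards

# A slow wheel (10 degrees per frame) is below the "Nyquist" limit
# of 180 degrees per frame, so it looks correct:
print(apparent_step(10))   # 10
```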

This same aliasing is also responsible for "video crawl" - ever notice how sometimes the vertical edges of things on screen have a sort of crawling marquee appearance? Same deal. And you don't see it on film. In fact, when film is sampled for digital editing and videotape, special processing is included to prevent aliasing from ruining the advantage of using film in the first place.

The only time I can remember aliasing being used on purpose is on the original Star Trek. Lt. Uhura had a bizarre, spidery-looking screen at her console that rotated and produced hypnotic patterns. These were moiré patterns created by two rotating filters with different pitch screens. The potential uses of aliasing as a cinematography tool are very few; usually its effects are just ugly.

I'm not certain, but I suspect that some video cameras intentionally oversample to help minimize the instances of aliasing.
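If they do, the idea would look something like this one-dimensional sketch: sample the scene finer than the output grid, then average, so detail above the pixel pitch mostly cancels out instead of aliasing into a bogus pattern. (The 0.9-cycle "plaid" and the 8x factor are just numbers I picked for the demo.)

```python
import math

def stripes(x, freq=0.9):
    """Fine stripes at 0.9 cycles per output pixel -- above the 0.5 Nyquist limit."""
    return math.sin(2 * math.pi * freq * x)

n = 64
# Naive point sampling: the 0.9-cycle stripes alias into a bogus slow
# 0.1-cycle wave at nearly full strength -- the "plaid jacket" effect.
naive = [stripes(i) for i in range(n)]

# 8x oversample then box-average: the stripes largely cancel within each
# pixel, so the bogus wave is strongly attenuated.
oversampled = [sum(stripes(i + k / 8) for k in range(8)) / 8 for i in range(n)]

print(max(abs(v) for v in naive))        # ~0.95
print(max(abs(v) for v in oversampled))  # ~0.11
```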

One more interesting note: a colleague of mine worked at a place where she was designing a cinematic CCD chip for 16:9 format cameras that the parent company wanted to sell to the studios exploring digital cinema. They were well along in the design, and were testing the readout speeds, when they found out their chip was TOO good. They had assumed that higher resolution was better, so they made their chip something like 4000 x 2250 pixels: 9 million pixels! But the studios they tried to sell this camera to weren't interested. The studios' market research had determined that the average movie audience couldn't distinguish anything much better than 1024 x 768 pixels, so they didn't want to burden themselves with all the extra memory and processing power necessary to handle 9 megapixels when 1 megapixel would do. So my friend's cinema video chip was sidelined.
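For what it's worth, the pixel counts in that story work out like this (using the figures as they were told to me):

```python
cinema_chip = 4000 * 2250    # my colleague's design
audience_limit = 1024 * 768  # what the studios claimed viewers can resolve

print(cinema_chip)     # 9000000 -- the "9 megapixel" chip
print(audience_limit)  # 786432 -- under a megapixel
print(cinema_chip // audience_limit)  # 11 -- over ten times "good enough"
```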

Ugh! When I heard this, I stopped looking forward to digital cinema, and began dreading it.

And don't even get me started on compression. This ain't the Pit -- I might get in trouble.

Ugh - I just previewed this... Longest. Post. Ever!
__________________
"Unchecked right-wing media power means that in the United States today, no issue can be honestly debated and no election can be fairly decided." -- David Brock, former conservative journalist and "right-wing hitman," author of Blinded by the Right: The Conscience of an Ex-Conservative
Reply With Quote