Videotape vs Film

I understand the technology involved with videotape vs. film, but could someone please explain how our eyes can IMMEDIATELY recognize the difference (especially when both are used on broadcast TV)?

Example: My eye can immediately distinguish a videotaped sitcom (Golden Girls) from a filmed sitcom (Mary Tyler Moore). I can only express the difference in very subjective terms (videotape looks “sharper” somehow, but film has more “depth”???).

The “Resolution” of my TV remains the same, right?

The resolution of your TV stays the same; it’s about the different ways film and video display contrast. Video has higher contrast than film, which is what makes it look sharper.

Not part of your original question, but I’m not sure Golden Girls is videotaped, actually. Most sitcoms are shot on film, then transferred to video for broadcast distribution. Shooting on film preserves the material in an easy-to-use format in case they want to distribute it internationally where TV systems are different.
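To make the contrast point above a bit more concrete, here’s a toy sketch (my own illustration — the curve shapes and numbers are made up, not taken from any real camera or film stock). The idea is that film responds to light along an S-shaped curve with a soft toe and shoulder, while video behaves more linearly and then clips, so near-blacks and near-whites hit the limits sooner:

```python
import numpy as np

def film_response(exposure):
    """Illustrative film-style S-curve: a soft 'toe' in the shadows and a
    gentle 'shoulder' that rolls the highlights off gradually."""
    # A logistic curve is only a stand-in for a real film characteristic curve.
    return 1.0 / (1.0 + np.exp(-6.0 * (exposure - 0.5)))

def video_response(exposure):
    """Illustrative video-style transfer: roughly linear, then hard clipping.
    Real cameras apply a gamma curve, but the abrupt clip is the point here."""
    return np.clip(1.4 * exposure - 0.1, 0.0, 1.0)

scene = np.linspace(0.0, 1.0, 11)   # scene brightness from black to white
print("scene  film   video")
for x in scene:
    print(f"{x:4.1f}  {film_response(x):5.2f}  {video_response(x):5.2f}")
```

In the clipped curve, the brightest tones all land on the same value, which reads as “harder” contrast; the S-curve keeps separating tones right up to the top, which is at least one plausible source of that “depth” feeling.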

Also, doesn’t film have more of a distinctive “grain” than videotape? Grain being imperfections in the chemicals or chemical reaction that creates the image (just a definition off the top of my head)?

That should probably be more like “Grain being the visual result of imperfections in the chemical. . . .”
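If you want a rough feel for what grain adds, you can fake it. The sketch below is purely illustrative (the noise strength and the mid-tone weighting are my own assumptions): it just sprinkles random speckle over a synthetic grey ramp, regenerated independently for every frame, which is roughly how grain differs from the steadier electronic noise you get from video:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_fake_grain(image, strength=0.05):
    """Add a crude film-grain-like speckle: random noise, heaviest in the
    mid-tones, regenerated independently for every frame."""
    midtone_weight = 4.0 * image * (1.0 - image)   # peaks at 50% grey
    noise = rng.normal(0.0, strength, image.shape)
    return np.clip(image + noise * midtone_weight, 0.0, 1.0)

frame = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))  # stand-in 8x8 grey ramp
grainy = add_fake_grain(frame)
print(np.round(grainy - frame, 3))  # per-pixel 'grain', strongest mid-ramp
```

On a paused frame the speckle just looks like noise; in motion, the fact that it changes completely from one frame to the next is a big part of the film texture.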

I’ve always wondered this too…and film school never gave a great answer. The closest “answer” I got was while reading a review of a new DV camera hitting the market. It basically said “it can record in 24 frames per second to mimic film stock” (as opposed to normal 30 fps video).

I don’t have anything to back this up, but I assume it’s saying the eye can tell the subtleties of the extra 6 fps??? Not sure I believe this is the only answer, but hey…it’s a start.

No, the 3:2 pulldown (the conversion from 24 to 30 fps for broadcast) isn’t the issue. Most US-produced TV commercials are shot on film at 30 fps, specifically to avoid pulldown.

Yet they are clearly identifiable to the eye as “filmed”.

I don’t know the answer to the OP’s question, either.
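For anyone who hasn’t seen the 3:2 pulldown spelled out, here’s a toy sketch of the cadence (just an illustration of the frame bookkeeping, not real telecine code): each pair of 24 fps film frames is spread over five interlaced fields — three, then two — which is how 24 film frames become 60 fields, i.e. 30 video frames, per second:

```python
# Illustrative sketch of the 3:2 pulldown cadence (not real telecine code).
# Four film frames (A, B, C, D) become ten interlaced video fields,
# i.e. five video frames -- so 24 film fps maps onto 30 video fps.

def pulldown_fields(film_frames):
    """Repeat film frames in a 3, 2, 3, 2, ... field pattern."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

film = ["A", "B", "C", "D"]          # 4 film frames = 1/6 second at 24 fps
fields = pulldown_fields(film)       # 10 fields = 1/6 second at 60 fields/s
video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]

print(fields)        # ['A','A','A','B','B','C','C','C','D','D']
print(video_frames)  # ['AA', 'AB', 'BC', 'CC', 'DD'] -- the 'AB'/'BC' frames
                     # mix two film frames, which is the source of judder.
```

Those mixed “AB”/“BC” frames are the judder people associate with telecined film — but as noted above, 30 fps film needs no pulldown at all and still reads as “filmed”, so the cadence can’t be the whole story.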

I know from personal experience that Newhart was sometimes on film, sometimes on tape – it was the first TV show in which I noted the difference between tape and film…

hrh

Gelaan,

Not that I doubt your knowledge, but how do they possibly shoot 35mm film at 30 fps without making it look slightly slow? I mean, in order to fit 1050mm of film through the gate in 1 second…that’s gotta add a slow-motion effect to it??? No?

"Although there are those who disparage the judder generated by 3/2 pulldown when 24 Hz film is converted to 60 Hz video, it is a part of the “film look” coveted by those who tell dramatic stories on television. We now have a 24P camera (720/24P) that can operate off-speed (undercrank and overcrank). "
from…

http://www.tvtechnology.com/features/Tech-Corner/f-RH-24p.shtml

another good article.

Just like in the links you posted, “fps” means “frames per second” here, not “feet per second”.
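To put numbers on that: 35mm is the width of the stock, not the length of a frame. Standard 4-perf 35mm runs 16 frames per foot, so the transport speed is modest either way (back-of-the-envelope arithmetic below, nothing specific to any camera):

```python
# Back-of-the-envelope 35mm transport speed: frames per second vs feet per second.
# Standard 4-perf 35mm film runs 16 frames per foot.

FRAMES_PER_FOOT = 16

for fps in (24, 30):
    feet_per_second = fps / FRAMES_PER_FOOT
    print(f"{fps} frames/s -> {feet_per_second:.2f} ft/s "
          f"(about {feet_per_second * 60:.0f} ft/min through the gate)")

# 24 frames/s -> 1.50 ft/s (about 90 ft/min through the gate)
# 30 frames/s -> 1.88 ft/s (about 112 ft/min through the gate)
```

And as long as the footage is played back (or telecined) at the same rate it was shot, nothing looks slow; slow motion only happens when the shooting rate is higher than the playback rate.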

To add fuel to this thread:

I don’t think FPS makes a difference! Tape a live football game and the football scene from “Heaven Can Wait”, then freeze a frame from each. YOUR EYE CAN TELL THE DIFFERENCE IN ONE FRAME!

A lot of people can’t understand my distinction, until I break it down into common TV examples:

VideoTape “Look”:

  • Soap Operas (on broadcast TV)
  • Live or taped News Broadcasts
  • Game Shows
  • TV Shows originally captured on videotape (even when re-distributed on DVD).
  • TV Shows originally broadcast live (OSCARS)
  • Some commercials.
  • Some “low-budget” films originally captured on videotape and never converted to film - even when distributed on DVD

=========================================
Film “Look”

  • Almost all Hollywood movies, even if distributed on videotape or DVD.
  • Some commercials (if shot on film, even if transferred to videotape)
=========================================
Combined Film vs Video Tape Look

  • Just watch Fawlty Towers or Monty Python for examples of this – the exteriors are almost always “film” and the interiors are “videotape”!!!
=========================================

Has ANYONE ever seen a TV show originally taped on video, then converted to a “film look”??

It’s projected at 30 fps during the film-to-tape transfer.

I can say positively that Golden Girls was videotaped.

Newhart was odd in that the first season was videotaped (and didn’t feature Peter Scolari & Julia Duffy), and then they switched to film.

Both the difference in contrast and the higher frame rate make video look ‘cheap’ compared to film. Actually, I think it’s that video looks too ‘real’. Film softens the image, and at 24 fps your brain can still perceive a little motion blur.

One of the best ways to see the difference is to watch Monty Python. When they’re inside it’s tape. When they’re outside it’s film. Sometimes they’ll switch back and forth between the two quickly during one sketch!

They did this because back in the early ’70s portable video equipment was still too expensive compared to film cameras (plus it used a lot of power, and it rains a lot in Britain!).
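To put rough numbers on the motion-blur point a couple of posts up (the 180-degree shutter and the per-field exposure below are typical textbook figures, not anything measured from these shows): film at 24 fps with a conventional rotary shutter exposes each frame for about 1/48 s and samples motion 24 times a second, while interlaced video samples motion 60 (or 50) times a second with shorter exposures per sample:

```python
# Rough exposure arithmetic for the motion-blur difference.
# The 180-degree shutter and the per-field exposure are typical assumptions,
# not measurements from any particular show.

def film_frame_exposure(fps=24, shutter_angle_deg=180):
    """Exposure time per frame for a rotary film shutter."""
    return (shutter_angle_deg / 360.0) / fps

def video_field_exposure(field_rate=60):
    """Rough upper bound on exposure per interlaced video field."""
    return 1.0 / field_rate

print(f"film : {film_frame_exposure():.4f} s per frame, 24 motion samples/s")
print(f"video: {video_field_exposure():.4f} s per field, 60 motion samples/s")
# film : 0.0208 s per frame, 24 motion samples/s
# video: 0.0167 s per field, 60 motion samples/s
```

Fewer, longer samples mean more blur per frame and a choppier cadence — part of the “film look” — while video’s finer temporal sampling is a lot of what makes live and taped material feel so immediate.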

It makes a difference, but not the one central to this thread. As I already noted, film shot at 30 fps (that’s “frames per second” for those abbreviation-challenged among us - you know who you are :slight_smile: ) is still clearly identifiable as “filmed”.

Almost all filmed TV advertising is shot at the same frame rate at which it is intended to be broadcast – 25 fps in PAL-based countries and 30 fps in NTSC-based countries. Yet it’s clearly “filmed”.

The people at cinematography.net are actual cinematographers, and here’s what they have to say (they don’t come up with a definitive answer either, BTW).

http://www.cinematography.net/30FPS.HTM

“Just like in the links you posted, “fps” means “frames per second” here, not “feet per second”.”

Actually, the links I posted are talking about video shooting at 24 Farmers per season. :wink: Not film cameras shooting at 30 fps (which is what you are clearly saying). I’m going to stick with the combo of contrast & FPS until I see a clear answer…or a link to a cite of anyone shooting and projecting film at 30 fps.

“or a link to a cite of anyone shooting and projecting film at 30fps”

It’s already there, in my post right above yours. In fact, that page (which is mostly a thread distilled from a mailing list) contains plenty of references to it. The title of that page is “Shooting at 30 fps”. You did look…right? :slight_smile:

BTW, here is one selection from that page:

QED.

Yes, I was posting while you were posting. So let’s get back to the OP. It’s clear that these guys, even in your link, are discussing the difference in frame rate in conjunction with “the film look”…which is what my original answer added.

Here in Britain, both live TV/video and film are shown at 25fps (film simply sped up from 24fps if necessary, no 3/2 pulldown), and you can still immediately tell which is which. However, if you watch TV via a computer TV card, which deinterlaces TV/video down to true 25fps, it is very hard to tell the difference. Only the “combing” artefacts give it away. So I’d say the effect is due to the higher apparent frame rate of interlaced video/live TV.
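To illustrate the combing artefacts mentioned above, here’s a toy sketch of a “weave” deinterlace (purely illustrative — not how any actual TV card works): interlaced video records the odd and even scan lines 1/50 s apart, so if you simply lace the two fields back into one frame, anything that moved in between ends up disagreeing from one line to the next:

```python
import numpy as np

# Illustrative 'weave' deinterlace of two fields captured 1/50 s apart.
# A bright 2-pixel-wide object moves 2 pixels to the right between fields.

HEIGHT, WIDTH = 8, 12

def render(object_x):
    """Render one full-resolution snapshot of the moving object."""
    img = np.zeros((HEIGHT, WIDTH), dtype=int)
    img[:, object_x:object_x + 2] = 1
    return img

field_a = render(object_x=4)[0::2]   # even-numbered lines, captured at time t
field_b = render(object_x=6)[1::2]   # odd-numbered lines, captured at t + 1/50 s

woven = np.zeros((HEIGHT, WIDTH), dtype=int)
woven[0::2] = field_a
woven[1::2] = field_b

# Alternate rows disagree about where the object is: the classic 'combing'.
for row in woven:
    print("".join("#" if v else "." for v in row))
```

A real deinterlacer blends or interpolates the fields into a single 25 fps progressive frame, which discards the 50-per-second motion sampling — and, plausibly, much of the “video look” along with it.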