Movie frames per second vs. TV

But then … the movie is over in 24/25ths of the time it’s supposed to take! Man, what a rip-off.
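If you want to put numbers on that: a 24 fps film transferred to 25 fps PAL just plays every frame a little faster, so the runtime shrinks to 24/25 of the original (and the audio pitch creeps up by about 4% unless it’s corrected). A quick back-of-the-envelope sketch in Python — the function name is just mine for illustration:

```python
# PAL speedup: a 24 fps film played back at 25 fps finishes in 24/25 of its runtime.
FILM_FPS = 24
PAL_FPS = 25

def pal_runtime(minutes_at_24fps: float) -> float:
    """Runtime of a 24 fps film when every frame is shown at 25 fps instead."""
    return minutes_at_24fps * FILM_FPS / PAL_FPS

print(pal_runtime(120))  # 115.2 -- a two-hour film loses 4.8 minutes
```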

Here’s a page from Kodak that summarizes why film spanks video — though they don’t seem to use that term.

I suppose somebody has to come in and point this out…

Our frame rate for NTSC video in the U.S. is not 30 frames per second (fps). It’s actually 29.97 fps. I interviewed quite a few old video engineers about this when writing one of my books, and I got several different explanations for this particular choice. The most prevalent was that a frame rate of 30 fps (which means a 60 Hz field rate, since the frames are interlaced) produced a hum that was very difficult to get rid of. Changing the frame rate ever so slightly fixed the problem.
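For reference, the 29.97 figure is shorthand: the NTSC color frame rate is exactly 30 × 1000/1001 fps, and with two interlaced fields per frame the field rate lands just shy of 60 Hz. A quick check:

```python
from fractions import Fraction

# NTSC color video runs at exactly 30 * 1000/1001 frames per second.
frame_rate = Fraction(30) * Fraction(1000, 1001)   # 30000/1001 fps
field_rate = frame_rate * 2                         # two interlaced fields per frame

print(float(frame_rate))  # 29.97002997...
print(float(field_rate))  # 59.94005994... -- just shy of 60 Hz
```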

If that is the case, how do you explain the failure of special-effects guru Douglas Trumbull’s Showscan format? It was large-format 70mm film shot and projected at 60 frames per second, but he abandoned it because it looked too much like video. Maybe TV’s 60 field-per-second refresh rate is part of what makes video look different from film.

Ahem

My father (who has done radio/television repair and signal-processing engineering for years) explained it to me as a byproduct of adding color to the NTSC standard. In order to make color broadcasts compatible with b/w TVs, they had to do some funky encoding of the color, and the frame rate was shifted slightly to get rid of interference between the color encoding and the sound carrier. Wikipedia confirms this.
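For what it’s worth, the arithmetic backs this up. As I understand it, color NTSC kept the 4.5 MHz sound carrier where b/w sets expected it and instead nudged the line rate so that the carrier sits at exactly 286 times the line rate; the ~3.58 MHz color subcarrier then interleaves with the luma and sound energy instead of beating against them, and with 525 lines per frame the frame rate falls out at 29.97 fps. A rough Python sketch, with the constants as I recall them from the standard (not from the comment above):

```python
SOUND_CARRIER_HZ = 4_500_000   # audio carrier offset, left untouched for b/w compatibility
LINES_PER_FRAME = 525

# Color NTSC nudged the line rate down from a flat 15,750 Hz so that the
# sound carrier sits at exactly 286 times the line rate; the color subcarrier
# is then 455/2 times the line rate, interleaving with luma and sound rather
# than producing a visible beat pattern.
line_rate = SOUND_CARRIER_HZ / 286            # ~15,734.27 Hz (b/w NTSC used 15,750 Hz)
color_subcarrier = line_rate * 455 / 2        # ~3,579,545 Hz
frame_rate = line_rate / LINES_PER_FRAME      # ~29.97003 fps

print(round(line_rate, 2), round(color_subcarrier), round(frame_rate, 5))
```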