Movie frames per second vs. TV

I’ve always wondered why movies are shown at 30 fps (I think) and TV programs like soaps are shown at 60 fps.

Maybe the movie theaters in the old days couldn’t handle high frame-rate but that shouldn’t be the case now. However, I think movies are more pleasing to the eye, somehow better. Is it so? How come?

If lower fps are better then why are TV soaps not shot at the lower frame-rate?
Or if higher fps is better, then why not movies too? Why the difference?

I know The Matrix would look silly if it looked like a soap opera, but why? I just can’t put my finger on it.

Thanks

Bonus Qns:
At what frame-rate do our eyes work? And where can I buy glasses to make my natural boring vision look like a movie?

Movies are 24 fps. TV is 30 fps (though technically it’s 60 fields per second, each field being half a frame).

Why spend all the money to develop new cameras and projectors when you’re not getting any advantage? There’s no reason to change the standard.

The difference in look is due to the difference in resolution of film vs. video.

Thanks,

But if I watch a movie on TV, I think the resolution would get degraded to TV resolution, so is the only difference 12 fps?

I feel there is something more. Movies feel so different. How would a movie look, shot on a TV camera?

I think you’re confusing a few things. Frames per second is the refresh rate (or framerate), not the resolution. The resolution is a measurement of how many distinct dots (or lines) are displayed, not how many times per second they are displayed. A movie looks better on film than on a TV because a TV only has a few hundred lines of pixels. 35mm film, however, has many times that resolution, which gives a much clearer picture.

When film standards were first developed, taking many pictures a second was both expensive and a difficult engineering problem. 24 frames per second is fast enough that the persistence of vision effect (what makes you see a moving picture out of a sequence of stills) works well enough, but slow enough for early movie cameras to handle. You may have seen old movies where the action seems sped-up. They didn’t do that for laughs; they did it because they couldn’t get cameras to record enough images to play back at a fast enough framerate, so they had to record fewer frames and speed them up.

Sure, that’s not a problem anymore, but there’s a huge installed base of equipment that works at 24 frames per second. So something’s got to be a whole lot better to warrant replacing all of that. But there are other standards, the most popular of which is probably IMAX. IMAX film is larger than 35mm, and, IIRC, they shoot it at twice the framerate. And it looks awesome. But it requires more expensive cameras, film, and theaters, so it doesn’t get used much. Until recently, it was generally only used for documentaries. Since movies have started to be “filmed” digitally, with most of the actual image made in post-production on computers, more mainstream movies have been released for IMAX. I saw one of the Matrix sequels on an IMAX screen. It still sucked, but it looked awesome.

TVs use a different frame rate because they have to deal with AC electricity, which in America cycles at 60 Hz. In Europe they use the PAL video standard at 25 fps because the electricity is at 50 Hz. I’m sure that someone more EE-inclined than I will be along to explain it better, but the upshot is that when your power source is oscillating at a certain rate, it’s generally convenient to do things in multiples of whatever that frequency is.

When you watch a movie on TV, since they have to get six extra frames per second from somewhere, they generally just repeat one frame in every 4. It’s also possible to do some interpolated stuff, so you get partial frames that blend better, but I don’t know if that’s actually done regularly.
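
To make the arithmetic concrete, here is a minimal Python sketch of the “repeat one frame in every 4” approach described above. It is only an illustration; real broadcast conversion works on interlaced fields rather than whole frames, so treat the function name and the frame lists as assumptions for the example.

# Illustration only: turn a 24 fps frame sequence into 30 fps by
# repeating one source frame out of every four (24 * 5/4 = 30).

def repeat_one_in_four(frames_24):
    """frames_24: frames in capture order at 24 fps -> list at 30 fps."""
    out = []
    for i, frame in enumerate(frames_24):
        out.append(frame)
        if i % 4 == 3:          # every 4th source frame is shown twice
            out.append(frame)
    return out

one_second = list(range(24))                 # 24 source frames
print(len(repeat_one_in_four(one_second)))   # 30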

Old-time movie cameras and projectors used a Geneva mechanism to advance the film, at a rate fast enough for the persistence of vision phenomenon to make motion appear lifelike.

Special order from B&L. Electric control to totally block your vision for a few microseconds at a time, in pulses at 30/min.
Voila! “Instant movies.” :rolleyes:

I think some of this is video camera (like the ones used for soaps and live-audience sitcoms) vs. film. Have you seen The Blair Witch Project? I think that was shot with video cameras.

Lots of info on this topic in this thread.

Your brain tends to have trouble distinguishing things that occur faster than about 1/10th to 1/15th of a second apart. Play two sounds this close together and they get blended together and seem like one sound. It’s the same with video. Anything faster than about 10 or 15 frames per second gets blended together into smooth movement. Anything slower starts to look “jerky” because your brain can detect the individual movements.
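
A quick back-of-the-envelope check of those numbers; the ~1/15 s threshold below is just the post’s ballpark figure, not a precise vision-science constant.

# Time between frames at various rates, compared with the rough ~1/15 s
# blending threshold quoted above. The threshold is an assumption, not a spec.
BLEND_THRESHOLD_S = 1 / 15

for fps in (8, 12, 16, 24, 30, 60):
    interval_ms = 1000 / fps
    verdict = "fuses into motion" if 1 / fps <= BLEND_THRESHOLD_S else "can look jerky"
    print(f"{fps:2d} fps -> {interval_ms:5.1f} ms between frames: {verdict}")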

Silent film used to run at 16 fps, but this is kinda close to the limit for what you can see as being jerky. When they switched to sound film, they also upped the standard to 24 fps, which, being a bit faster, is a little smoother for your brain.

The main reason is electrical noise. If your power supply runs on 60 Hz, you are going to end up with 60 Hz noise everywhere in your system, and it’s hard to get rid of it all. If you synchronize your frames to the power source, you effectively eliminate your largest source of electrical noise. If you used a different frame rate, older TVs would have noticeable “bars” of lighter and darker stripes moving up and down the screen at an annoying rate.
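
A rough sketch of why an unsynchronized rate produces those drifting bars: the hum and the picture scan interfere at their difference (beat) frequency, so matching them makes the pattern stand still. The refresh rates below are purely illustrative.

# Beat frequency between 60 Hz mains hum and a few hypothetical refresh rates:
# the hum "bars" drift through the picture at the difference frequency.
mains_hz = 60.0

for refresh_hz in (60.0, 59.0, 55.0, 50.0):
    beat_hz = abs(mains_hz - refresh_hz)
    if beat_hz == 0:
        print(f"{refresh_hz:.0f} Hz refresh: bars stand still (effectively invisible)")
    else:
        print(f"{refresh_hz:.0f} Hz refresh: bars drift past about {beat_hz:.0f} times per second")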

Soap operas look different than movies for a lot of reasons. One is that the sets are lit completely differently, which has nothing at all to do with what is recording the video. Soap operas have bright lights coming at you from all directions. The Matrix has very dark lighting often coming from a single direction to intentionally produce shadows across the set.

Another reason they look different is that film is “grainy” because it is literally made out of little grains of photo-sensitive material. Sometimes filmmakers will intentionally use a grainier film to get more of a “gritty” look. There’s a lot more variation you can get in the look and feel from film just by varying the chemical makeup and construction of the film itself, without changing anything at all in the camera. Video cameras, by comparison, have a CCD array in them which picks up the light and converts it into electrical signals. You can’t change the physical construction of the CCD in between takes. It’s built into the camera and you’re just not going to easily muck with it. A lot of TV shows will shoot on film and then convert the image to video for broadcast just because they can get different looks out of it that they can’t get from a simple video camera. Sitcoms and cheap shows certainly aren’t going to bother with all of this.

Hmm, as I understand it film is shot at 24 frames progressive (full frames) while TV is 30 fps interlaced and that has something to do with it. That’s the reason that the new digital cameras such as the Panasonic DVX100 that have “24p” capability are all the rage with independent filmmakers. Perhaps someone with more knowledge of it than me can come along and explain exactly what this is about?
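
Roughly, the difference is this: a progressive (“24p”) frame carries every scan line at once, while interlaced video sends the odd-numbered lines in one field and the even-numbered lines in the next. A minimal sketch, treating a frame as nothing more than a list of scan lines:

# Split one progressive frame into its two interlaced fields.
def split_into_fields(frame_lines):
    top = frame_lines[0::2]      # lines 0, 2, 4, ... (one field)
    bottom = frame_lines[1::2]   # lines 1, 3, 5, ... (the other field)
    return top, bottom

frame = [f"line {n}" for n in range(6)]
top, bottom = split_into_fields(frame)
print(top)     # ['line 0', 'line 2', 'line 4']
print(bottom)  # ['line 1', 'line 3', 'line 5']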

Certain TV shows, usually big budget series, are shot on film rather than video. You can certainly tell the difference.

Thanks iamthewalrus(:3= ,

So you are saying that IMAX is almost at TV refresh rate and with a much higher resolution?

Ever seen a movie scene being shot, shown on some TV program?

For example, some kind of action sequence looks so silly on the TV program, but the same scene in the movie looks real and cool. Shouldn’t it look more real with the more realistic (in terms of frame rate) camera? I don’t know if I’m making sense.

Does lower frame-rate make a scene look more real/better/believable ?

There are more factors to consider. Very early movie cameras were hand-cranked, and the exact speed was not only a guess, but those hands must have gotten pretty tired pretty quickly. A too-slow camera, when played back on a motor-driven standard-speed projector, will show the action sped up.

Also, when sound was added to the film, a faster film speed made things better for sound quality, too.

The frame rate is pretty much beside the point, as long as it is fast enough for persistence of vision in the brain. You really need to get away from the idea of “higher frame rate = better quality”. If you had a VGA-quality video camera and somehow rigged it up to run at 200fps, it would still look terrible, because the resolution is so poor. Video has poor resolution; 35mm film has excellent resolution.
The other main reason why stunts etc. look rubbish on “making of” shows is that you can see all the other detritus and equipment around the scene, and the lighting is not set up for the TV cameras - it’s set up for the film cameras. Plus the guy yelling “that’s a wrap” and mugging for the TV cameras kind of spoils the effect.

Thanks engineer_comp_geek , Rigamarole, CookingWithGas, spingears. Too fast.

Movies are not actually projected at 24 fps. Each frame is flashed more than once before moving on to the next. Thus persistence is not a problem for films.

Ah, this is what I had meant to ask in the first place.

So if I replaced the film camera with a video camera, and watched it on TV, would it look better?
(Assuming no editing, slow motion, etc. is needed for the scene)

Thanks

The flip side of this, of course, is that if you shoot on digital, you can apply all those effects after the fact with digital filters. That way you don’t even have to decide what kind of look you want until you get to the editing room. As digital cameras get better and cheaper, and the computers running the filters faster and cheaper, film will be used less and less.

Consider this a nitpick, but there are advantages, both for cameras and projectors, to frame rates higher than the bare minimum of 24fps (which is one reason why double-frame projection is used in theaters to make an effective 48fps rate, as Pleonast mentioned). For cameras, a faster frame rate will freeze motion better, making it more suitable for sports. For projectors, a faster rate will provide smoother movement, as the differences between individual frames are smaller.
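
Two quick calculations behind those points, with made-up numbers: a two-blade shutter flashes each 24 fps frame twice, so the screen flickers at 48 Hz even though only 24 distinct images are shown per second, and the faster the capture rate, the smaller the jump an object makes between consecutive frames.

# Illustrative numbers only.
capture_fps = 24
flashes_per_frame = 2                        # two-blade shutter
print("flash rate:", capture_fps * flashes_per_frame, "Hz")   # 48

object_speed_px_per_s = 480                  # assumed on-screen speed
for fps in (24, 48, 60):
    step = object_speed_px_per_s / fps       # jump between consecutive frames
    print(f"{fps} fps -> {step:.0f} px per frame")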

If you had a VGA-quality video camera, and rigged it up to run at 200fps, it would only look terrible if you sacrificed resolution for frame rate (to get it all on the same storage medium, perhaps). If you could speed it up without altering any other parameters, the visual quality might not improve much beyond a certain point, but it would not deteriorate, either.

I find a slow pan of a scene, especially in a cartoon where there is no natural motion-blur, to be very annoying and jerky to my own eyes/brain at standard film rates. YMMV.

One thing that hasn’t yet been mentioned in this thread, but certainly belongs here, is the conversion from 24fps to 30, as in film to video. The standard method prints one frame, one frame, one frame, then two frames (of the same original image). This doubling of every 4th frame is not significant enough for humans to notice, but the sound has to be slightly adjusted to re-sync, again, not noticeable. It’s a compromise we can live with.

Conversion from 24fps to 30fps usually has a negative impact on a movie.
If you can get your hands on both a DVD that was converted from a 24fps movie, and one that was shot with a 30fps digital camera, pause them both and look for the “messed-up” frames in the theatrical movie.
It may be easier to spot with animated movies; compare The Emperor’s New Groove to Kronk’s New Groove frame by frame. There aren’t any “messed-up” frames in the latter, as it was drawn at 15fps, with each frame shown twice to get to 30fps.

No–film reacts differently to light and color than video does. It’s not merely a matter of graininess/resolution/fps/etc. Film (when done well) has an almost artificial “richness” that video cannot yet reproduce.

Certainly does, but in the UK (which uses 25 fps) there’s a simple solution - do nothing at all. If a movie is shot at 24 fps and played at 25 fps nobody will notice the slight speed-up, except a few people with perfect pitch who’ll hear all sounds half a semitone sharp.
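
For what it’s worth, the arithmetic behind that remark: playing 24 fps material at 25 fps speeds everything up by a factor of 25/24, and the pitch shift in semitones is 12 × log2 of that ratio, which works out to a bit over half a semitone.

import math

ratio = 25 / 24                       # PAL playback speed vs. 24 fps film
speedup_pct = (ratio - 1) * 100
semitones = 12 * math.log2(ratio)     # pitch shift of all the audio

print(f"speed-up: {speedup_pct:.1f}%")            # about 4.2%
print(f"pitch shift: {semitones:.2f} semitones")  # about 0.71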