I think you’re confusing a few things. Frames per second is the refresh rate (or framerate), not the resolution. The resolution is a measurement of how many distinct dots (or lines) are displayed, not how many times per second they are displayed. A movie looks better on film than on a TV because a TV only has a few hundred lines of pixels. 35mm film, however, has many times that resolution, which gives a much clearer picture.
When film standards were first developed, taking many pictures a second was both expensive and a difficult engineering problem. 24 frames per second is fast enough that persistence of vision (the effect that makes you see a moving picture out of a sequence of stills) works well enough, but slow enough that early movie cameras could handle it. You may have seen old movies where the action seems sped-up. They didn’t do that for laughs; they did it because they couldn’t get cameras to record enough images to play back at a fast enough framerate, so they had to record fewer frames and speed up the playback.
Sure, that’s not a problem anymore, but there’s a huge installed base of equipment that works at 24 frames per second. So something’s got to be a whole lot better to warrant replacing all of that. But there are other standards, the most popular of which is probably IMAX. IMAX film is larger than 35mm, and, IIRC, they shoot it at twice the framerate. And it looks awesome. But it requires more expensive cameras, film, and theaters, so it doesn’t get used much. Until recently, it was generally only used for documentaries. Since movies have started to be “filmed” digitally and most of the actual image made in post production on computers, more mainstream movies have been released for IMAX. I saw one of the Matrix sequels on an IMAX screen. It still sucked, but it looked awesome.
TVs use a different frame rate because they have to deal with AC electricity, which in America cycles at 60 Hz; the NTSC standard runs at 30 fps (60 interlaced fields per second). In Europe they use the PAL video standard at 25 fps because the electricity is at 50 Hz. I’m sure that someone more EE-inclined than I will be along to explain it better, but the upshot is that when your power source is oscillating at a certain rate, it’s generally convenient to do things in multiples of that frequency.
When you watch a movie on TV, since they have to get six extra frames per second from somewhere, they generally just repeat one frame in every 4. It’s also possible to do some interpolated stuff, so you get partial frames that blend better, but I don’t know if that’s actually done regularly.
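The interpolated approach can be sketched in a few lines. This is just an illustrative toy of my own, not any broadcast standard: frames are stand-ins represented by plain numbers (a real frame would be a pixel array, but the timing logic is the same), and each output frame is a linear blend of the two nearest input frames.

```python
# Toy sketch: resampling 24 fps to 30 fps by blending adjacent frames.
# Each "frame" here is just a number standing in for an image.

def blend_convert(frames_in, in_fps=24, out_fps=30):
    """For each output frame, find its position on the input timeline
    and linearly blend the two nearest input frames."""
    out = []
    n_out = len(frames_in) * out_fps // in_fps
    for i in range(n_out):
        t = i * in_fps / out_fps            # position on the input timeline
        lo = int(t)
        hi = min(lo + 1, len(frames_in) - 1)
        frac = t - lo
        out.append(frames_in[lo] * (1 - frac) + frames_in[hi] * frac)
    return out

frames = [0, 24, 48, 72]                    # four 24 fps "frames"
print(blend_convert(frames))                # five 30 fps frames
```

Most of the output frames land between two input frames, so they come out as partial blends rather than straight repeats.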
Your brain tends to have trouble distinguishing things that occur faster than about 1/10th to 1/15th of a second apart. Play two sounds together this closely and they get blended together and seem like one sound. It’s the same with video. Anything faster than about 10 or 15 frames per second gets blended together into smooth movement. Anything slower starts to look “jerky” because your brain can detect the individual movements.
Silent film used to run at 16 fps, but that’s pretty close to the limit where things start to look jerky. When they switched to sound film, they also upped the standard to 24 fps, which, being a bit faster, is a little smoother for your brain.
The main reason is electrical noise. If your power supply runs on 60 Hz, you are going to end up with 60 Hz noise everywhere in your system, and it’s hard to get rid of it all. If you synchronize your frames to the power source, you effectively eliminate your largest source of electrical noise. If you used a different frame rate, older TVs would have noticeable “bars” of lighter and darker stripes crawling up and down the screen at an annoying rate.
Soap operas look different than movies for a lot of reasons. One is that the sets are lit completely differently, which has nothing at all to do with what is recording the video. Soap operas have bright lights coming at you from all directions. The Matrix has very dark lighting often coming from a single direction to intentionally produce shadows across the set.
Another reason they look different is that film is “grainy” because it is literally made out of little grains of photo-sensitive material. Sometimes filmmakers will intentionally use a grainier film stock to get more of a “gritty” look. There’s a lot more variation you can get in the look and feel of film just by varying the chemical makeup and construction of the film itself, without changing anything at all in the camera. Video cameras, by comparison, have a CCD array that picks up the light and converts it into electrical signals. You can’t change the physical construction of the CCD between takes; it’s built into the camera, and you’re just not going to easily muck with it. A lot of TV shows will shoot on film and then convert the image to video for broadcast just because they can get looks out of it that they can’t get from a simple video camera. Sitcoms and cheap shows certainly aren’t going to bother with all of this.
Hmm, as I understand it film is shot at 24 frames progressive (full frames) while TV is 30 fps interlaced and that has something to do with it. That’s the reason that the new digital cameras such as the Panasonic DVX100 that have “24p” capability are all the rage with independent filmmakers. Perhaps someone with more knowledge of it than me can come along and explain exactly what this is about?
So you are saying that that IMAX is almost at TV refresh rate and with a much higher resolution?
Ever seen a movie scene being shot, shown on some TV program?
For example, some kind of action sequence looks so silly on the TV program, but the same scene in the movie looks real and cool. Shouldn’t it look more real with the more realistic (in terms of frame rate) camera? I don’t know if I’m making sense.
Does lower frame-rate make a scene look more real/better/believable ?
There are more factors to consider. Very early movie cameras were hand-cranked, so the exact speed was not only a guess, but the operator’s hands must have gotten pretty tired pretty quickly. A too-slow camera, when played back on a motor-driven standard-speed projector, will show action sped up.
Also, when sound was added to the film, a faster film speed made things better for sound quality, too.
The frame rate is pretty much beside the point, as long as it is fast enough for persistence of vision to kick in. You really need to get away from the idea of “higher frame rate = better quality”. If you had a VGA-quality video camera and somehow rigged it up to run at 200 fps, it would still look terrible, because the resolution is so poor. Video has poor resolution; 35mm film has excellent resolution.
The other main reason why stunts etc look rubbish on “making of” shows is that you can see all the other detritus and equipment around the scene, and the lighting is not set-up for the TV cameras - it’s set up for the film cameras. Plus the guy yelling “that’s a wrap” and mugging for the TV cameras kind of spoils the effect.
The flip side of this, of course, is that if you shoot on digital, you can apply all those effects after the fact with digital filters. That way you don’t even have to decide what kind of look you want until you get to the editing room. As digital cameras get better and cheaper, and the computers running the filters faster and cheaper, film will be used less and less.
Consider this a nitpick, but there are advantages to frame rates higher than the bare minimum of 24 fps, both for cameras and projectors (which is one reason double-frame projection is used in theaters to make an effective 48 fps rate, as Pleonast mentioned). For cameras, a faster frame rate will freeze motion better, making it more suitable for sports. For projectors, a faster rate will provide smoother movement, as the differences between individual frames are smaller.
If you had a VGA-quality video camera, and rigged it up to run at 200fps, it would only look terrible if you sacrificed resolution for frame rate (to get it all on the same storage medium, perhaps). If you could speed it up without altering any other parameters, the visual quality might not improve much beyond a certain point, but it would not deteriorate, either.
I find a slow pan of a scene, especially in a cartoon where there is no natural motion-blur, to be very annoying and jerky to my own eyes/brain at standard film rates. YMMV.
One thing that hasn’t yet been mentioned in this thread, but certainly belongs here, is the conversion from 24 fps to 30, as in film to video. The standard method prints one frame, one frame, one frame, then two frames (of the same original image). This doubling of every 4th frame isn’t enough for most viewers to notice, and since the running time comes out the same, the sound stays in sync (in practice NTSC runs the film slightly slow, at 23.976 fps, and the audio is adjusted to match, which is also unnoticeable). It’s a compromise we can live with.
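The one-one-one-two pattern is easy to sketch. This toy works on whole frames for simplicity; real NTSC telecine actually repeats interlaced fields in a 2:3 cadence, but the frame count comes out the same.

```python
# Sketch of whole-frame pulldown: every 4th film frame is printed
# twice, turning each second of 24 film frames into 30 video frames.

def pulldown(film_frames):
    video = []
    for i, frame in enumerate(film_frames):
        video.append(frame)
        if i % 4 == 3:              # one, one, one, two
            video.append(frame)
    return video

one_second = list(range(24))        # 24 film frames
print(len(pulldown(one_second)))    # 30 video frames
```

Pausing on one of those doubled frames is exactly how you spot the “messed-up” frames mentioned below on a converted DVD.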
Conversion from 24fps to 30fps usually has a negative impact on a movie.
If you can get your hands on both a DVD that was converted from a 24fps movie, and one that was shot with a 30fps digital camera, pause them both and look for the “messed-up” frames in the theatrical movie.
It may be easier to spot with animated movies, compare The Emperor’s New Groove to Kronk’s New Groove at a frame-by-frame level. There aren’t any “messed-up” frames in the latter, as it was drawn at 15fps, with each frame shown twice to get to 30fps.
No–film reacts differently to light and color than video does. It’s not merely a matter of graininess/resolution/fps/etc. Film (when done well) has an almost artificial “richness” that video cannot yet reproduce.
Certainly does, but in the UK (which uses 25 fps) there’s a simple solution - do nothing at all. If a movie is shot at 24 fps and played at 25 fps, nobody will notice the slight speed-up, except a few people with perfect pitch who’ll hear all sounds about two-thirds of a semitone sharp.
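For what it’s worth, the exact pitch shift is easy to work out: the frequency ratio is 25/24, and converting that to semitones (12 per octave) gives a bit over two-thirds of a semitone.

```python
import math

# Pitch shift from playing 24 fps film at 25 fps:
# frequency ratio 25/24, expressed in semitones (12 per octave).
semitones = 12 * math.log2(25 / 24)
print(round(semitones, 2))  # about 0.71 semitones sharp
```

That’s why only people with very good pitch perception tend to catch it; most ears shrug off anything under a semitone.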