Frames-per-second confusion

Most films, as I understand it, are shot at 24 frames per second (FPS).

They look, well, just fine.

But on the other hand - I’m a gamer. And I can tell, without fail, that 24 FPS in a game looks like crap. Choppy, distracting, awful. Heck, I can usually see when a game drops below 60 FPS - and I hate it.

Why is it that a film at 24 FPS is convincingly fluid but a game must be at a minimum 60 FPS or it bugs me?

I give up - who decided to call a digital video feed “FPS”? Isn’t it confusing enough that it can already mean either “frames/second” (as in motion pictures) or “feet/second” (magnetic tape)?

I now await yet more terms abbreviated FPS.

How about “First Person Shooter”? … which is probably what **GameHat** is playing at 60 FPS.

There are a couple of things going on. A frame in a movie is “temporally anti-aliased,” which is just a fancy way of saying it contains motion blur. When the film is being shot, the shutter is open for long enough that really fast motion gets “smeared” in the image. This smearing prevents the sort of stuttering effect you get when you play a game at a low frame rate.
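
To put rough numbers on that smearing (a sketch assuming the conventional 180-degree film shutter, which nobody here has actually specified):

```python
# Rough exposure math for motion blur on film, assuming a conventional
# 180-degree rotary shutter (an assumption, not stated in the thread).
frame_rate = 24.0        # frames per second
shutter_angle = 180.0    # degrees of each revolution the shutter is open

exposure = (shutter_angle / 360.0) / frame_rate
print(f"each frame integrates about {exposure * 1000:.1f} ms of motion")
# -> about 20.8 ms, so anything moving quickly leaves a visible smear
#    across the frame instead of a crisp, stuttering snapshot.
```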

You can fake motion blur in videogames but in order to do it you have to render several frames in sequence and merge them together. But that takes just as much processing power as running at a higher frame rate and not doing motion blur … so most games don’t do it.
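
For the curious, here’s a minimal sketch of that brute-force approach: render several instants within one frame interval and average them. The toy render_scene below is a made-up stand-in for a real engine, just so the whole thing runs:

```python
import numpy as np

def render_scene(t, size=64):
    """Toy stand-in for a game renderer: a bright square moving fast."""
    frame = np.zeros((size, size))
    x = int((t * 200) % (size - 8))          # fast horizontal motion
    frame[28:36, x:x + 8] = 1.0
    return frame

def motion_blurred_frame(t, frame_time=1 / 24, subframes=4):
    # Brute-force temporal anti-aliasing: render several instants inside
    # one frame interval and average them. This costs roughly `subframes`
    # times the work of one sharp frame, which is why most games skip it.
    samples = [render_scene(t + i * frame_time / subframes)
               for i in range(subframes)]
    return np.mean(samples, axis=0)

sharp, blurred = render_scene(0.0), motion_blurred_frame(0.0)
print("smear width, sharp:  ", int((sharp.max(axis=0) > 0).sum()))    # 8 px
print("smear width, blurred:", int((blurred.max(axis=0) > 0).sum()))  # 14 px
```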

The other thing that’s going on is that film cameramen tend to avoid the sorts of camera movements that look bad at 24 fps. You don’t tend to see a lot of very fast pans for example. But in a videogame the camera is under player control and there’s nothing to prevent them from executing the sorts of fast rotations that really look bad at low framerates.

Pochacco, I had never thought about that before, but of course it makes perfect sense. Thank you for that illuminating tidbit. :slight_smile:

Another factor is that by the time things are dragging so badly as to reduce a game’s framerate to ~20fps, the responsiveness will probably be suffering as well, which can make the motion of things even jerkier.

Movies are not shown in 24fps; they are typically shown in 48fps, with each frame shown twice.

See here for info.

The shutter is typically two-bladed, covering the light source twice during a full revolution. The projectors I worked with in my projectionist days did exactly that, and if you rotated the motor by hand (quite important for checking the framing after threading), you could observe this.

One factor that I imagine has an effect on apparent quality in grainy film: the film grains are in different random places in each frame, meaning that a rapid succession of frames will have totally random grain filling in the gaps left by the previous frame’s random grain, making a much smoother picture than one formed by an array of fixed pixels.
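
A toy simulation of that idea (my own made-up numbers, purely illustrative): grain that is random per frame averages away as the eye integrates successive frames, while an error stuck to fixed pixels does not.

```python
import numpy as np

rng = np.random.default_rng(0)
frames, size, noise = 24, 256, 0.1    # made-up values for illustration

# Film grain: an independent random field in every frame.
grain = rng.normal(0, noise, (frames, size, size))
print("grain left after averaging 24 frames: %.3f" % grain.mean(axis=0).std())
# -> ~0.020, i.e. 0.1 / sqrt(24): the grain largely cancels out.

# Fixed-pixel error: the identical pattern in every frame.
fixed = rng.normal(0, noise, (size, size))
stack = np.broadcast_to(fixed, (frames, size, size))
print("fixed pattern after averaging 24 frames: %.3f" % stack.mean(axis=0).std())
# -> ~0.100: averaging identical frames changes nothing.
```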

I agree with Pochacco that the “smearing” has a lot to do with it as well.

Which decreases the flicker as perceived by humans but does nothing for motion smear or the lack of it. It’s the same frame that is projected twice each time; even the grain is the same.

What I’m saying is the double-projection isn’t a factor in this discussion, at least in the film/video comparison.

That’s a bit disingenuous. There are 24 unique images per second. Modern projectors cycle the shutter twice on each one to make the movie look a bit better (by reducing flicker). But that doesn’t mean it’s 48fps.
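
The arithmetic, for anyone skimming (a trivial sketch of the standard setup described in this thread):

```python
# Flashes vs. unique images for a standard 35mm projector with a
# two-bladed shutter.
unique_frames_per_second = 24
blades = 2

flashes_per_second = unique_frames_per_second * blades
print(f"{unique_frames_per_second} unique images/s, {flashes_per_second} flashes/s")
# -> 24 unique images/s, 48 flashes/s. Motion is still sampled only
#    24 times a second, however many times each sample is flashed.
```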

A good article on the subject (although without any cites, and I have no idea what the author’s qualifications are).

As I recall, that was the insight behind Ray Harryhausen’s Dynamation stop-motion process: he added a bit of blur so it looked more realistic.

I’m gonna tack on this question because the thread title is perfect. There is a currently running commercial for a Vizio HD television where the main selling point is that it displays 120 fps versus the customary 60 fps. Is an HDTV signal really meant to be displayed at 120 fps, or does this unit simply display the same frame twice?

Musicat and friedo, go ahead and watch your films in true 24fps if you don’t feel it’s a factor.

Musicat, I don’t know if this is relevant or not to the discussion, but just as you nitpicked me for implying 48 true frames per second, I was nitpicking that the shutter opens 48 times per second, not 24.

But isn’t the threshold of human perception around 1/26 sec.? That’s why movies work in the first place. Why do people perceive quality differences in frame rates faster than that?

Important to remember that 60 fps in games can also still mean you get stuttering due to loading data or the like, e.g. a 0.1-second delay between two frames, with the other 0.9 of the second showing 54 frames.

So even though that’s 55 fps for the entire second, you have a period of that second where it’s the equivalent of 10 fps.

Games often aren’t uniformly smooth, so fps isn’t always a good picture of how fluid the view really is.
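
Put as a quick sketch (same numbers as above):

```python
# 54 evenly spaced frames in 0.9 s, then one 0.1 s hitch.
frame_times = [0.9 / 54] * 54 + [0.1]            # seconds per frame

average_fps = len(frame_times) / sum(frame_times)
worst_fps = 1 / max(frame_times)

print(f"average over the second: {average_fps:.0f} fps")   # -> 55 fps
print(f"worst single frame:      {worst_fps:.0f} fps")     # -> 10 fps
```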

Otara

Who said it isn’t a factor?

You made an incorrect statement that movies are projected at 48fps. Musicat and I pointed out, politely, that this is an inaccurate description of what is actually happening. Now I will point out somewhat less politely that you are simply wrong, and are now arguing an irrelevant point instead of admitting that “48fps” is an incorrect and entirely misleading description.

It’s even worse because there are some film formats that do project at true 48fps, such as some 70mm formats. And there are (rare) 35mm projectors that have triple shutters. A projectionist would look at you like you were nuts if you told him such a thing projects 72fps.

The point of GQ is to answer questions factually; you shouldn’t take it personally if someone corrects an error or points out a clearer way of explaining something.

I felt that I was answering this question well: “Why is it that a film at 24 FPS is convincingly fluid but a game must be at a minimum 60 FPS or it bugs me?”

I still firmly believe that the double shutter helps in making film seem convincingly fluid even when it is truly shot at 24fps. I seem to be having trouble putting this down in words, however, as you both have kindly pointed out.

It is not irrelevant to the discussion IMHO, and is in the spirit of GQ.

And about those other formats – do I need to cover every strange format ever produced to satisfy the goals of GQ?

Perhaps we need someone who is better versed in optical phenomena than I am to clarify, but it seems to me that fluidity or smoothness of motion would not be perceived differently if the length of each image’s projection time were halved and the number of times each frame is displayed were doubled.

I believe the double shutter mechanism is intended to reduce flicker, which I have no doubt it does. But it has no effect on perceived motion.

Let’s try it another way. Cartoons are often created with every 4 frames identical, to save drawing or rendering time. They often appear jerky to some, including me. Do you claim that projecting 8 frames of the same image in the same time as 4 would make any motion between different, adjacent frames appear smoother? And if so, how? What human optical or brain property would make this possible?
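
Here’s my example reduced to numbers (a toy framing with made-up positions, not anyone’s actual cartoon workflow):

```python
# A cartoon animated "on fours": one new drawing per 4 projected frames,
# so 6 drawings per second at a 24 fps projector rate.
drawings = [0, 10, 20, 30, 40, 50]               # toy positions, 1 s of motion

projected_24 = [p for p in drawings for _ in range(4)]    # 24 fps, on fours
flashed_48 = [p for p in projected_24 for _ in range(2)]  # double shutter

print(sorted(set(projected_24)) == drawings)     # True
print(sorted(set(flashed_48)) == drawings)       # True
# Doubling the flashes cuts flicker, but motion is still sampled just
# 6 times per second; the jumps between positions are exactly as big.
```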

There is also the fact that in a video game the frame rate is seldom smooth. In the OP’s hypothetical game at 24 FPS, the actual frame rate is likely to fluctuate between, say, 20 and 28. This only adds to the choppy, stuttery feel of a game struggling at that low frame rate.

No, I definitely am not claiming that, and your example does provide a way of cleanly separating the flicker from … everything else (fluidity?).

I figured flicker did play a part based on the OP: “And I can tell, without fail, that 24 FPS in a game looks like crap. Choppy, distracting, awful. Heck, I can usually see when a game drops below 60 FPS - and I hate it.”
Perhaps just a minor role.

Extending your example: do those cartoons look smoother than the 24 FPS game or not? If I understand what you said, the cartoons would be a true 6 FPS (1/4 of 24), so they should look totally awful even in comparison with the 24 FPS game.
If they still look smoother, then I imagine they are a restatement of the OP’s case in somewhat amplified terms (film with lower FPS still looks better, regardless of whether it’s 6 or 24).

Probing the cartoon example (why they look bad but not terrible) might give insight into the answer to the game/movie question.
Which well-known cartoons can we use as reference?

I think that’s probably true. The reduction in flicker probably has the side effect of improving the illusion of motion.

I agree.

Of course not; I just wanted to make it clear that frame rate and the shutter operation are two completely different things, which both work together to contribute to the overall look and feel of the projection. The only reason I pointed out the more obscure formats was to make the point that there really are systems that show 48 actual frames per second, and this is much different than 24 frames per second with a double shutter.

Now if you really want a nitpicky discussion, just start asking questions about the word “anamorphic.” :smiley:

And of the hundreds of films I showed in my projectionist career, every single one of them was 24fps with double shutter (though the double shutter was totally related to our Century projectors and not the film).

I occasionally heard an oldtimer telling about strange formats from a bygone era where they had to change the gearing (IIRC one fellow said “Oklahoma” was such a film).

I haven’t been in the business for years, definitely before digital came on the scene, but I imagine that the hardware for standard Hollywood films is pretty much the same.