Why do television and movies run at 24 frames per second, but games run at higher fps?

Some games run as high as 60 frames per second, but we’re still not quite at film-quality games. Television and movies run at 24 frames per second, yet a lot of games would be considered subpar at that rate. What’s the difference?

Motion blur

Rysto is, of course, entirely right. But it’s a good question.

What you experience when you turn your head quickly is motion blur. Since the lens in a film camera is also dependent on the same kind of light-to-sensor activity to make out something clearly, it will behave in the same way. That is to say, the motion will be blurred until we can refocus.

Video game “cameras” don’t have this property (at least, not until recently - see for instance Mass Effect), nor the property of focus changes. So everything is rendered crisply, which means that things will appear very unnatural and far from smooth if run at only 24 frames per second. A higher frame rate will smooth out the motion, and newer games are compensating for the lack of “sympathetic sight effects,” such as motion blur, focus ranges, light dilation, colour confusion and so on.
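To illustrate the idea (and this is only a rough sketch of the crudest possible approach, not how any actual engine does it; the function name, the decay parameter and the use of numpy arrays are all my own invention here): the simplest way to fake motion blur is to blend the last few rendered frames together, so anything that moved between frames smears out.

```python
import numpy as np

def accumulate_motion_blur(frames, decay=0.6):
    """Blend recent frames so fast-moving objects smear, crudely imitating
    the exposure window of a film camera.

    frames: list of same-shaped image arrays (H x W x 3), oldest first.
    decay:  how quickly older frames fade out of the blend.
    """
    blurred = np.zeros(frames[0].shape, dtype=np.float64)
    total_weight = 0.0
    weight = 1.0
    # The newest frame gets the most weight; older frames contribute less.
    for frame in reversed(frames):
        blurred += weight * frame.astype(np.float64)
        total_weight += weight
        weight *= decay
    return (blurred / total_weight).astype(frames[0].dtype)
```

As far as I know, modern engines do something smarter than this (per-pixel blur driven by a velocity buffer, so only the things that actually moved get smeared), but the underlying idea is the same: simulate an exposure window instead of a perfectly crisp instantaneous snapshot.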

The chief reason - today - that we are not seeing film-like video games is not the resolution, the frame rate or the object detail of the game diegesis, but rather that until recently video game cameras have lacked the properties that make them feel like an eye. Somewhat older games - like Company of Heroes - let actions in the diegesis, in the world, affect the camera “lens.” If you were close enough to see a tank shell explode in the mud, it would spatter the camera and the reverberations would make the camera shake. Bullets would kick dust off the ground, and it would leave a temporary film on the camera lens.

Other games have effects for when you emerge from water: the water drips off the lens, and so forth. I expect - and this is of course merely my opinion - that when the video game camera is made to operate more like a film camera, games will become far more immersive. I also think this is an area that’s been regrettably underdeveloped in every genre outside the FPS, and even in modern FPS games it’s underestimated. (This pet peeve made Gears of War all the more dear to me.)

Running Mass Effect at more than 24 FPS—Ha!

Valete,
Vox Imperatoris

Or, in the case of my poor little notebook, running damn near anything at more than 24 FPS—Ha! H-ha… :frowning:

Actually, television runs at 30 frames per second.

Am I the only one who can’t tell worth a damn what frame rate a game’s running at, so long as it’s 30+ (to wit, non-choppy)? I mean, I can certainly tell when a game that generally runs at 60fps drops to 30fps for certain portions (such as when playing split-screen), but had the game run at 30fps instead the entire time, I would have never known.

I almost feel like it’s a waste of processing power. I just wish developers would opt for a lower framerate if it meant more consistency. I hate slowdown with a passion and would rather not be distracted by it.

Thanks for the responses, Gukumatz and Rysto; that clears things up a bit. I’ve wondered about this for a while now.

In Europe. In the US and Japan, it runs at 29.97 frames per second.

No, the 30 is a rounding of the NTSC standard.

PAL and SECAM (the two standards used in Europe, Australia, Africa, most of Asia, and chunks of South America) both come to about 25 FPS.
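For the curious, the exact numbers behind those figures (just a quick arithmetic sketch; the variable names are mine):

```python
# The "30 fps" figure for US/Japanese TV is a rounding of the NTSC rate,
# which is exactly 30000/1001 frames per second. PAL and SECAM use 25.
ntsc_fps = 30000 / 1001
pal_fps = 25
print(f"NTSC: {ntsc_fps:.3f} fps")    # NTSC: 29.970 fps
print(f"PAL/SECAM: {pal_fps} fps")    # PAL/SECAM: 25 fps
```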