Pointless video cards?

Lately I've been seeing people talk about video cards that can do 240 or even 300+ frames per second. Is there any point to this? It seems to me that, first of all, since the human eye is only capable of perceiving about 60 fps, a frame rate much over 80 is completely wasted. And even if the human eye does get some benefit out of that high a frame rate, I've never seen a monitor that can refresh at much more than 90 hertz at high resolution anyway, which makes those frame rates double-wasted.

So my question is this: am I missing something here? Are they lying about frame rates, or using some strange measurement so that these numbers actually reflect a discernible difference? Or are the video card companies simply hoping that people will spend money to upgrade to something with a higher number, even though that number has no benefit in the real world?

Video cards can paint the screen more frequently than the display refreshes because "painting" only means writing to video memory. In practice it usually means writing to a second (back) buffer and then swapping the back buffer with the front buffer, so the display never shows a half-drawn frame.
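The back-buffer idea can be sketched in a few lines. This is a hypothetical illustration, not real graphics code: the buffers are plain lists standing in for video memory, and the "swap" is just exchanging which buffer is considered visible.

```python
# Sketch of double buffering: the card draws into a back buffer while
# the display reads from the front buffer; swapping makes the finished
# frame visible all at once.

WIDTH, HEIGHT = 4, 3

front = [0] * (WIDTH * HEIGHT)  # what the monitor scans out
back = [0] * (WIDTH * HEIGHT)   # what the card renders into

def render_frame(buf, color):
    """Draw a whole frame off-screen; the visible image is untouched."""
    for i in range(len(buf)):
        buf[i] = color

def swap_buffers():
    """Exchange front and back; the new frame appears in one step."""
    global front, back
    front, back = back, front

render_frame(back, 1)  # frame 1 drawn off-screen
swap_buffers()         # frame 1 now visible
render_frame(back, 2)  # frame 2 drawn while frame 1 is displayed
print(front[0], back[0])  # → 1 2
```

Nothing stops `render_frame` from being called far more often than the display refreshes; extra frames simply get overwritten in the back buffer before they are ever shown.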

The reason FPS is quoted so much is that it's a benchmark for raw performance (typically in games). If you're playing Quake 3 or something like that, and one video card updates at 60 FPS while another does 120 FPS, you know that the second card is more powerful, and that it will outperform the first on more complex scenes. And applications will eat up this power as quickly as it arrives, I guarantee. So a card might do 120 FPS now, but later manage "only" 45 FPS in a newer game (acceptable), whereas the first card might only do 20 FPS (not acceptable).
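The arithmetic behind that comparison is simple: FPS is just the reciprocal of the average per-frame render time. The numbers below are made up for illustration, but they show how a benchmark FPS figure is derived and why the faster card keeps more headroom for heavier scenes.

```python
# Hypothetical per-frame render times (milliseconds) for two cards
# running the same scene. FPS = 1000 / average frame time in ms.

card_a_ms = [16.7, 16.7, 16.7, 16.7, 16.7]  # slower card
card_b_ms = [8.3, 8.3, 8.3, 8.3, 8.3]       # faster card

def average_fps(frame_times_ms):
    """Convert a list of frame times into an average frames-per-second."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

print(round(average_fps(card_a_ms)))  # → 60
print(round(average_fps(card_b_ms)))  # → 120
```

If a newer game triples the per-frame work, the same arithmetic drops the slower card to roughly 20 FPS while the faster one still holds around 40, which is exactly the headroom argument above.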