Lately I've been seeing people talk about video cards that can do 240 or even 300+ frames per second. Is there any point to this? It seems to me that, first of all, since the human eye is supposedly only capable of perceiving about 60 fps, a frame rate much over 80 is completely wasted. And even if the human eye does get some benefit out of that high a frame rate, I've never seen a monitor that can refresh at much more than 90 hertz at high resolution anyway. Which makes the frame rates double-wasted.
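Just to put some numbers on it, here's a quick back-of-the-envelope sketch (plain Python, nothing card-specific) of how much time each frame actually gets at these rates:

```python
# Time per frame, in milliseconds, at a few quoted frame rates.
for fps in (60, 90, 240, 300):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

At 60 fps a frame is on screen for about 16.7 ms; at 240 fps it's only about 4.2 ms. So the question is whether that extra slice of time buys anyone anything.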
So my question is this: am I missing something here? Are they lying about frame rates, or using some strange measurement, such that these ranges actually do make a discernible difference? Or are the video card companies simply hoping that people will spend money to upgrade to something with a higher number, even though that number has no benefit in the real world?