What Frame Rate Can Our Eyes Perceive?

I’m not sure quite how to phrase my question, but I’ll try;

What frame rate does “real life” run at?
Or, if you will, how often do our eyes “refresh” the images they see?

If this could be expressed as a “frame rate” of x frames per second, what rate would that be?

Your eyes don’t work that way. For years, it was believed that they did, but this theory has been pretty much debunked. The closest analogue is how good the human eye is at detecting very brief images or flickers, and this threshold changes based on lighting, where the image hits the retina, and simple genetics. There’s a Wikipedia page on this: the “flicker fusion threshold.”

It’s your brain that does the seeing and not the eyes. Your pupils react to light intensity and adjust accordingly.

There is a definite latency between the time an image registers in the eyes, travels up the optic nerve, and is finally perceived by the brain. IIRC, there is a story stating that if two fighter aircraft approach each other on a collision course, then at certain speeds and distances the aircraft will collide before the image travels from the eyes to the brain. The pilots will never see what hits them.

If you want a simple real-world measure, some people can see CRT monitors flicker at 60 Hz while some can’t. In my IT job at a huge company, I have to go and visit people every day. I always point it out to people when they have their monitors set to 60 Hz as opposed to 70 or above. It looks like very distracting flashing to me and to others who can tell instantly. Many of them can’t tell the difference even when I switch it back and forth. Oh well, I can’t hear those really high-pitched sounds some people complain about.

I don’t think latency is the same as frame rate, however. If the jet fighters didn’t hit each other, the pilot would see the jet all the way in; he would just react after it had passed.

I suspect what some people are seeing, and some aren’t, isn’t the 60 Hz CRT itself, but the beat frequency between the CRT and the fluorescent lighting in a typical office.
The critical flicker fusion threshold (CFFT), the lowest rate of continuous flicker that is perceived as a steady source of light, is typically about 20 Hz; see fig. 2. You can train yourself to do a little better than that, but 60 Hz is awfully darn fast for human flicker perception.
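The beat-frequency idea is just arithmetic on the two flicker rates. Here’s a minimal sketch; the specific rates (a 120 Hz magnetic-ballast lamp, a CRT at 66 Hz, a residual 60 Hz mains component) are illustrative assumptions, not figures from this thread:

```python
# Sketch: the visible "beat" between two flickering light sources is the
# difference of their rates, |f1 - f2|. All numbers below are illustrative
# assumptions, not measurements from the thread.

def beat_frequency(f1_hz: float, f2_hz: float) -> float:
    """Apparent beat rate between two periodic light sources, in Hz."""
    return abs(f1_hz - f2_hz)

# A 60 Hz CRT against a lamp flickering at 120 Hz (twice the US mains
# frequency) lands exactly on a harmonic, so the beat is itself fast:
print(beat_frequency(120.0, 60.0))   # 60.0

# But a CRT set to 66 Hz against a hypothetical residual 60 Hz mains
# component would beat at 6 Hz, well under the ~20 Hz fusion threshold:
print(beat_frequency(66.0, 60.0))    # 6.0
```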

Do incandescent lights flicker? Back when I had a CRT at home, I could see the difference between 60 Hz and 80 Hz, and we don’t have fluorescent lights.


Nope; I can see it even if the CRT is the only light source; in fact, it’s worse that way. Remember that this isn’t just a single, monochrome flashing light; it’s a detailed, continuously refreshed, often animated image, and on a surface which covers much of our angle of vision, too.

Your CRT (you do still have a CRT around, right? Check a museum) may not be able to go this low, but if you can set it to 30 or 40 Hz, I guarantee you that EVERYONE will be able to see the flicker, well above the point-source single-light 20 Hz threshold described.

It would be interesting to cover all but a few pixels of a CRT screen and see if the apparent flicker went away – I suspect it would, but don’t have a CRT to try it on any more.

In my graphics classes (12-17 years ago), they always told us 80-90 Hz was the threshold at which you could be pretty sure that even sensitive individuals wouldn’t see flicker any more on the screens of the day. Note that this is CRT-related; LCDs use a different (non-refresh-based) technology and are usually limited to 60 or 70 Hz. Even so, I can see “strobing” flicker in highly animated 60 Hz scenes (video games…uh, research) if the LCD is good enough that ghosting doesn’t cover it up.

No, incandescent lights do not usually flicker. They produce light by making a wire so hot that it glows. On AC current the power delivered to the filament dips 120 times per second, but that isn’t enough time for the wire to cool down and stop glowing.
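The “120 times per second” figure follows directly from the mains frequency: on AC, the instantaneous power heating the filament is proportional to sin²(2πft), which oscillates at twice f. A quick sketch of that identity:

```python
import math

MAINS_HZ = 60.0  # US mains frequency

# sin^2(x) = (1 - cos(2x)) / 2, so the power ripple runs at twice the
# mains frequency: the filament is heated on both halves of each AC cycle.
ripple_hz = 2 * MAINS_HZ
print(ripple_hz)  # 120.0

# Numerical check of the identity at an arbitrary instant:
t = 0.00123
x = 2 * math.pi * MAINS_HZ * t
assert abs(math.sin(x) ** 2 - (1 - math.cos(2 * x)) / 2) < 1e-12
```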

I can still see the difference between 60 Hz and 80 Hz as well. That doesn’t mean that 70 Hz necessarily looks jerky, and I can’t pick out individual frames. But it just doesn’t seem as smooth to me, and it’s uncomfortable to look at. I found that eye strain and headaches decrease a lot when I increase the refresh rate.

It’s not inherently the frame rate or else people wouldn’t be able to see movies.

Movies are shot at 24 frames per second, but the shutter opens and closes twice while each frame sits in front of it. So the flash rate of movies is 48 flashes per second. No one sees a flicker there.

Television and video sources in the US operate at only 30 frames per second. Movies never look quite as good on television, but in interlaced operation you get a half-frame every 60th of a second to reduce flicker.

Even so, 48 or 60 flashes per second are more than enough to keep you from seeing the frames change.

Most estimates of the persistence of vision place it at 1/20th of a second. That’s what allows movies to work.

So a refresh rate of 60 Hz or more should never be distinguishable as separate frames. Some other effect is operating.
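The arithmetic behind those figures is easy to lay out explicitly, using the rates as stated in the posts above:

```python
# Film: shot at 24 fps, but the projector shutter shows each frame twice.
FILM_FPS = 24
FLASHES_PER_FRAME = 2
film_flash_rate = FILM_FPS * FLASHES_PER_FRAME   # flashes per second

# US television: 30 fps, interlaced into two half-frame fields per frame.
TV_FPS = 30
FIELDS_PER_FRAME = 2
tv_field_rate = TV_FPS * FIELDS_PER_FRAME        # fields per second

# Persistence of vision of ~1/20 s implies a ~20 fps floor for smooth motion.
PERSISTENCE_S = 1 / 20
min_smooth_fps = 1 / PERSISTENCE_S

print(film_flash_rate, tv_field_rate, min_smooth_fps)  # 48 60 20.0
```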

High-frequency flickering (e.g. 60 Hz) probably messes with saccadic masking of the visual signal. As you can demonstrate by waving your fingers in front of a monitor, you’re effectively staring at a strobe light. So when your visual system starts filtering signals based on time and frequency (during saccades), it noticeably picks up different parts of the screen image at different points in the flicker cycle.

I was going to post exactly this, I think, but in smaller words.

A friend of mine walked in on me doing experiments with lights and framerates and so on (he could tell the difference between 60 and 80 Hz in a completely darkened room as well, BTW) and suggested it has something to do with eye movements. Sounded plausible to me, though in this context that’s not saying much.


This isn’t strictly true. The nerve cells of the retina (which, despite being part of the eye, are central nervous system tissue) do a considerable amount of preprocessing and filtering before the nerve impulses are sent rearward to the brain. The brain doesn’t have a single cortex involved in visual processing; rather, there are several zones distributed around the rear part of the brain that process the various types of stimuli emitted by the retina. The primary visual cortex is in the very rearmost region of the brain, which is why a smack to the back of the head can often cause temporary problems with vision (thanks again for that Intelligent Design, God), but image processing occurs roughly from rear to front along two general paths called the ventral and dorsal streams. These process and relate information in very different ways, such that damage to one area may cause little or no impairment in another; for instance, someone who suffers cranial trauma or a stroke may be able to perfectly identify parts of a face, but not be able to connect the person with the face or, indeed, even recognize it as a face. Oliver Sacks’ essay “The Man Who Mistook His Wife for a Hat” (which can be found in the book of the same name) details a patient who suffered neurological damage and, despite being perfectly able to see objects, was incapable of connecting them to concepts in his conscious mind.

At any rate, different types of visual information (motion, straight lines and simple features, colors, complicated shapes, et cetera) are processed at different rates before becoming available as a conscious “image”. The rate at which a person can recognize change depends on how bright the image is and how much “change” in the image occurs from frame to frame. The 1/20th of a second quoted by Squink is about the lower bound on frame duration for motion to appear continuous. The flash rate for film, as Exapno Mapcase states, is 48 “frames” per second: each frame is doubled, which serves to stabilize the image without oversaturating the retina, giving the appearance of flowing, continuous motion in most cases, save for when the camera is panning rapidly, which results in a jerky, jagged image. (The Maxivision 48 format, which has a true 48 fps frame rate and active image stabilization, allegedly removes any sensation of discrete frames.)

It’s unlikely that you’d be able to distinguish individual frames of objects at a frame rate of 60 Hz or above; this is just much faster than your brain can consciously integrate image data into the mental concept of an object. However, you can certainly see phenomena that occur at higher frame rates, although you may not be able to perceive them discretely. This is especially true with CRT monitors, because you’ll typically sit close enough that the edge of the monitor may be in your peripheral vision, where sensitivity to change and light is highest, even though you can’t resolve specific images. (The advantage of rapid sensation of motion at the periphery of vision in nature should be obvious, while the need to rapidly resolve motion that is in direct view is less important than forming a more comprehensive conception of objects and activity.) The effective rate of an interference pattern between a CRT and a fluorescent light, or your hand waving across the screen, may also be a fraction of the two underlying frequencies.

So, there’s no simple answer to the OP’s question, because the vertebrate eye just doesn’t work like a camera or film projector; it’s vastly more sophisticated than anything we can construct, much to the lament of machine vision researchers.


The ability to perceive motion through peripheral vision is not one most people think about. I have to admit that it never made an impression on me until I visited the Exploratorium in San Francisco. There was an exhibit using this effect which allowed you to discern movement solely through peripheral vision that wasn’t noticeable at all otherwise. I’ve never forgotten it. And I always notice it now.

Do you think directors and film editors are able to perceive higher frame rates than, say, an average movie-goer?


I have worked in aspects of the motion picture industry for almost 30 years, and IME, high frame rates in motion pictures are perceived differently from refresh rates of CRT video screens.

Like other posters here, I can see a bit of a flicker on computer monitors at rates below about 70 Hz. On the other hand, I’ve seen 60 fps motion pictures (i.e. Showscan) and did not have the impression of a flicker, nor do I with regular movies, which, as Exapno pointed out, are shot at 24 frames per second but projected at 48 flashes per second. This no doubt has to do with the different ways in which the two methods operate.

However, other effects are noticeable in motion pictures, particularly to directors and other expert observers. Specifically, strobing, which is most obvious to ordinary viewers when wheels appear to move backwards, but which also happens with many other forms of motion. Strobing is caused or exacerbated by showing each frame twice, and happens when the motion of objects across the screen (or of the entire image, in the case of pans) is so large that it appears discontinuous. It’s a sort of jerkiness that many people don’t notice consciously, but which can become painfully obvious in poorly shot scenes once you’ve become aware of it.

Strobing is especially noticeable on larger screens, as in IMAX theaters, my particular field. As a result, IMAX filmmakers take special precautions to control their camera motions to avoid strobing.

Raising the frame rate (as opposed to the flash rate) reduces strobing, because even in fast moving scenes the jump between successive frames is relatively small.
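That point, that only the true frame rate sets the jump between successive images, can be sketched numerically. The pan speed and image width below are hypothetical numbers chosen for illustration, not figures from this post:

```python
def jump_per_frame(pan_speed_px_per_s: float, frame_rate_fps: float) -> float:
    """Distance (pixels) an object moves between successive *frames*.
    Re-flashing the same frame (a higher flash rate) doesn't change this."""
    return pan_speed_px_per_s / frame_rate_fps

# Hypothetical fast pan: crossing a 2400 px wide image in one second.
pan_speed = 2400.0  # px/s

print(jump_per_frame(pan_speed, 24))  # 100.0 px jump at the standard film rate
print(jump_per_frame(pan_speed, 60))  # 40.0 px jump at the Showscan rate
```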

Special effects whiz Douglas Trumbull experimented with many high frame rates before settling on 60 fps for the Showscan system he invented. He told me that some strobing effects that were still visible at 60 fps were pretty much gone somewhere between 70 and 80 fps. He would have liked to run Showscan at 72 fps, but there were too many advantages to 60 fps: since it’s the same frequency as U.S. mains power, it’s easier to run motors and other components at that rate. Also, it matches perfectly with HDTV.

(FYI, Showscan and other film systems that run above 30 fps flash each frame only once.)

Maybe the eye can’t tell that anything above 60 Hz is “flickering,” but I can definitely tell the difference between 60, 80, 100, and 120 Hz on my high-end CRT monitors. It has a lot to do with the projection technology, and the higher the better: the higher the refresh rate, the lower the eye strain and headache (even if they previously went unnoticed).

A recent article (April 2007, “Movie in Our Eyes”) in SciAm described research on rabbit vision. They found that the rabbit eye passes about 12 different independent images from the eyes to the brain, each one sensitive to a different type of information (shadows, motion, lines, etc.). It’s a pretty interesting article about vision processing.