I’ve been wondering about this for a while, and the question came up today in the office: when you videotape a computer monitor, why do you see the lines from the refreshing? And if a videotape can see them, why can’t I? I hope somebody knows, cuz I’m fair puzzled.
It has to do with scan rates. A computer monitor does not refresh its screen at the same rate that a television screen does, so the television camera catches part of the blanked screen every time it captures a frame. Your eyes do not work the same way, so you ignore the flicker.
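A rough way to see this is to work out where the CRT’s scanning beam is, as a fraction of one refresh cycle, at the instant each camera frame is captured. The 60 Hz and 59 fps figures below are just illustrative assumptions, not any particular camera or monitor:

```python
# Assumed, illustrative rates: a CRT refreshing at 60 Hz, a camera shooting 59 fps.
CRT_HZ = 60.0
CAM_FPS = 59.0

def beam_phase(frame):
    """Fraction of the refresh cycle the beam has completed
    at the moment camera frame number `frame` is captured."""
    t = frame / CAM_FPS          # time of this camera frame, in seconds
    return (t * CRT_HZ) % 1.0    # position within the current refresh cycle

# Each successive camera frame catches the beam a little farther along,
# so the dark partially-drawn band appears to crawl across the image.
phases = [round(beam_phase(n), 3) for n in range(4)]
```

Because the two rates don’t divide evenly, the phase never repeats frame to frame, which is exactly the moving band you see on tape.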
When you see a line-free computer monitor in a movie or television show, that is a prop monitor running at a refresh rate chosen to sync with the TV or movie camera.
This is also called aliasing. It’s the same thing that makes the wheels on your car or on a wagon look like they are moving backwards when you know this isn’t possible given the direction the car is moving. Your eye can only sample at a certain rate. If the wheel turns 359 degrees between samples, it will look like it moved 1 degree in the opposite direction, based on the two samples that your eye sends to your brain.
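That 359-degree case is easy to check in a couple of lines of Python (a sketch of the arithmetic only, not a model of any real camera or eye): map the frame-to-frame rotation into the range −180° to +180°, which is the smallest motion consistent with the two samples.

```python
def apparent_rotation(deg_per_frame):
    """Smallest rotation (in degrees, between -180 and +180) consistent
    with what a frame-based sampler sees between successive frames."""
    return (deg_per_frame + 180) % 360 - 180

print(apparent_rotation(359))  # -1: looks like it rolled 1 degree backward
print(apparent_rotation(10))   # 10: slow rotation is perceived correctly
print(apparent_rotation(350))  # -10: near-full turns look like slow reverse
```

Anything that turns more than half a revolution per frame gets “folded” into an apparent backward motion, which is the wagon-wheel effect.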
In this case the two devices scan at slightly different rates, which makes it look like a stripe is moving down the screen. To fool your senses you typically need to sample at twice the rate of the highest frequency your senses can perceive. I forget what that figure is for the human eye, but since monitors scan at 60 Hz I would assume the human eye is good up to about 30 Hz or so.
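The stripe’s drift speed is just the difference (the beat) between the two rates. A quick sketch with assumed numbers, here a 60 Hz monitor filmed by a 59.94 fps camera:

```python
MONITOR_HZ = 60.0    # assumed CRT refresh rate
CAMERA_FPS = 59.94   # assumed NTSC-style camera frame rate

beat_hz = abs(MONITOR_HZ - CAMERA_FPS)  # how many times/sec the stripe cycles
seconds_per_pass = 1.0 / beat_hz        # time for one full trip down the screen
```

With these numbers the stripe takes roughly 17 seconds to crawl once down the screen; the closer the two rates are, the slower it drifts, and if they matched exactly it would stand still.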
Sorry, but the first sentence of the second paragraph is the only one in this post that is unequivocally correct. Voluble seems to have confused principles of digital audio sampling with visual media to which they do not apply. The only time that wheels appear to go backward is on film or TV, when the frame rate interacts with the wheel’s rotation rate. In real life it can only happen in the presence of a light source (e.g. a fluorescent fixture) that strobes on and off.
The eye does not “sample” at any rate. You are probably thinking of the phenomenon called “persistence of vision,” by which images that change more rapidly than about 12 to 15 times per second create the appearance of smooth motion. But this is not “sampling.” The eye can perceive flashes much briefer than a thousandth of a second. The first silent movies ran at about 16-18 fps, which was enough to create the impression of motion, but from that day to this, each frame is projected at least twice to reduce the amount of perceived flicker.
But you can experience the effect a little yourself by clamping a plastic fork, knife, or spoon in your teeth and flicking it while you look at a television set or computer monitor (if it’s the CRT type). Your eyes work as they always do, but the vibration of the utensil aims them up and down slightly, and if the speeds sync or almost sync you will see waviness in the image.
I think (though I’m not certain) that thin-panel LCD screens don’t exhibit this, because each pixel stays lit continuously. The pixel values get updated at 60 Hz (or whatever rate you set), but unlike on a cathode ray tube, a pixel’s brightness holds steady when the image is not changing. An LCD might show the effect a little while the image is changing brightness, such as during a rapid fade, but it would be hard to see. That’s why monitor flicker is a non-issue with LCD monitors.