Why does a computer screen roll and flicker when videotaped?

Okay, I’ll be the first to admit that I don’t know a damn thing about how TVs and monitors work, so if this is a stupid question, feel free to let me know.

I do know that there is some kind of camera filter available that will prevent the problems, but I want to know why this phenomenon occurs in the first place. How can a video camera see something that is obviously not there?

The information on a video screen is actually being repainted a gazillion times a second. Depending on how fast the phosphors lose their glow, if you capture an image it might be blank. If you capture a series of images (such as with another video camera), depending on the timing, they might all be blank, or they might seem to be scrolling slowly up or down.

This is the same phenomenon that made wagon wheels appear to turn backwards in westerns.

In a nutshell, the phenomenon you describe is caused by the refresh rate of the monitor being out of sync with the frame rate of the video camera. The screen, IIRC, refreshes from top to bottom, and there is a scan line that moves down the screen at, say, 75 cycles per second, or whatever the refresh rate of the monitor is (check your control panel). This is too fast for your eye to pick up (though the lower the refresh rate, the likelier you are to see a little flickering, and this also depends on the way you’re looking at the monitor). But if the frame rate of the video camera is out of phase with the refresh rate, it will pick up the scan line. Sometimes I see this on the news when they show a classroom full of computers and they’re all flickering. A similar effect can be observed when you look at a turning wheel under a strobe light (or even the spinning hubcap of a car if conditions are right) -- you can sometimes observe the wheel appearing to turn more slowly than it really is, or even backwards.
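To put numbers on that (a rough sketch of my own, with assumed rates and function names, not anything the posters measured): only the fractional mismatch between the refresh rate and the nearest whole multiple of the camera’s frame rate survives as visible motion.

```python
# Rough sketch of the beat between monitor refresh and camera frame
# rate. All numbers here are illustrative assumptions.

def apparent_roll_hz(monitor_hz: float, camera_fps: float) -> float:
    """Apparent motion of the roll bar, in screen heights per second.

    The camera effectively strobes the monitor at camera_fps. A whole
    number of refreshes per frame is invisible; only the leftover
    fraction shows up as drift (the same aliasing math as the
    wagon-wheel effect).
    """
    ratio = monitor_hz / camera_fps
    slip = ratio - round(ratio)   # refresh cycles of drift per camera frame
    return slip * camera_fps      # screen heights per second

# A 60 Hz monitor shot at NTSC's ~29.97 fps: the bar creeps slowly.
print(apparent_roll_hz(60.0, 29.97))   # ~0.06 -> one full pass every ~17 s
# A 75 Hz monitor at the same frame rate lands near the fold, so it
# flickers violently instead of rolling -- the "classroom" shot.
print(apparent_roll_hz(75.0, 29.97))   # ~-14.9
```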

I hope this at least partially addresses your question.

Ah, that makes sense.

Cool, thanks y’all.

you get the same effect with still cameras. so how do you grab stills with film? simple. you do a longer exposure (1/15 of a second or slower) and your scanlines disappear. that way you don’t catch the image mid-scan.
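A quick back-of-the-envelope check of that exposure advice (my numbers, offered purely as illustration; the 60 Hz refresh is assumed): the shutter just has to stay open long enough for every scanline to get painted at least once.

```python
# How many complete repaints a still camera's shutter catches.
# Illustrative sketch with an assumed 60 Hz refresh rate.

def refreshes_per_exposure(monitor_hz: float, shutter_s: float) -> float:
    return monitor_hz * shutter_s

print(refreshes_per_exposure(60.0, 1 / 15))   # 4.0  -> bands average out
print(refreshes_per_exposure(60.0, 1 / 250))  # 0.24 -> caught mid-scan
```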

What Malden said. Your TV (and the TV cameras that film content for it) work at 24 frames/second. Computers are significantly higher but, more importantly, out of phase. 35? 50? Can’t remember the number. It’s effectively a strobe light.

hold on…

actually,

i’m a bit off. you asked about computer screens. they function a bit differently than TV screens (which is what my answer referred to; computer screens refresh quicker), so to photograph them you have to expose for 1/2 to 1 second to minimize the “banding” problem.

Additional info on flickering monitors and TVs, and what causes it, can be gleaned from this thread: TV on TV

Mind if I add some comments? More than you want to know, probably…

Ethilrist wrote:

A slight exaggeration. Refresh rates, depending on equipment and modes, can range from around 30 to 90 cycles per second (possibly higher).

When you record, your video camera also has a refresh rate that may or may not be at a similar frequency, but nevertheless will rarely be synchronized.

The reason we rarely notice this phenomenon is that even at 30 cycles per second (interlaced), a principle known as persistence of vision masks it out. When photons strike our retinas, the nerve endings are desensitized for a period of time. The period is partly a function of intensity, as demonstrated by the dots you see immediately following a camera flash.

The reason you can see it with the video camera is that the lack of synchronization sets up a stroboscopic effect. This stroboscopic effect creates a slowing, stopping, or even reversing appearance. It’s essentially the same principle as how a strobe light at a disco makes it look like people in motion are temporarily frozen.

There are other ways to see this effect with TVs and computer monitors. One very common one is the flicker or shimmy you might see (especially with fast-refresh monitors) under fluorescent lighting. Fluorescents also have a flicker, which can modulate the CRT image.

Another one: try putting a fan or other rapidly rotating object between you and your TV. Usually you can see the fan blades appear to stop, rotate more slowly, or even rotate in the opposite direction.

Finally, you can also duplicate this effect with just your hand. Hold your hand out in front of your TV or monitor with your fingers spread wide, flat side of your hand facing you. Now rotate your wrist so that your fingers sweep across the screen. You should see a stroboscopic effect, making it look like your fingers are frozen in mid-motion.
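For the curious, here’s the same aliasing arithmetic applied to the fan demonstration (a sketch with made-up numbers; the blade count and speed are assumptions): because the blades are identical, the visible pattern repeats several times per revolution, which is why even a modest fan speed can alias.

```python
# Wagon-wheel math for the fan-in-front-of-the-TV demo. All numbers
# below are illustrative assumptions.

def apparent_rev_per_s(true_rev_per_s: float, strobe_hz: float,
                       blades: int = 4) -> float:
    """Apparent rotation under a strobing light or screen.

    With `blades` identical blades, the visible pattern repeats
    `blades` times per revolution, so alias in pattern-cycles and
    convert back to revolutions.
    """
    pattern_hz = true_rev_per_s * blades
    ratio = pattern_hz / strobe_hz
    slip = ratio - round(ratio)
    return slip * strobe_hz / blades

# A 4-blade fan at 14.8 rev/s in front of a 60 Hz screen: the pattern
# rate (59.2 Hz) sits just under the strobe, so the fan appears to
# turn slowly backwards -- the stagecoach wheel.
print(apparent_rev_per_s(14.8, 60.0))   # ~-0.2 rev/s
```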

You’ve seen the same basic effect on the old cowboy movies. The stagecoach going forward with the wheels turning backwards.

Joey did a fair job there :)

Now then. I’ll address the O.P. first, then branch out a wee tad here. I live this shit. Video cameras are indeed capable of eliminating the roll bar these days. The two main producers of broadcast-quality video cameras are Sony and Ikegami. Both, for years, have allowed the operator to seek a matching cycle. For example, I am shooting Joey Baggadonuts, and behind him is an NEC MultiSync monitor. I frame up, and then, while looking through the eyepiece, scroll through the various sync rates, i.e. 66.8 Hz, 70.0 Hz, etc. As one scrolls through, the bar on the monitor will turn to a flicker, then voila: you will have hit the matching refresh rate for that PARTICULAR monitor. You do your shot, all looks lovely. HOWEVER: if you have to PAN the camera, the image will shudder as you do so. Refresh is actually a slanted action; the retrace line moves on a diagonal axis across the inside of the CRT. So, if I pan, I get a shudder. Oh well. :)
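To put that knob-twiddling in beat-frequency terms (my framing, with an assumed menu of rates, not a real camera’s list): each candidate sync rate is judged by how slowly the bar rolls against it, and the winner is whichever one nulls the beat.

```python
# What the sync-scan is effectively doing: trying candidate shutter
# rates until the beat against the monitor's refresh is nulled.
# Candidate values below are assumptions, not a real camera's menu.

def beat_hz(monitor_hz: float, shutter_hz: float) -> float:
    return abs(monitor_hz - shutter_hz)

candidates = [59.94, 60.0, 66.8, 70.0, 72.0, 75.0]
monitor_hz = 70.0   # unknown on set; your eye in the eyepiece is the meter

best = min(candidates, key=lambda f: beat_hz(monitor_hz, f))
print(best)   # 70.0 -> the bar slows, turns to flicker, then stands still
```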

Now, to beat a question, or to answer before it’s asked. When you see a t.v. show where there are LOTS of monitors in a shot, the odds are incredibly high that they are matched monitors, either by brand and model, OR the individual PCs have been run on a refresh program. I used to have one, a bazillion years ago, before you could actually choose the rate on the face of your monitor. I can remember saving the day with it. (This was pre-sync-by-camera days.) Additionally, if you see a shot in a t.v. show with lots of monitors, and it’s a dramatic show, the odds are excellent that it was shot with a film camera, NOT a videotape camera. We’ll get to that in a moment.

The real problem is the real world. I have done shots on the floor of the NYSE. They had monitors there I’d never seen in my LIFE before. C’est la vie. I left the camera on Pre-Set (60 Hz, 30 fps), and let them all roll as they would. You can only choose one rate for the camera to match to.

Now, to film cameras and t.v. shows. Shows such as Lou Grant, etc., where you are in an office FILLED with monitors (newspaper, police station, etc.). Depending on the film camera body, you have to have a mechanism by which you can adjust the frame rate by fractions of a frame per second. In the older days, you had an add-on box that would drive the camera by said fractions. Videotape sync for a film camera is NOT 30 frames per second. (Videotape is recorded by a video camera at 30 fps in the United States, at a 60 Hz cycle.) When shooting a video image with a film camera, you shoot at 29.97 frames per second. However, the world is filled with funky monitors. NTSC (the broadcast standard in America) rolls at 30 fps. BUT that’s just if you have a t.v. set in the shot. If you have a computer monitor, you have to set the film camera in a way similar to that of a videotape camera. Newer-generation film cameras made by Arriflex, Panavision and Aaton tend to have software built in that permits “phase shifting” to occur as you roll. This way, the first few feet are exposed as you adjust the fractional frames per second, until the perceived roll bar disappears from the eyepiece. Then, of course, you leave the camera rolling and do the take.
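A small worked example of why those fractional frames matter (my arithmetic; the 30000/1001 figure is the standard exact NTSC rate, but the scenario is illustrative): run the film camera at a flat 30.000 fps against NTSC’s ~29.97 and the bar crawls through the frame roughly every half minute.

```python
from fractions import Fraction

# NTSC's color frame rate is exactly 30000/1001 fps (~29.97).
NTSC_FPS = Fraction(30000, 1001)

def seconds_per_bar_pass(film_fps: Fraction) -> float:
    """Time for the roll bar to crawl once through the frame when the
    film camera is off-speed from NTSC video by this much."""
    slip_per_second = abs(film_fps - NTSC_FPS)   # frames of drift per second
    return float(1 / slip_per_second)

# A flat 30.000 fps: the bar makes a full pass every ~33 seconds.
print(seconds_per_bar_pass(Fraction(30)))           # ~33.4
# Trimmed to 29.97 exactly: the residual 1/1001 error means one pass
# every ~9 hours -- effectively frozen for the length of a take.
print(seconds_per_bar_pass(Fraction(2997, 100)))    # ~33366.7
```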

I shot a commercial for ESPN last summer, and FINALLY got to use a camera with a true “smart box”. It remembered the off-speed sync rate, even after you turned the camera off at the end of the shot. You just had to program it once, then you were good to go. Unless you cut power to said smart box. :( Then you had to re-set it.

Hope this has been at least a little bit helpful. Oh, and the flashing of spread fingers in front of your eyeballs, and the wagon wheel spokes, are less a sync issue and more an issue of stroboscopic effect. <sigh> I can’t do that one now, and besides, it’d be a major-league hijack.

Cinematographically yours,

Cartooniverse

Not quite. Cinematic film is 24 fps. NTSC (i.e. North American TVs) is 30 fps. PAL and SECAM (most of the rest of the world) are 25 fps. Most newer PCs refresh video at 72, 75 or 85 Hz.

Occasionally I’ve been able to get a stable picture on a video camera if I set the PC screen refresh to 60 Hz.

Original quote by Cartooniverse

Don’t mean to hijack here, but I don’t suppose you shot the one where the sportscaster was throwing baseball bats from a helicopter? I always wondered how they did that.