Why Do Computer Monitors Flicker on Video

I’ve noticed that when watching TV or a movie, computer monitors often flicker. Why? I sure hope someone else has noticed this phenomenon.

I’m not sure (in other words, I’d like to see this answered too), but I think it has to do with the monitor’s refresh rate (in Hz). It’s probably different from the refresh rate of the video camera.

Kinthalis has it. Film runs at 24 frames per second; computer monitors refresh differently. So they don’t sync, creating the flicker. If you keep an eye out for it, you’ll notice in most films they don’t flicker – most cinematographers will hook up a film/video sync box which eliminates the flicker.
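If you want rough numbers on why they don’t sync, here’s a back-of-the-envelope sketch; the 75 Hz refresh rate is just an assumed example, not anything measured:

```python
# Back-of-the-envelope: a 24 fps film camera pointed at a monitor whose
# refresh rate (75 Hz here, purely an assumed example) isn't a multiple
# of the camera's frame rate.
camera_fps = 24.0
monitor_hz = 75.0

refreshes_per_frame = monitor_hz / camera_fps   # 3.125 refreshes per film frame
slip_per_frame = refreshes_per_frame % 1.0      # the scan slips 0.125 of a cycle each frame
roll_hz = slip_per_frame * camera_fps           # so the dark band crawls ~3 times per second

print(f"{refreshes_per_frame:.3f} monitor refreshes per film frame")
print(f"the band rolls through the picture about {roll_hz:.1f} times a second")
```

With those example numbers the bar would crawl through the frame about three times a second; a different refresh rate just gives a different crawl speed.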

It’s a synchronization issue.

The picture on a CRT (this includes television monitors) is made by an electron gun at the back of the tube shooting a beam at the front of the monitor, where you see the picture. Flat-panel displays don’t use the same mechanism, but the way the picture is built up is similar. Making the picture requires the gun to ‘shoot’ the screen twice: it puts up 50% of the picture in its first pass from top to bottom, then goes back to the top and puts up the other 50%.

The gun is shooting lines; maybe you’ve heard of the term ‘lines of resolution’, which was a measure of picture quality in the analog days, before people used pixel count as a resolution measurement. The CRT’s electron gun shoots every other line per field, and two fields make one picture. On the first pass it shoots the lines skipping every other one; on the second pass it fills in the lines missed by the first pass.
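If it helps to picture it, here’s a toy sketch of the interlace pattern on a made-up ten-line ‘screen’:

```python
# Toy sketch of interlace on a made-up ten-line screen: field 1 paints
# every other line, field 2 fills in the ones it skipped, and the two
# fields together make one complete picture.
total_lines = 10

field_1 = list(range(0, total_lines, 2))   # lines 0, 2, 4, 6, 8
field_2 = list(range(1, total_lines, 2))   # lines 1, 3, 5, 7, 9

print("first pass paints lines:   ", field_1)
print("second pass fills in lines:", field_2)
print("one complete picture:      ", sorted(field_1 + field_2))
```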

The pick-up device in the camera shooting your computer screen works the same way. What causes the picture roll or ‘flicker’ is that the gun producing the picture in the CRT is not in sync with the ‘gun’ in the camera that’s taking the picture of the CRT.

There are ways to synchronize the monitors and the cameras, and I’m surprised you see this in movies. The flicker is noticeable on news reports because the people doing the news stories don’t have time to sync the cameras. But if you watch a show where they take a camera into the control room, such as an NFL broadcast, you’ll notice that all the monitors and cameras are in sync, producing no picture roll.

The frame rate in the US is 60 fields per second (corresponding to the 60-cycle power frequency), or 30 complete pictures per second.
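To put numbers on the mismatch (the 72 Hz computer-monitor figure is just an example I picked, not anything measured):

```python
# The arithmetic from the post: 60 fields a second, two fields per picture.
field_rate_hz = 60.0
fields_per_frame = 2
frame_rate_hz = field_rate_hz / fields_per_frame
print(f"{frame_rate_hz:.0f} complete pictures per second")

# If the computer monitor runs at some other rate (72 Hz here, just an
# example), the camera and monitor drift apart by the difference, and
# that drift is the rolling bar you see.
monitor_hz = 72.0
drift_hz = abs(monitor_hz - field_rate_hz)
print(f"the scans drift apart by about {drift_hz:.0f} cycles every second")
```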

I’ve not shot a monitor or television, but as Anamorphic says, there is a “sync box” known as a “milliframe controller.” This allows the cinematographer to set the frame rate on his or her camera in increments of, as the name implies, 1/1000 of a frame.

If you’re filming at 24 fps, you can also use a 144° shutter. My Éclair can be set to 145°, which is pretty close. It should minimize the problem, but as I said, I haven’t tried it.
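If anyone wants to check the shutter arithmetic, here’s a quick sketch; the 180° line is just the usual default shutter, thrown in for comparison:

```python
# Shutter arithmetic behind the 144-degree trick: exposure time is
# (shutter angle / 360) divided by the frame rate.  At 24 fps a
# 144-degree shutter exposes each frame for exactly 1/60 s, one full
# field of 60 Hz video, so every frame catches the same amount of the
# screen's refresh.  The 180-degree line is the usual default shutter,
# included only for comparison.
def exposure_seconds(shutter_angle_deg, fps):
    return (shutter_angle_deg / 360.0) / fps

for angle in (144, 145, 180):
    t = exposure_seconds(angle, 24)
    print(f"{angle} degrees at 24 fps -> 1/{1.0 / t:.1f} s exposure")
```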

Most cinematographers use the milliframe controller. Professional video cameras have a similar device built into them, but I don’t remember what it’s called.

Poop. All the questions I know how to answer have been well and thoroughly covered by the time I show up. Slow down, you guys! Give some of us a shot, willya? :slight_smile:

This is not 100% correct. What you’re talking about is interlaced video (as opposed to “progressive” video), which is used for most TV broadcasts (the exceptions being the 720p HDTV transmitted by ABC and the 480p “enhanced” format used by Fox). CRT computer monitors are, for the most part, non-interlaced, which means they repaint the entire screen on each refresh cycle. The flicker is seen because the phosphors on the screen fade quickly once they’re refreshed.

Flat panels (and this applies to LCD and plasma) are completely different; there is no refreshing going on, since the image (or, I should say, the state of each pixel) is held on the screen by transistor switches. Therefore, until a particular pixel changes, it doesn’t need to be switched, so there’s no flicker. While I haven’t tried it, I’m fairly certain that you wouldn’t see any flicker at all with a flat-screen device if you recorded it on film or video tape.
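If anyone wants to play with the idea, here’s a toy model; the refresh rate, phosphor decay constant, and exposure time are all numbers I made up purely for illustration:

```python
import math

# Toy model of why a camera catches flicker on a CRT but not on an LCD.
# All three constants below are assumed values for illustration only.
refresh_hz = 60.0          # assumed CRT refresh rate
decay_tau = 0.002          # assumed phosphor decay time constant, seconds
exposure = 1.0 / 125.0     # assumed camera exposure, shorter than one refresh

def crt_brightness(t):
    """One spot on a CRT: lights up at each refresh, then fades away."""
    time_since_refresh = t % (1.0 / refresh_hz)
    return math.exp(-time_since_refresh / decay_tau)

def lcd_brightness(t):
    """An LCD pixel just holds its state until it is told to change."""
    return 1.0

def light_gathered(brightness, start, steps=1000):
    """Total light the camera collects over one exposure starting at `start`."""
    dt = exposure / steps
    return sum(brightness(start + i * dt) for i in range(steps)) * dt

# Two exposures that begin at different points in the refresh cycle:
# the CRT totals differ wildly (flicker), the LCD totals are identical.
for start in (0.000, 0.008):
    crt = light_gathered(crt_brightness, start)
    lcd = light_gathered(lcd_brightness, start)
    print(f"exposure starting at {start:.3f}s: CRT light {crt:.5f}, LCD light {lcd:.5f}")
```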

A little empirical evidence, anyone?

I’m using an Envision 17" flat panel monitor on my computer. I got out my Arriflex 16St, which is equipped with a rotating mirror shutter. This means that I see what is going onto the film. There is some flicker in the viewfinder because the mirror is only sending images to it half of the time; the other half of the time the light goes to the film. I have shot a television screen with it before, and the scroll bars are obvious. It is also equipped with a Tobin crystal-synch motor.

A few minutes ago I shot my monitor with the Arri at 24 fps, 30 fps and 20 fps crystal-controlled speeds.

As Running with Scissors says, there are no scroll bars to be seen.

I’ve learned something today. :slight_smile:

Just a quick comment. The flicker you see on computer monitors on TV is just like when you sometimes see the wheels of a car go backwards. The camera takes a picture at close to the same point in the monitor’s refresh cycle (like a strobe light; the second picture might be taken two or three or more cycles later), so you get almost the same picture twice (you see a dark bar because half the image is darkening). And the dark bar moves because the frequencies aren’t identical, so a slightly different area of the monitor is refreshed each time.
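Here’s that strobe-light idea as a quick sketch; both rates are deliberately mismatched example numbers, not measurements of anything:

```python
# Sample the monitor's refresh phase at the camera's frame rate and watch
# the dark band creep, like wagon wheels appearing to turn backwards.
monitor_hz = 60.2      # assumed monitor refresh rate (example)
camera_fps = 30.0      # assumed camera frame rate (example)

for frame in range(6):
    t = frame / camera_fps
    # Phase of the refresh cycle at the instant this frame is captured:
    # 0.0 = scan just started at the top, 0.5 = scan is halfway down.
    phase = (t * monitor_hz) % 1.0
    print(f"frame {frame}: dark band about {phase:.1%} of the way down the screen")
```

Because 60.2 isn’t an exact multiple of 30, each frame catches the scan a little further along, and that slow creep is the moving bar.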