A television (or computer monitor, or anything else with a screen) will sometimes flicker in the resulting video when it is itself filmed. What causes this? And how do they get around it for TV shows and films?
I believe it has to do with the difference between the screen refresh rate (60 Hz was a typical value for a CRT computer screen) and the number of frames per second the camera records (typically 24 or 25). If you make sure the screen refreshes and the camera captures at the same rate, or at a whole-number ratio (2:1 or the like), the problem goes away.
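To put rough numbers on that (this is just a toy Python calculation, nothing authoritative): track where in its refresh cycle the screen is each time a new camera frame starts. A whole-number ratio keeps that phase fixed, so the dark band stays put; anything else makes it crawl or jump around.

    # Illustrative only: the fraction of the screen's refresh cycle that has
    # elapsed at the start of each camera frame. A constant phase means the
    # dark "roll bar" sits still; a changing phase means it crawls or jumps.

    def refresh_phase_per_frame(refresh_hz, camera_fps, n_frames=5):
        """Phase of the screen's refresh at the start of each camera frame."""
        return [round((frame * refresh_hz / camera_fps) % 1.0, 3)
                for frame in range(n_frames)]

    print(refresh_phase_per_frame(50, 24))  # ratio 2.083... -> phase creeps ~8% per frame (slow roll)
    print(refresh_phase_per_frame(60, 24))  # ratio 2.5      -> phase alternates 0.0 / 0.5 (flicker)
    print(refresh_phase_per_frame(48, 24))  # ratio 2 (whole) -> phase stays at 0.0 (no bar)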
But what do I know; I’m sure somebody will be along with a correct answer later on.
That’s mostly it. It’s very similar to the phenomenon that causes a moving wagon wheel, car tire, or helicopter rotor to look stationary, slow, or backwards on film.
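If you want to see that aliasing with made-up numbers (just a sketch, nothing rigorous): the camera only samples the wheel 24 times a second, so any spin rate gets folded back into a narrow range of apparent speeds, which is why it can look slow, stopped, or backwards.

    # Temporal aliasing with toy numbers: a wheel spinning at various speeds,
    # sampled by a 24 fps camera. The apparent speed is the true speed folded
    # into the range -fps/2 .. +fps/2 (in revolutions per second).

    FPS = 24.0

    def apparent_rev_per_sec(true_rev_per_sec, fps=FPS):
        """Rotation rate the eye infers from frame-to-frame positions."""
        aliased = true_rev_per_sec % fps      # fold into 0 .. fps
        if aliased > fps / 2:                 # closer to "a bit less than a full turn"
            aliased -= fps                    # ...which reads as slow backwards motion
        return aliased

    for speed in (5, 12, 23, 24, 25, 50):
        print(f"wheel at {speed:>2} rev/s looks like {apparent_rev_per_sec(speed):+.1f} rev/s on film")

(A real wheel with several identical spokes aliases even sooner, since the camera can’t tell one spoke from the next.)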
A documentary made on film showing a TV screen will typically have this problem, but a feature film can afford to synchronise the CRT screen to the film’s frame rate (or a multiple of it), so it will be, I presume, 48 Hz for a 24-frame-per-second film.
I believe LCD screens largely avoid this problem: they still have a refresh rate, but each pixel holds its image between refreshes rather than fading the way a scanned CRT phosphor does, so there’s no dark band sweeping down the picture. Though I may be wrong about those details.
In the case of TV shows containing shots of TVs, there are ways to use fancy video gear to sync the refreshing of the cameras with the refreshing of the closed-circuit signal being fed to the TV they’re looking at. And one thing that TV studios have in spades is fancy video gear.
BTW, the term for syncing video signals to a master reference is “genlock”. It can also be used to prevent “rolling” when switching between video sources.
Television in the U.S. is 30 frames per second (actually 60 interlaced fields, or half-frames). Computer monitors have various speeds, with 60 Hz usually a minimum. Movie cameras film at 24 frames per second. Getting them to sync is a pain and usually more trouble than it’s worth.
Now that TVs are pretty much flat and everything is post-processed by computer, wouldn’t it be easier and cheaper to just put the image in in post?
You’re eleven days out of date. Prior to June 12, broadcast television in the US wasn’t quite 30 frames per second; it was 29.97 frames per second (actually 59.94 fields, or half-frames, per second). That’s the US NTSC standard, which was (finally) phased out. The digital ATSC standard that replaced it supports a variety of frame rates, both interlaced and progressive.
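For the curious, the “not quite 30” isn’t arbitrary rounding: NTSC color runs at exactly 30000/1001 frames per second, with 60000/1001 fields. A throwaway couple of lines just to show the arithmetic:

    from fractions import Fraction

    # NTSC color rates: the nominal 30 and 60 Hz divided by 1.001
    frame_rate = Fraction(30000, 1001)
    field_rate = Fraction(60000, 1001)

    print(float(frame_rate))   # 29.97002997...
    print(float(field_rate))   # 59.94005994...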
There are a couple of other techniques to get rid of roll bars. One is to use a camera with a 144º shutter. My Eclair NPR can do 145º, which I’m told is close enough, but I haven’t tested it. My camera also has a setting on the speed control that (is supposed to) allow filming a CRT. The other, more usual, method is to use a milliframe controller. With this device the camera operator can stop the scrolling and move the bar out of the frame. I’d need to get one of these to use my Aaton to film a CRT.
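For anyone wondering where the 144º figure comes from, the usual shutter-angle formula is exposure time = (angle ÷ 360) ÷ frame rate, and 144º at 24 fps works out to exactly 1/60 of a second, i.e. one field of a 60 Hz CRT, so every frame sees the same slice of the scan. A scratch-pad calculation (illustrative only):

    # Standard shutter-angle arithmetic: exposure = (angle / 360) / fps.
    # At 24 fps, a 144-degree shutter exposes each frame for exactly 1/60 s,
    # one field of a 60 Hz CRT, which is why it tames the roll bar.

    def exposure_seconds(shutter_angle_deg, fps):
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_seconds(144, 24))    # 0.016666... = 1/60 s
    print(exposure_seconds(180, 24))    # 0.020833... = 1/48 s (the common default)
    print(exposure_seconds(172.8, 24))  # ~0.0200    = 1/50 s (the 50 Hz / PAL equivalent)

Which would also explain why 145º is reportedly “close enough”: the exposure is under one percent longer than a full field, so any residual banding should drift only slowly.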
That does seem to be standard nowadays. It can be good, but it can also look stupid when the character is watching a beat-up security monitor, yet you, the viewer, can see that screen better than any security monitor or low-end TV you’ve ever seen in person.
I’ve noticed that a lot of the time in commercials and on TV, simulated screens look hokey. Some look good (a lot of the phone commercials, for instance), but sometimes you can see actors looking vaguely at an imagined image: they’ll be watching something with action, yet their eyes and heads aren’t moving, and so on. And that’s not even getting into image quality…