In older movies, if a TV was on during a scene, the picture would always have a horizontal roll or flicker.
I remember reading an article back then that explained the cause, but I’ve forgotten it.
But that’s not the subject of my question. I was just channel surfing, and in a scene from Wonder Boys, one of the characters is watching TV and the picture is perfect. Thinking about this, I’m wondering how movie TVs changed from bad reception to normal.
Anyone know?
And to answer your other question, a perfect TV picture is a special effect. Instead of filming an actual television, they insert the TV picture into the frame after the fact.
Thanks, **Exapno**, that’s what I thought. That scene from the Rooney/Garland B&W movie he was watching seemed too clear to me for ordinary TV reception. Looks like they took the movie and enclosed it in a TV with no innards.
Sorry, Exapno, but that’s not true in most cases. While some TV images are inserted via optical printing or other methods, in most cases they convert the TV signal to 24 frames per second live on set and sync it to the camera to eliminate the flicker. Check the credits of movies for “24 frame playback.”
True enough, but even matching frame rates can produce flicker. Take any video camera and record a TV program with it; you will see the picture flicker. That’s because the frames are out of sync (out of phase) with each other.
On TV, such as in newsrooms where monitors are in the scene, they use special equipment (genlock) to sync the cameras so they scan at the same rate as the TVs.
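If you want to put numbers on that, here’s a rough back-of-the-envelope sketch in Python. The rates are my assumptions (24 fps film, 59.94 Hz NTSC fields); PAL and other standards differ, and real rigs lock the phase as well as the rate.

```python
# Rough sketch of the beat ("roll bar") arithmetic -- assumed rates only,
# not a description of any particular production setup.

FILM_FPS = 24.0          # standard motion-picture camera rate
NTSC_FIELD_RATE = 59.94  # NTSC CRT fields per second

def roll_rate(camera_fps, display_rate):
    """Beat frequency between camera exposures and display refreshes.

    The visible bar drifts at the difference between the display rate and
    the nearest whole multiple of the camera rate; zero means the two are
    locked and the screen photographs cleanly.
    """
    nearest_multiple = round(display_rate / camera_fps) * camera_fps
    return abs(display_rate - nearest_multiple)

print(roll_rate(FILM_FPS, NTSC_FIELD_RATE))  # ~11.94 Hz -> obvious roll/flicker on film
print(roll_rate(FILM_FPS, 48.0))             # 0.0 -> display locked at 2x the camera rate
print(roll_rate(30.0, NTSC_FIELD_RATE))      # ~0.06 Hz -> slow drift between unsynced sources
```

Lock the display to a whole multiple of the camera rate (and match the phase) and the beat disappears, which is essentially what “24 frame playback” does.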
Actually, I think an interesting phenomenon of perception is that nowadays a TV picture in a movie, or in another TV show, seems not merely “normal” but “supernormal,” as if the characters in the movie were watching 3D TV.
I suppose it’s the result of getting so lost in the surrounding movie’s reality, and there being no difference between the movie and the TV show within the movie. They’re both 2D.
Sorry that wasn’t clear, Exapno. I should have said that 24-frame synchronization is a relatively new technology. Back in the ’30s and ’40s, screens were done with optical inserts, as you describe.
Small trivia note: you can see a raster flicker in the “Help me, Obi-Wan Kenobi, you’re my only hope” hologram in Star Wars. Lucas, back when he was young and clever, took advantage of an existing flaw and turned it into a feature, giving the hologram a high-tech look by exploiting a low-budget aspect of TV-to-film conversion. Once you realize Princess Leia’s image was basically just filmed off a TV screen, it’s obvious, but you don’t think about it until it’s pointed out.
I should have thought a moment longer and realized that everything is done with technology these days. But I didn’t think it was quite as old as the ’30s and ’40s.
It was just as possible to film a television screen without flicker in the 1930s and 1940s as it is today. Kinescope films were made by filming the image off a television picture tube (a “kinescope”), and they were how television programs were recorded before a practical videotape system became available in 1956.
Interesting. I was recently listening to the commentary for Gerry Anderson’s UFO series, where he describes rigging up a device for synchronizing the film cameras with the TV displays that feature so prominently on the sets (we wouldn’t want our view of the purple-haired bimbos on Moonbase to be impeded, would we?). Anderson’s commentary strongly implies that they had to invent the technique pretty much from scratch. Is this something that’s been re-invented several times by different production companies? (UFO is early-Seventies stuff.)
Yes, I remember the commentary track for Star Trek II mentioning something about a new technology allowing them to run displays at 24 fps. I think filmmakers have short memories.
Also, you can use an LCD screen attached to a computer or a video source to display the footage, and film that.
I don’t know how it works, but I’ve made several films in which I shot a computer monitor playing video, with both a 35mm motion-picture camera and a MiniDV camera, and there was no flickering.
Apple Computer, as I recall, once supplied upon request a special “Control Panel” system software control that would let the user adjust a computer monitor’s refresh rate. So it’s not like you have to take a torque wrench to the back of the picture tube to make the adjustment, or anything. (Some rough numbers on which rates play nicely with film are below.)
Just FYI,
Ranchoth
(They undoubtedly used this same system to ensure high-quality filming of the QuickTime-movie “video phone”, complete with “play/pause” button, seen in Jurassic Park.)
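To put rough numbers on that Control Panel idea, here’s a small hypothetical sketch: the candidate rates are made up, and a real monitor only offers whatever modes it actually supports, but it shows which settings a 24 fps film camera could shoot cleanly.

```python
# Hypothetical sketch: which monitor refresh rates a 24 fps film camera can
# photograph without a rolling beat.  The candidate list is invented for the
# example; a real control panel only offers the monitor's supported modes.
CANDIDATE_RATES_HZ = [48, 50, 59.94, 60, 72, 75, 85, 96, 120]
CAMERA_FPS = 24

# A rate is "clean" when each film frame captures a whole number of refreshes.
clean_rates = [hz for hz in CANDIDATE_RATES_HZ if hz % CAMERA_FPS == 0]
print(clean_rates)  # [48, 72, 96, 120]
```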