In recent years, archive clips from the early part of the 20th century seem to be played at a realistic speed and often include dubbed-in sound effects for greater verisimilitude. However, I distinctly recall that when I was younger, similar footage shown in a history programme was speeded up, the subjects moving with near-comical haste. What was the reason for this, and why wasn’t it possible until recently to slow the footage down to a ‘normal’ speed?
Silent films varied in intended projection speed (“Robin Hood”, for instance, was shot to be projected at 22 fps), but the majority of silents were filmed at 16 frames per second.
In the ’50s and ’60s, that film would be run through sound projectors that were fixed at 24 frames per second.
I do know that many serious silent film clubs/societies tried to project their showings at the proper speed.
There have been threads on this, but I can’t seem to find one now.
Persistence of vision is a phenomenon whereby our visual system perceives the illusion of movement when still frames are shown at or above a certain rate. It was found that 16 frames per second afforded the illusion to most people, so films were shot at about 16 fps. (Undercranking – turning the camera more slowly to achieve a speeded-up effect when projected – and overcranking, which is the opposite, were sometimes deliberately used by the camera operator, and also by projectionists.) With the advent of sound, the film had to be shown at a higher rate for sound quality. (Sound was carried on an optical track on the edge of the film.) It was found that 24 fps afforded good sound quality, so that became the standard. Decades later, many people just transferred the 16 fps footage at 24 fps, which speeds up the action by a factor of 24/16 = 1.5.
And it’s nontrivial to change the speed if you don’t have direct control over the frame rate. If you’re showing it on TV, you’re stuck with the TV refresh rate (you can’t just adjust the set of every viewer in America), and even in a theater it’d be more trouble than it’s worth to change the projector speed for those scenes. So if you want to show a 16 fps movie at 24 fps with the correct speed, you have to either duplicate every second frame (which probably looks unnatural) or interpolate the missing frames (which is a ton of work, and probably not even possible without massive computing power).
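Just to make that concrete, here’s a rough sketch in Python of what “duplicate every second frame” amounts to, assuming the footage is in hand as an ordered list of frames (the function name is mine, purely for illustration):

    def silent_to_sound_speed(frames):
        # Stretch 16 fps footage to 24 fps at its natural speed by
        # printing every second frame twice: 16 source frames become
        # 24 projected frames, i.e. one real second of action.
        out = []
        for i, frame in enumerate(frames):
            out.append(frame)
            if i % 2 == 1:  # every second frame gets a duplicate
                out.append(frame)
        return out

    # 16 source frames -> 24 output frames
    assert len(silent_to_sound_speed(list(range(16)))) == 24

That repeated-frame judder is presumably the “unnatural” look mentioned above, which is why interpolating genuinely new in-between frames is the better (but far more expensive) option.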
Ah, so footage that I see in a recent history show will have been digitally treated so as to appear at a normal-looking speed?
In addition to the above answers, humans perceive flicker at higher frame rates than is required to give an adequate impression of motion. My reference gives 47 fps as the point where flicker is no longer noticeable (though I have seen other sources which claim it varies with viewing distance). To accommodate this, modern projectors show each frame twice, giving 48 flashes per second, whereas 16 fps films require each frame to be shown three times.
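As a back-of-the-envelope check on those numbers (48 flashes per second as the target comes from the figures above; the helper function is just illustrative):

    import math

    def flashes_per_frame(fps, flicker_threshold=48):
        # Smallest whole number of times each frame must be flashed
        # so the flash rate reaches the flicker-fusion threshold.
        return math.ceil(flicker_threshold / fps)

    print(flashes_per_frame(24))  # 2: a two-bladed shutter gives 48 flashes/s
    print(flashes_per_frame(16))  # 3: a three-bladed shutter gives 48 flashes/s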
That’s one way. The other would be to project it at its original frame rate, re-record the image, and add the sound later.
It was always possible to adjust things so the film showed at the same frame rate it was recorded at. The two methods were to use a silent-speed projector (the projectors we had in school had a slower rate) or to double-print some of the frames. For instance, if the film was shot at 20 fps, you add four duplicates to every 20 frames – one every fifth frame – to make it 24. That was an expensive process, however. For general use, they just ran the film at 24 fps.
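The arithmetic in that 20 fps example generalises to any silent speed. A hedged sketch of the step-printing pattern, using integer bookkeeping so the duplicates land evenly (the function name is mine, not lab terminology):

    def step_print(frames, source_fps, target_fps=24):
        # Repeat frames at even intervals so footage shot at source_fps
        # plays at its natural speed on a target_fps projector.
        out = []
        owed = 0  # extra frames owed, counted in source-fps units
        for frame in frames:
            out.append(frame)
            owed += target_fps - source_fps
            while owed >= source_fps:
                out.append(frame)  # print this frame again
                owed -= source_fps
        return out

    # Shot at 20 fps: one duplicate every fifth frame, 20 -> 24
    assert len(step_print(list(range(20)), 20)) == 24
    # Shot at 16 fps: one duplicate every second frame, 16 -> 24
    assert len(step_print(list(range(16)), 16)) == 24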