When I saw the Hobbit movies in the theater at 60fps, I got the disconcerting feeling that I was watching a play. It looked kind of like real life, and the sets looked fake. Trees looked like foam props, etc. I watched the same movie at 24 fps and everything looked fine. The “fakeness” at 60fps didn’t really seem to occur during high-speed action sequences as much, so I question that it was really about motion blur.
I know that digital filming has progressed to the point that the untrained eye (mine, at least) cannot distinguish between film and digital. There seems to be some sort of digital noise, or “film grain”, added, and of course it’s shot at 24 fps with the accompanying motion blur.
I really like the look of film, and I think most people do. My only complaint about film is when they do “shaky cam” or fast pans, which makes things look like a blurry mess. What makes “film” (or digital that is aping film) look like film? Is it possible to marry the two, so it can look as aesthetically pleasing as film without the blurriness?
That’s the “Soap Opera” effect that made tape look so much different from film back in the day.
A combination of motion blur, film grain, color, and lighting.
I think you have figured out the first two, but don’t discount the others.
Colors are represented differently by all media–what we see with our eyes is very different from what is captured by film or by sensors. All color film and sensors are merely approximations, with optimizations added to emphasize colors that make things appear more natural (if that’s the goal). For an interesting side effect on this, google “Shirley Cards” for a weird racism angle in film labs.
So, if you have all kinds of cool film stock that is tuned to show skin tones in one way or another, and then switch to digital sensors, things will look different. These days they can make the sensors imitate the film, but that is an approximation.
Lighting has more to do with the special needs of low-cost shooting for soap operas, where they couldn’t afford to reset lights all the time, so they went for a good generic one-size-fits-all setup.
FPS/frame rate has nothing to do with motion blur; that’s a function of shutter speed. In a still camera, shutter speed is increased as available light increases, so if it’s very bright outside you can have a shutter speed of 1/1000 second or less, whereas indoors in low light it could be something like 1/60 second. The latter will have more motion blur than the former. A film/movie camera running at 24 FPS under bright light will take a 1/1000 second exposure every 1/24 of a second. A fancy 8K digital RED camera shooting at 120 FPS under bright light with a similar lens would take a 1/1000 second exposure every 1/120 of a second. Captain Disillusion explains it better than I can in his video about laminar flow: Laminar Flow DISAMBIGUATION - YouTube; the frame rate/shutter speed explanation starts at 5:10.
Now, that doesn’t mean motion blur isn’t a factor in the different looks; it’s just that it’s controlled deliberately by the camera operator and/or director, independently of the frame rate. Getting it right is also very important for visual effects so they match the rest of the footage.
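To put some numbers on that, here’s a minimal sketch (my own illustration, not from either video) of how exposure time, frame rate, and the cinematographer’s “shutter angle” convention relate; the function names are just for this example:

```python
# Minimal sketch: motion blur is set by the exposure time, which the operator
# can choose independently of the frame rate.

def exposure_time(frame_rate_fps: float, shutter_angle_deg: float) -> float:
    """Exposure time in seconds for a rotary-shutter model.

    A 360-degree shutter exposes for the whole frame interval; the common
    180-degree convention exposes for half of it.
    """
    return (shutter_angle_deg / 360.0) / frame_rate_fps

def shutter_angle(frame_rate_fps: float, exposure_s: float) -> float:
    """Shutter angle implied by a given exposure time at a given frame rate."""
    return exposure_s * frame_rate_fps * 360.0

if __name__ == "__main__":
    # Same 1/1000 s exposure at 24 fps and at 120 fps: identical per-frame blur,
    # but very different fractions of each frame interval are "open".
    for fps in (24, 120):
        print(fps, "fps at 1/1000 s ->", round(shutter_angle(fps, 1 / 1000), 1), "deg shutter")
    # Classic film look: 24 fps with a 180-degree shutter = 1/48 s exposure.
    print("24 fps @ 180 deg ->", round(exposure_time(24, 180), 4), "s per frame")
```

The point of the sketch is just that the per-frame blur comes out the same whenever the exposure time is the same, no matter how many frames per second are being captured.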
That said, the frame rate does have a big impact on the look. Since film has been 24 FPS for so long, it’s what we’re used to seeing from movies, whereas TV/video/tape was 60 FPS (technically 60 interlaced fields, or 30 full frames, per second), and generally anything shot directly to tape was of lesser quality: think sitcoms, news, game shows, and of course soap operas. Despite having much smoother motion and no grain, those productions are perceived as worse precisely because they’re not movies. They also can’t be “re-scanned” into HD, because the amount of detail is fixed at the tape’s roughly 480-line resolution. Technology Connections has a good video about this: Film: the reason some of the past was in HD - YouTube
So the frame rate and grain, or lack thereof, are the biggest factors making them look different, but as minor7flat5 mentioned, color and lighting are important as well. Different film stocks respond to light and colors differently, but video cameras in the past had some crazy lighting and color response quirks that had to be worked around. In the days of black and white and early color, with bright, even lighting required to get any sort of decent picture, the sorts of makeup needed to get proper skin representation were borderline insane. Think green lipstick and clown-like highlights. Cosmetics and Skin: Max Factor and Television Even as those problems were gradually fixed with better technology, the fast shooting and quick turnaround of TV productions still called for flat, even lighting, again as minor7flat5 mentioned, which contributed to the differing look.
Yeah, kinda. The really interesting question is: if you take a film like The Hobbit or Pirates of the Caribbean, which look like plays, or actors acting in costumes, and you convert it down to 24 FPS, why does it suddenly look OK again? Maybe it’s a sort of uncanny valley situation, where 24 FPS looks unreal enough that you can still be immersed in it even knowing that it’s fake, but at higher FPS, especially at resolutions like 1080p or 4K, it gets too close to reality, so you’re actually taken out of it because your brain can’t handle the fact that it’s on a screen and not actually there. Like it’s too real to be comfortable, but not real enough to climb out the far side of the uncanny valley. That’s just a theory on my part, but it does seem like there’s more going on than just the soap opera effect.
When I shoot video I am generally going for a “film” look. So I try to get the right exposure settings for the scene and film in light that is suitable (not at high noon on a sunny day). But my secret is that I usually shoot with a gimbal, which steadies the camera like a Steadicam. Nothing like a shaky camera to spoil the mood of a shot, or jarring transitions during a panning shot. I also have a choice of fps and usually choose 24.
The other reason is that videotape has a worse contrast ratio than film; blacks aren’t as black and brights aren’t as bright, so the whole thing takes on a very ‘flat’ look (yes, in addition to the soap opera flat lighting).
IIRC the other issue is that too high a shutter speed will give a stroboscopic effect with fast motion. In that situation, some blur is called for, rather than a sequence of sharp images.
Fun fact: the scenes in the original Star Wars where the camera follows the fighters down the trench were done with a camera taking one frame at a time as the fighter models were repositioned in the trench model. They deliberately moved the camera during the fairly long exposure of each frame. This achieved a small amount of blur, especially at the edges of the picture, which made the motion look realistic.
I agree, lighting is a lot more nuanced in cinema productions. The lighting people and the director spend time looking for just the right degree of highlights and contrasts (even to the point of having stand-ins for the tedious job of adjusting the lighting, which suggests it takes a while…)
“Mexican Soap Opera”? I think (as a Canadian) that Hollywood has the best professionals, and you only have to see stuff made locally in other countries to realize how much professionalism goes into even a cheap soap opera from Hollywood. Whatever produces the “soap opera effect” is more pronounced in cheaper productions elsewhere.
One effect of really high resolution (as opposed to high FPS) might be the ability to more clearly see things that were “invisible” before.
For example, my dad switched to a high-resolution digital TV and started noticing little details in news broadcasts that couldn’t be seen in the past, such as obvious makeup, skin pores and stray hairs, and other little things we know are there but hadn’t really been able to see before. He said it actually made the newscasters look bad, rather than improving the overall experience.
It’s possible to change the shutter speed to increase or decrease motion blur. It’s possible to change the frame rate independently (at least during recording; movie theaters might not have that flexibility).
One of the characteristics of film is the 24fps spec. Compared to 30fps – the nearest thing we have to a video standard now – 24 looks jerky, especially for action scenes. I have been bothered by that ever since I saw my first film in a movie theater, and when 30 came around, it felt like a big improvement.
Your comment about liking the look of film reminds me of other entertainment processes as they have improved over the years. When dbx noise reduction was first demonstrated, some critics said it didn’t sound natural because they couldn’t hear the tape hiss and “natural” distortion.
When CDs became available, some people said they didn’t sound right because there was no surface noise and the dynamic range was too great.
There has been similar criticism of high frame rate film. It doesn’t look “natural” because you have been so used to overlooking the technical flaws that when they aren’t there anymore, it bothers you.
Mexican soap operas have traditionally used motion smoothing / motion interpolation which gives the picture a very artificial and overly processed look. You can see it whenever the camera itself moves. I’d describe it as being like the uncanny valley effect where it’s so smooth that it’s just jarring to the viewer (if they aren’t used to the effect).
Newer TVs have motion smoothing and it drives me crazy. It’s the first thing I turn off in any new set. If you have a TV with a motion smoothing function, turn it on full blast for a while and see what you think.
Or turn on a Mexican soap opera and see if your brain can handle it without going “man, this is weird!”
Implicit in this question is the notion that a play looks “less real” than a movie. But of course that’s absurd: A play is real. Movies look fake, and you’re used to movies looking fake, so when they don’t, it’s jarring to you: Not because it doesn’t look real, but because it doesn’t look like a movie.
I don’t think it’s a factor in the effect the OP is asking about, but it is one of my favourite oddities about movies on TV… film is usually recorded at 24fps, while PAL television, used in much of the world, broadcasts at 25fps. So if you’re watching a movie on a PAL system you’re probably* seeing the film at a 4% faster speed than you would if you sat in a cinema.
*I say probably because some telecine techniques instead repeat a frame for each second of footage so the speed stays the same. But not many, as it’s a really crappy way to convert…
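As a quick back-of-the-envelope check on that 4% figure (my own arithmetic, using a hypothetical two-hour runtime):

```python
# Quick arithmetic for the PAL speed-up described above.
film_fps = 24
pal_fps = 25

speedup = pal_fps / film_fps             # ~1.0417, i.e. about 4% faster
runtime_film_min = 120                   # a hypothetical two-hour movie
runtime_pal_min = runtime_film_min / speedup

print(f"Speed-up factor: {speedup:.4f} ({(speedup - 1) * 100:.1f}% faster)")
print(f"A {runtime_film_min}-minute film runs about {runtime_pal_min:.1f} minutes on PAL")
# The audio pitch rises by the same ~4% unless it's corrected during transfer.
```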
Actually the US TV system is even odder, though I can’t describe it very well: the US uses NTSC, which is 30fps, so 24fps film has to be fitted to it by repeating frames in a 3:2 pattern, which gives a slight judder rather than a speed change. Perhaps that explains why movies look slightly different on TV in the US, although it’s possibly still not responsible for the effect the OP describes…
Also, NTSC in color is 30000/1001, or 29.97002997… fps. When the conversion from black and white to color happened, a color subcarrier was squeezed into the existing 30 Hz broadcast signal in exchange for a slightly lower frame rate. This allowed black and white televisions to continue to function with a color signal. The trade-off was pretty poor color fidelity.
The current standard for HD television is Rec.709 with much better color fidelity and a variety of frame rates.
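For anyone curious what that odd NTSC number works out to, here’s a tiny sketch (my own, purely illustrative) that just writes the fraction out:

```python
from fractions import Fraction

# The exact NTSC color frame rate mentioned above, written as a fraction.
ntsc_color = Fraction(30000, 1001)

print(float(ntsc_color))           # 29.97002997002997
print(30 - float(ntsc_color))      # the rate was lowered by only ~0.03 fps
print(float(ntsc_color) / 30)      # about 0.1% slower than a true 30 fps
```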