NTSC (Rec.601) is no longer used except for legacy applications. Rec.709 has provisions for 23.976, 24, 25, 29.97, 30, 50, 59.94 and 60fps frame rates, so telecine is not often required. Many “film” projects are shot at 23.976fps and can be shown directly on HD systems with no frame rate conversion, although frame aspect mismatch is handled in various ways.
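In case anyone wonders where those odd fractional rates come from: they are just the integer rates multiplied by 1000/1001, a leftover from NTSC color timing. A quick illustrative sketch in Python:

```python
from fractions import Fraction

# NTSC-family rates are the integer rates multiplied by 1000/1001.
for nominal in (24, 30, 60):
    exact = Fraction(nominal * 1000, 1001)
    print(f"{nominal} fps family -> {exact} = {float(exact):.3f} fps")

# 24 fps family -> 24000/1001 = 23.976 fps
# 30 fps family -> 30000/1001 = 29.970 fps
# 60 fps family -> 60000/1001 = 59.940 fps
```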
The first time I watched it, it was quite jarring. It started off just being a very sharp 4K movie; those were mostly interior shots or scenes with somewhat subdued lighting.
Then the bright daylight sequences kicked in. I was immediately taken by the realism. It took me out of the movie; it distracted me into focusing on how different it was. Different from a movie, like just really seeing it in person on a bright sunny day. I liken it to my first viewing of a Blu-ray of Planet Earth on an all-new system. The difference from DVD to higher-res Blu-ray was so noticeable that it was a distraction, but a great one. I commented on how sharp and readable the anti-piracy message was.
Now I see how most DVD movies are less detailed. The difference between Blu-ray and standard-frame-rate 4K is noticeable in many movies, but not so much as to distract. It is likely due to the original filming/capture method and resolution.
I watched Gemini Man again after reading this thread. I noticed that I had already become more used to the effect. Of course I expected it. I have watched a lot of quality 4K movies now.
It seemed to me that lighting had a very big effect. Bright daylight almost created the soap opera effect, but the color saturation and definition almost completely negated it, leaving the more super-realistic effect front and center. Muted indoor lighting, or similar levels, seemed to leave it at a nice sharp 4K level. Not distracting at all.
But a few scenes that were very dimly lit still had a very distracting effect. A very flat contrast, muted gray scale look. Definitely the soap opera effect. Like black and white VHS contrast levels, but with 4K definition. Very odd.
Apparently this film had some off-screen hurdles to its making. So they may have decided not to re-shoot or digitally enhance those scenes. It seems it was experimental for Ang Lee.
The upshot for me is that I become used to the format quite quickly. But if there is a particular variance within the movie, it is jarring. I am in my early 60’s. I have always been a techie. So I am used to getting used to the new thing. But it has to be consistent. Even in ways that I would not expect myself to notice.
I wasn’t aware Gemini Man was shot at a high frame rate, and only knew it as a poorly reviewed Will Smith vehicle. Apparently it was shot in 4K at 120fps, which is a crazy format for general release in 2019. Movie theaters said that test footage crashed the projection gear.
This high frame rate stuff puts me in mind of Douglas Trumbull, who has been flogging Showscan, a 70mm 60fps film format that would supposedly be “hyper-real”, and he has been peddling this dream since the 70s.
Sadly Trumbull’s instrumentation couldn’t register that response as “revulsion”.
Even at 60fps on the 4K Blu-ray, it is noticeably different. I didn’t notice any difference on The Hobbit. But maybe I don’t have the right discs.
It was just an average action flick. But when I rewatched it after noting the reviews, I was inclined to give it a pass as to the quality of the overall plot/storyline. It is more of a one-to-one character story with a very big backstory. It would have been a much longer movie to flesh out the complicated backstory, but it might have been worth it with good writing.
A television company would only broadcast at a particular frame rate, so regardless of your TV’s frame rate abilities, if the broadcaster is sending out a signal at 30fps, that’s what your TV is displaying, albeit probably at 60 Hz. And it will still have that odd telecine technique where some of the frames get displayed for longer than others within each second.
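For illustration, here is a toy sketch of that uneven cadence, the classic 2:3 pulldown of 24fps material onto a 60 Hz display (simplified to whole frames, ignoring interlaced fields):

```python
# 2:3 pulldown: alternate film frames are held for 2 and 3 display refreshes,
# so 4 film frames fill 10 refreshes (24 fps -> 60 Hz), but unevenly.
def pulldown_24_to_60(film_frames):
    shown = []
    for i, frame in enumerate(film_frames):
        shown.extend([frame] * (2 if i % 2 == 0 else 3))
    return shown

print(pulldown_24_to_60(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# A and C are on screen for 2/60 s, B and D for 3/60 s -- the uneven cadence.
```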
That could be an effect of high dynamic range cameras. The purpose is to capture as much detail out of a high-contrast environment as possible without bright areas clipping or dark areas being completely black. This leads to a seemingly low-contrast image with poor color saturation straight out of the camera, but it gives them more control over those adjustments in post-production, and it can make green screen compositing easier as well. I remember seeing that muddy look on news broadcasts in the early days of HD, and I assume there are automatic/algorithmic adjustments that are applied to a live stream, but I suspect they hadn’t dialed in that workflow yet.
I believe you’re referring to log-profile recording, and yeah, it looks like heck if uncorrected, but I doubt a major motion picture production would be unaware of this.
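For anyone curious what “uncorrected log” means in practice, here is a toy sketch. The curve below is made up for illustration and is not any real camera’s log format (real cameras use published curves like S-Log or Log C); the point is just that the encoded values bunch up and look flat until the inverse curve or a LUT is applied in grading:

```python
import numpy as np

# Hypothetical, simplified log encode/decode pair -- NOT a real camera curve.
def log_encode(linear, a=50.0):
    return np.log1p(a * linear) / np.log1p(a)      # compresses highlights, lifts shadows

def log_decode(encoded, a=50.0):
    return np.expm1(encoded * np.log1p(a)) / a     # inverse curve restores contrast

scene = np.array([0.01, 0.18, 0.90])   # shadow, mid-gray, highlight (linear light)
flat = log_encode(scene)               # the "uncorrected" look: values bunched together
restored = log_decode(flat)            # after grading/LUT: original contrast is back
print(flat.round(3))      # ~[0.103 0.586 0.974]
print(restored.round(3))  # ~[0.01  0.18  0.9 ]
```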
It’s interesting reading these various explanations for the “soap opera effect” because back when I was a kid, it was extremely noticeable if a show had been videotaped rather than filmed, and I could never put my finger on quite what the reason(s) were. Even my mother noticed, disdainfully dismissing some evening drama by saying “it looks like a play” (as Chronos noted upthread). Note that in those days of course “videotaped” meant analog video recording, not digital. Even the audio seemed different somehow. My best guess is that videotaping was associated with lower costs, and those were carried through to things like cheap basic lighting and little or no editing or other post-production processing.
It also seemed to me that on those old analog TVs the videotaped stuff actually looked sharper than the filmed stuff, and that this was part of the difference. But obviously that’s hardly the explanation, since today I can watch a movie in 1080p and it looks exactly like a movie and not one of those old videotapes, even if the movie was originally recorded digitally. It simply looks and sounds like a movie with pleasantly high-resolution imagery and high quality audio.
Was there something more than the difference between tape and film? Because daytime soap opera on tape still looked and sounded very different from prime time television on tape.
Major television shows back in the day (e.g. “Cheers”) were generally shot on film because of the better contrast ratio of film (blacker blacks, brighter brights) then telecine’d to video frame rates and broadcast. Cheaper shows (e.g. “All in the Family”) were shot on video tape and suffered from videotape’s flat look.
Also, before the invention of the first commercially successful videotape recorder by Ampex in 1956 and its widespread proliferation, many TV shows were broadcast live and/or rebroadcast in different time zones as low-quality kinescopes, although some were filmed. Two famous examples of filmed shows were I Love Lucy and just 39 episodes of The Honeymooners.
Both filmings were notable for different reasons. I Love Lucy was filmed (in front of a live audience) because Lucy and Desi didn’t want to move to New York; otherwise the higher-quality live broadcasts would have been geared to the more populous Eastern time zone. Although CBS refused to pay the extra costs involved in multi-camera filming, Lucy and Desi took the costs out of their own salaries on the condition that they owned the syndication rights. CBS was happy to oblige because, they thought, who would want to watch a sitcom more than once? They saw syndication as being of little value. It went on, of course, to become probably the most syndicated show in television history.
The Honeymooners was notable for the 39 episodes that were simultaneously broadcast live and filmed using a proprietary DuMont video-plus-film camera system. To me these films have more of a video feel to them, but they and I Love Lucy remain prime examples of old TV shows that have been preserved in high quality. Even after the advent of videotape, many old shows were lost because the large 2"-wide videotapes were expensive and frequently re-used.
From childhood, I found that experience of watching a daytime soap opera unique: very unlike other taped shows, unlike taped sitcoms, unlike taped public television, unlike taped local talk shows, unlike anything else. So, in my view, this can’t possibly be the end of the matter.
Could just be cheaper camera and editing equipment. I’m not sure what the editing tools were like in the 80s and 90s, but they were only just starting to use computers, so I’d imagine that the editing process wasn’t entirely lossless beyond perhaps simple cuts.
In the 80s they were certainly doing linear video editing: basically, multiple playback machines recorded onto another machine using an edit controller and an edit decision list. However, non-linear editing, which started in 1971, finally became widespread in the 90s due to reduced storage costs and higher-capacity, faster I/O on off-the-shelf computers. Hardly Moviolas.
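For flavor, here is a rough sketch of the kind of information an edit decision list carried, loosely in the spirit of the old CMX-style EDLs; the field layout is simplified and the reel names and timecodes are invented for illustration:

```python
# Each event says: take this source reel from IN to OUT and lay it down on the
# record tape at these timecodes ("C" = straight cut). Reel names and
# timecodes below are made up.
edl = [
    # (event, reel,    track, cut, source in,     source out,    record in,     record out)
    ("001", "REEL_A", "V", "C", "01:00:00:00", "01:00:05:12", "00:00:00:00", "00:00:05:12"),
    ("002", "REEL_B", "V", "C", "02:10:30:00", "02:10:38:00", "00:00:05:12", "00:00:13:12"),
]

for event, reel, track, trans, src_in, src_out, rec_in, rec_out in edl:
    print(f"Event {event}: roll {reel} from {src_in} to {src_out}, record starting at {rec_in}")
```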
OK, I’ll grant you that: soap operas have (had?) a certain “look” that wasn’t better or worse quality than other video, but definitely recognizably a different look than any other production. You could see one second of video and know you were seeing a soap opera and not something else. I don’t know what gave it that quality, but I’m going with the way the actors and sets were lit.
I saw the 120fps Dolby Cinema 3D version (apparently only 14 theaters nationwide were able to show it in that format) and I was stunned by the visuals. What was most notable was that you could actually see all of the very fast action sequences! Fast camera movements and panning shots following Will Smith as he jumped across buildings and raced motorcycles around winding streets were no longer a jumbled blur. You could make out every detail in the frame and it really felt like you were there in the scene.
I confess I was no longer involved in the Hollywood film industry by the 1990s, but I think you may have jumped the gun a bit. Professional movies were recorded primarily on film until ca. 2012, when the industry switched over to digital (the split was roughly 50/50 in 2012). I would hate to do non-linear editing on pre-2000 computers unless I had years to kill while waiting.