I’m a fan of movies from the ’80s and ’90s (such as When Harry Met Sally and Sleepless in Seattle), and I recently decided to start buying digital copies. Usually services like Google Play offer an SD option and then a more expensive HD option, and I’m wondering: are the HD copies of these movies made 20+ years ago actually high definition? Or would I just be paying an extra couple of dollars for nothing?
Provided it comes from the film itself, then yes. Film has much higher resolution than SD video.
Yes. My Blu-Ray of Casablanca is of much higher quality than I have seen it in any other format, and I’ve seen it 34 times, including pristine prints in period theaters with perfect projection. Blu-Ray blows it away.
Easily. I have a Blu-Ray HD version of Kubrick’s “2001: A Space Odyssey”, which was released in 1968, and the quality is phenomenal. Actual film is way more high-def than HD, and could be digitized to 4k or even higher resolution when the time comes.
Blu-Ray at 1080p is about halfway to good cinematic quality. There is some argument that 1080p (roughly 2K) is about the quality of a cinema showing an old, worn-out print. SD doesn’t even rate. 4K is considered cinematic quality, although a pristine 70mm print will probably outshine it. So if a film was good enough to show in the cinema in the first place, 1080p is still not going to approach its intrinsic quality, but it will be vastly better than SD. That said, the quality of Blu-Ray authoring does vary. Not every film gets carefully transferred from the master negatives; a really high-quality transfer costs serious money and takes quite a bit of effort. But even a basic Blu-Ray of any cinema release will trounce an SD version.
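For a concrete sense of the gap, here’s a quick sketch (using the usual nominal dimensions for each format; actual encodes vary) of how the raw pixel counts compare:

```python
# Nominal frame dimensions for common video formats (typical values).
# Film has no fixed pixel count, but 35mm negatives are often scanned
# at 4K, and 70mm at 8K or higher.
formats = {
    "SD (NTSC DVD)": (720, 480),
    "720p": (1280, 720),
    "1080p (Blu-Ray)": (1920, 1080),
    "DCI 2K": (2048, 1080),
    "UHD 4K": (3840, 2160),
    "DCI 4K": (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name:16s} {w}x{h} = {w * h / 1e6:4.1f} megapixels")
```

Even 1080p carries exactly six times the pixels of an NTSC DVD frame, which is why the jump from SD is so visible on any decent screen.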
Yes. Basically, the film negative is scanned at a higher resolution (to put it plainly).
Thanks for the replies! I honestly don’t know much about film vs digital so I appreciate the quality differences being explained.
I think what’s really mind-blowing is that some old TV shows are getting HD releases. Apparently, back in the day it was sometimes easier on the production end of things to shoot the shows on film and then go over them with a TV camera for broadcast. So these days they can go back and rescan the original film and end up with an HD version of the show. Sometimes this reveals funny little mistakes with sets and makeup that the original producers overlooked, because they never thought anyone would see them at higher resolution.
On the other hand, they weren’t produced for HD, like black and white movies or tv were not produced for color. Do I really need to see Barney Fife’s nose in HD?
The process was called “videoizing,” and it made many of the older shows look better than they would have had they been shot on video. Sitcoms, unfortunately, were usually shot on video, and many of them don’t look that good today (Diff’rent Strokes and Three’s Company are two clear examples; they both look “cheap” by today’s standards). However, a few sitcoms (Cheers and The Odd Couple) were shot on film, so they can be rendered into HD and they’ll always look fantastic.
Yes. Check out Buster Keaton’s The General (1927) on Blu-ray. Even though it has not been “restored,” the quality of the video transfer is simply superlative: everything is crisp and there is an abundance of detail. The difference between the Blu-ray and the various god-awful DVD releases is night and day.
Not all older movies benefit from HD, however; not because of age, but because of how they were filmed. Robert Altman’s MASH (1970), for instance, still looks like it was filmed through gauze, even in HD.
Screen size makes a huge difference, though; sitting 10 feet from a 40" 1080p LCD screen, most people can’t tell 720p from 1080p.
“2001” is also helped by the fact that it was originally filmed in 70mm. In fact, when I first saw it, the theatre used a large curved-screen format that at the time was called Cinerama, kind of a precursor to IMAX, which was really a terrific way to see it. But yes, even old ’30s movies, and even some from the ’20s, benefit greatly from being rendered in HD if the print is in good shape.
Theoretically you need at least 4K, but the reality is that many theatrical digital projectors are 2K, and many digital masters are of that order of resolution. “Gravity” for instance, was mastered in ArriRAW 2.8K with 2K digital intermediates. The digital theater where I saw it was one of those “premium” theaters with a supersize screen and 3D and it looked absolutely wonderful.
You are quite correct about transfer quality. Some movies look better in 720p than others do in 1080p; likewise, re-releases from the same source sometimes look far better than previous ones. The same quality differences could be seen in the days of DVD. IIRC, the first DVD release of “Titanic” wasn’t even anamorphic; it was a careless transfer with no special features, and the letterboxing made it useless on a modern TV. The newer DVD release was a properly done multi-disc package, and the Blu-Ray version is superb.
But in order to see this difference, you’d need to project the Blu-Ray on a giant movie theater screen, right? A home screen running 1080p is going to look much more photo-realistic than a movie screen that takes a 70mm film print and blows it up to a 25-foot screen.
A 1080p image projected onto a 25-foot movie screen leaves you with pixels roughly a quarter-inch across. Not bad if you’re in the back of the theater, but not nearly as sharp as a film projector.
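Running the numbers as a rough sketch (the answer depends on whether the 25 feet is taken as the screen’s width or its height):

```python
# Pixel pitch of a 1080p image on a 25-foot screen, for both readings
# of "25 feet" (illustrative arithmetic only).
screen_in = 25 * 12                 # 25 feet = 300 inches

pitch_if_width = screen_in / 1920   # 1920 columns across the width
pitch_if_height = screen_in / 1080  # 1080 rows down the height

print(f"25 ft wide: {pitch_if_width:.2f} in/pixel")   # ~0.16 in
print(f"25 ft tall: {pitch_if_height:.2f} in/pixel")  # ~0.28 in, about 1/4"
```

The quarter-inch figure matches the 1080 rows spread over a 25-foot dimension; measured across 1920 columns the pixels come out closer to a sixth of an inch.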
This is why 4K screen technology for the home is pointless. Unless you have a huge screen and sit right up close to it, even 1080p is probably more resolution than you need. Most people cannot see the difference between 720p and 1080p if they’re sitting 8 feet away.
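That viewing-distance claim can be checked against the common rule of thumb that 20/20 vision resolves about one arcminute (a simplification; real acuity varies by person and content). A quick sketch for the 40-inch-screen-at-8-feet case mentioned above:

```python
import math

def pixel_arcmin(diag_in, horiz_px, distance_ft, aspect=16 / 9):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen."""
    # screen width derived from the diagonal and aspect ratio
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / horiz_px
    angle_rad = 2 * math.atan(pitch_in / (2 * distance_ft * 12))
    return math.degrees(angle_rad) * 60

# 40-inch screen viewed from 8 feet
for px in (1280, 1920):
    print(f"{px} px wide: {pixel_arcmin(40, px, 8):.2f} arcmin per pixel")
```

Both formats come out at or below roughly one arcminute per pixel at that distance, which is consistent with viewers struggling to tell 720p from 1080p there.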
You might check out this documentary sometime:
I’ve been rewatching Cheers on Netflix.
One of the interesting little bits of trivia is that all of the beer on the show was fake (of course, so they could do multiple takes without the actors getting wasted.) When watching on a 1980s fuzzy TV, you’d never notice.
Watching on Netflix in HD (or whatever resolution the original tapes were digitized at) the beer on the show looks distractingly fake.
Thanks for the suggestion! This has been an interesting discussion for me. It’s strange to realize how little I know about projection and film technology despite watching movies and TV all the time.
Remember that it was intended to be displayed on theater screens - and still show full color depth and saturation. Digital is still a lonnnngggg way from coming close.
Just as digital SLRs cannot touch 35mm Velvia, and few of the people on this board will live to see a digital device that can match a large format camera loaded with Velvia (or any decent E-6 “slide” film).
Slides were also designed to be projected.
p.s. - “large format” is film with dimensions of 4"x5" and larger. A 35mm image is 24mmx36mm.
p.p.s. - that 35mm size: now you know where the big deal about DSLRs with a 24x36mm sensor originated.
Does anyone recall the “I, Claudius” series (BBC/PBS)? It seemed to me that it had been recorded on videotape rather than film. Is this correct? If so, I would suppose there is no way to view it so that it doesn’t look “cheap.” It was a great series (or so I thought).
A long way? The movie industry doesn’t seem to agree. The vast majority of theaters are now digital – I think the figure was something like 85% at the end of 2012, probably much higher now. So even if the original cinematography isn’t digital – and increasingly it is – most of the distribution and exhibition is, as well as much of the post-processing. The key here is the difference between theory and the practical limits of perception. Kind of reminds me of folks who think that no digital audio can ever compete with a scratchy LP, or that they can tell the difference between a tube and a solid-state power amplifier.
Granted, there are a few old-school directors who dislike digital cinema. They’re probably the same guys who believe film grain is an art form and own McIntosh tube amplifiers.
Same comments as above. A low-grain large-format film may offer higher theoretical resolution than most digital sensors, but where and why does that matter? I know several professional photographers and none uses film any more; in fact even discussing it seems quaint. Interestingly, Velvia was offered for a while as a cinematography film, but had little success except for commercials and special effects where its high color saturation was desirable.
No, the only “big deal” with full-frame sensors is that 35mm lenses keep their original field of view (a crop factor of 1). Many professional digital cameras have much smaller sensors, like the APS-H format (around 28.1 x 18.7mm) used by the Canon EOS 1D series through the Mark IV, until it was recently superseded by the full-frame EOS 1DX. Other than field-of-view considerations, the sensor only needs to be large enough to achieve the desired signal-to-noise ratio.
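To illustrate the crop-factor point, a sketch using the sensor dimensions quoted above (the 50mm lens is just an example; crop factor is conventionally the ratio of sensor diagonals):

```python
import math

FULL_FRAME = (36.0, 24.0)  # mm: the classic 35mm still-film frame

def crop_factor(width_mm, height_mm):
    """Ratio of the full-frame diagonal to this sensor's diagonal."""
    return math.hypot(*FULL_FRAME) / math.hypot(width_mm, height_mm)

aps_h = crop_factor(28.1, 18.7)  # Canon APS-H, per the post
print(f"APS-H crop factor: {aps_h:.2f}")   # ~1.28, usually marketed as 1.3x
print(f"A 50mm lens frames like ~{50 * aps_h:.0f}mm on full frame")
```

A full-frame sensor trivially gives a crop factor of 1.0, which is the whole “lenses behave as they did on film” selling point.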