I'm watching an old movie in HD

Unforgiven (1992). Smokey and the Bandit (1977) was on TV the other night in HD. How are these old films in HD when the technology wasn’t even out when they were filmed?

The equivalent resolution of movie film emulsion is probably theoretically higher than the resolution of, say, DVD - so as long as a decent quality print of the movie still exists, it can be scanned and converted to HD.

Because they were filmed on 35mm film, which has a much higher resolution than anything you can see on television. So the high resolution was always there, you just couldn’t see it because that extra information was discarded when you watched it on your TV.

HDTV at 1080 is 1920 x 1080, giving a surprisingly low 2.1MP. 35mm film has much higher effective resolution than that.
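Just to put numbers on that (quick back-of-the-envelope arithmetic; “DVD” here means standard NTSC 720x480):

[code]
# Rough pixel-count arithmetic behind the "surprisingly low 2.1MP" figure.
# (Back-of-the-envelope only; standard-definition NTSC DVD is 720x480.)

formats = {
    "DVD (NTSC)": (720, 480),
    "HDTV 1080":  (1920, 1080),
}

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels (~{pixels / 1e6:.2f} MP)")

# HDTV 1080 has about 6x the pixels of DVD, but still only ~2.1 megapixels --
# well below what a good 35mm frame is generally said to resolve.
[/code]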

Thanks for the replies. So would someone have to specifically redo the film for HD? Or could you break out Buster Keaton and Charlie Chaplin’s old reels and watch them in HD?

Resolution in various media.

Right. Decent 35mm images have at least four times as much information as HD. HD=1920x1080 pixels. The current standard for digital cinema projectors in theaters is roughly equivalent to HD: 2048x1080. It is called 2K, based on the horizontal resolution.

Most cinematographers and other Hollywood technical experts think that 2K is “good enough” to reproduce a 35mm frame, but that 4K (4096x2160 pixels) is needed to capture all the detail that the best 35mm emulsions are capable of resolving.
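For anyone who wants the arithmetic behind that “at least four times” figure, here’s a quick sketch using the resolutions above:

[code]
# Pixel counts for the resolutions mentioned above (HD, DCI 2K, DCI 4K),
# to show where the "at least four times as much information" figure comes from.

resolutions = {
    "HD": (1920, 1080),
    "2K": (2048, 1080),
    "4K": (4096, 2160),
}

hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / hd_pixels:.1f}x HD)")

# 4K is exactly 4x the pixel count of 2K (double in each dimension),
# and roughly 4.3x the pixel count of HD.
[/code]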

When movies are transferred to video they are scanned and converted to digital files. Any film that’s been released to DVD in the last ten years or more has been scanned at 2K resolution (at least) and in the last few years I’m sure that many are being scanned at 4K for theatrical release.

Although most of the digital projectors in theaters now are only 2K, the next wave of installations will include a lot of 4K projectors.

According to this guy:

So [del]if[/del] when I make a film, it will look good on DVD.

FWIW, I worked on a friend’s film in the '90s that was shot on 16mm (not super-16, which is the same frame-height but has a wider aspect ratio). On video you can’t tell it’s 16mm in most scenes. We used Fuji 125T and 250T. The faster film did show more grain.

Yip. We’ve yet to hit the point where digital is higher quality than analog. I know a lot of people in photography who got mad about this back when digital photography seemed to be taking over.

Even some older TV shows that were originally shot on film are being released in HD.

Another important point is that Kodak is improving the resolution and sensitivity of motion picture film faster than digital technology is advancing. So the gap between film and digital (for motion picture image capture, at least) is increasing.

Not sure I understand what you mean by “redo.” You mean rescan to digital? And I don’t quite get the second sentence. You could break out the old reels and watch them on the corresponding movie projector for the best possible quality. To watch them in HD, they would have to be scanned at HD resolution and converted to digital.

I seriously doubt this. Care for a cite?

I find this rather confusing. When digital photography seemed to be taking over? It has. Completely. They discontinued Kodachrome, for crying out loud. And in photography, the latest generation of DSLRs is higher quality than 35mm film in every quantifiable way except dynamic range (and that gap is fast being closed as well). Higher resolution, way higher sensitivity (i.e., much higher usable ISO values). Current bodies have more dynamic range than slide film did, but aren’t quite to where negative film was. Digital is higher quality than analog.

Yes, it has taken over, although I don’t think quite completely. Film photography is still alive and well, especially with large-format view cameras, which provide a much larger image size than full-frame digital sensors (compare the full-frame size of 24x36 mm to a 4x5-inch view camera frame, i.e. 101.6x127 mm). Digital images are as good as film for most practical purposes (I have even blown up an image from a middling-quality APS-C sensor to 20"x30" with good-albeit-not-professional results), but here is at least one pro who says otherwise (page not dated).
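Just for fun, here’s the arithmetic on those two frame sizes (a rough sketch that ignores lenses, emulsion, and everything else that actually matters):

[code]
# Quick area comparison using the frame sizes mentioned above
# (full-frame digital vs. a 4x5 view camera negative).

full_frame_mm = (24, 36)          # full-frame sensor, in mm
view_camera_mm = (101.6, 127.0)   # 4x5 inches converted to mm

ff_area = full_frame_mm[0] * full_frame_mm[1]
vc_area = view_camera_mm[0] * view_camera_mm[1]

print(f"Full-frame: {ff_area:.0f} mm^2")
print(f"4x5 sheet:  {vc_area:.0f} mm^2")
print(f"Ratio: ~{vc_area / ff_area:.0f}x the capture area")
# A 4x5 negative has roughly 15x the capture area of a full-frame sensor.
[/code]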

Also, as I mentioned, HD images are relatively low resolution compared to what even a low-end consumer digital still camera can shoot. So digital vs. analog for movies is measured by a different yardstick than the same debate for stills.

It’s a case-by-case thing, depending entirely on what surviving source elements are available to the digitizer. And even 35mm isn’t the peak of resolution. For a truly astonishing experience, see The Searchers on Blu-ray. The Searchers is one of only about 50 feature films shot entirely in VistaVision. With VistaVision, the film travels through the camera horizontally rather than vertically, so the negative can be much, much wider than on conventional, vertically-fed 35mm stock. The Searchers Blu-ray is a higher-quality image than most *new* movies being put out in HD.
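To put rough numbers on that (these are approximate, commonly cited frame dimensions, not measurements from any particular print):

[code]
# Approximate negative areas: standard 4-perf 35mm vs. 8-perf VistaVision.
# (Ballpark, commonly cited frame dimensions -- not exact measurements.)

frames_mm = {
    "35mm Academy (4-perf)": (21.95, 16.0),
    "VistaVision (8-perf)":  (37.7, 24.9),
}

academy_area = 21.95 * 16.0
for name, (w, h) in frames_mm.items():
    area = w * h
    print(f"{name}: {w} x {h} mm = {area:.0f} mm^2 ({area / academy_area:.1f}x)")

# The horizontal 8-perf VistaVision frame has very roughly 2.5-3x the negative
# area of a standard 4-perf frame, which is why those transfers hold up so well.
[/code]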

Okay, it has now, but he said they got mad back when it seemed to be, not now when it already has.

My source is the top engineer for Kodak’s motion picture films as well as other film experts without the same degree of self-interest as a Kodak employee. I’ll e-mail my Kodak contact Monday and see if I can get an online cite.

In the meantime, here are a few points to consider. The total digital cinema system does not obey Moore’s law, because it consists of an immense infrastructure chain encompassing cameras, post-production facilities, transmission systems, servers, and projectors. For the most part, the devices in the chain can handle 2K, and some can handle 2K and 4K. But you can’t improve the cameras by, say, 20% and see that improvement propagate through the whole chain. The chain is locked to the standard resolutions and bit depths, and can only be changed very slowly and at great expense.

The analog chain does not have the same limitations. A frame of film that has 20% more resolution is the same physical size as its inferior predecessor, is processed by the same lab with the same machines, and can be projected by the same projectors. As long as print stocks improve to match camera stocks, image quality on screen can be steadily increased with virtually no changes to the infrastructure. (FYI, there are 30,000 multiplex screens in the U.S. and more than 100,000 in the world. The majority are still 35mm, although digital is coming in quickly.)

In any case, at present there are no industry-wide standards beyond 4K. Any improvements past 4K will be prototypes and one-offs for special applications. For ordinary, non-IMAX theaters, 2K or 4K will probably be acceptable for the foreseeable future, meaning that there will be little reason to go beyond 4K.

At the moment, there are only a few cameras that are capable of 4K resolution, and the much-touted RED One uses a single sensor and a Bayer array, which means its real resolution is not 4K, but approximately a quarter of that. AFAIK, there are no production cameras beyond 4K available today. Some are coming, perhaps with 6K or even 8K, but there are no plans for projectors at those resolutions.
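To illustrate the Bayer point (a toy sketch of a standard RGGB mosaic, not a claim about how the RED One actually demosaics):

[code]
# Toy illustration of a Bayer color filter array: each photosite records only
# one of R, G, or B, so full-color pixels have to be interpolated (demosaiced).
# This is why a "4K" photosite count doesn't translate directly into 4K of
# measured color resolution.

def bayer_color(row, col):
    """Standard RGGB Bayer pattern: which color a given photosite samples."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

width, height = 8, 8  # tiny example grid
counts = {"R": 0, "G": 0, "B": 0}
for r in range(height):
    for c in range(width):
        counts[bayer_color(r, c)] += 1

total = width * height
for color, n in counts.items():
    print(f"{color}: {n}/{total} photosites ({n / total:.0%})")
# Only 25% of the sites sample red, 25% blue, and 50% green; the missing
# values at every pixel are estimated during demosaicing.
[/code]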

On the other hand, Kodak introduced three new motion picture camera emulsions in 2009, each with significant improvements in resolution and sensitivity over previous stocks. And more are coming next year.

So don’t fall into the trap of believing that digital is always better, and always advances faster, than analog.

This all might be true, but…

Full-frame (24 x 36 mm) digital sensors have doubled in resolution in 6 years - there’s no way that film is anywhere near that sort of improvement curve.
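For what it’s worth, doubling in six years works out to a fairly modest annual rate (assuming “doubled” means pixel count):

[code]
# Implied annual growth rate if sensor pixel count doubles over six years.
years = 6
annual_factor = 2 ** (1 / years)
print(f"~{(annual_factor - 1) * 100:.0f}% more pixels per year")  # ~12%
[/code]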

You’re talking about still image sensors, and I’m talking about the motion picture industry. Apples and oranges, for all the reasons I gave, and which I made clear in the post to which you responded.