Forget about Blu-Ray, HDTV, and 7.1 Surround Sound

But whoever made that comparison graphic used such poor-quality JPEG compression that it degrades the ultra-high-definition image. Size isn’t everything in some fields.

On the other hand, here’s a 1995 essay by Jakob Nielsen in which he tries to find an optimal display resolution for computer users. He assumes a 1200 DPI typeset-quality display refreshing at 120 FPS on a screen larger than today’s, and comes up with a pixel resolution of 48k x 32k:

Of course, this is for situations where text and graphics hold still; I suspect that motion video requires less.
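For scale, here’s the arithmetic behind those numbers. This is just a back-of-the-envelope sketch; the implied screen size in inches is my own back-calculation from the DPI and pixel figures, not something stated in the essay:

```python
# Back-of-the-envelope math behind Nielsen's numbers; the screen size in
# inches is inferred from his figures, not quoted from the essay.
dpi = 1200                      # typeset-quality pixel density
fps = 120                       # refresh rate
width_px, height_px = 48_000, 32_000

width_in = width_px / dpi       # 40.0 inches
height_in = height_px / dpi     # ~26.7 inches
pixels_per_frame = width_px * height_px   # 1.536 billion pixels
throughput = pixels_per_frame * fps       # ~184 billion pixels per second

print(f'screen: {width_in:.1f}" x {height_in:.1f}"')
print(f"{pixels_per_frame / 1e9:.2f} Gpx per frame, {throughput / 1e9:.0f} Gpx/s")
```

At 120 FPS that works out to roughly 184 billion pixels per second, which gives a sense of why he framed it as a far-future target.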

When creating VFX or CGI films, studios usually render at 2K, or 4K tops. That means when you watch a movie on the big screen at the theatre, this is the resolution you’re viewing it at:

2048 x 1556 or 4096 x 3112

2K resolution, which looks great on the big screen (I believe this is what Pixar renders its movies at), is just a bit over HD. And you’re watching the movie on a screen the size of a small house.

Those have a width-to-height ratio of about 1.32 (roughly 4:3). Aren’t films usually rendered at something more in the range of 1.6 to 1.8?
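For comparison, here’s how those render targets stack up against 1080p HD, with pixel counts and aspect ratios. A quick sketch; the “full aperture” labels are my assumption about what those dimensions correspond to:

```python
# Pixel counts and aspect ratios for the quoted film render sizes
# versus 1080p HD. ("Full aperture" labels are my assumption.)
resolutions = {
    "HD 1080p": (1920, 1080),
    "2K full aperture": (2048, 1556),
    "4K full aperture": (4096, 3112),
}

for name, (w, h) in resolutions.items():
    print(f"{name:17} {w:>4} x {h:<4} {w * h / 1e6:5.2f} MP  ratio {w / h:.2f}:1")

# HD 1080p          1920 x 1080  2.07 MP  ratio 1.78:1
# 2K full aperture  2048 x 1556  3.19 MP  ratio 1.32:1
# 4K full aperture  4096 x 3112 12.75 MP  ratio 1.32:1
```

So 2K is only about 7% wider than HD, though the frame is taller. The 1.32:1 figure is the full camera aperture; what gets projected is typically that frame matted to about 1.85:1, or desqueezed to 2.39:1 for anamorphic releases, which is where the wider ratios come from.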

For 35mm color film, a rule of thumb I use is that the max resolution you can get out of a frame without bumping into grain is 3600 x 2400, or 2400 DPI. This should hold true for still or moving pictures. So that would be a practical max if 35mm movies are your standard.
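The arithmetic behind that rule of thumb, as I read it: a full-frame 35mm still negative measures 36mm x 24mm, and the quoted numbers treat the 24mm side as roughly an inch:

```python
# Check the 2400 DPI rule of thumb against actual 35mm frame dimensions.
# A full-frame 35mm still negative measures 36mm x 24mm.
MM_PER_INCH = 25.4
frame_w_mm, frame_h_mm = 36.0, 24.0
dpi = 2400

scan_w = frame_w_mm / MM_PER_INCH * dpi   # ~3402 px
scan_h = frame_h_mm / MM_PER_INCH * dpi   # ~2268 px
print(f"exact: {scan_w:.0f} x {scan_h:.0f} px at {dpi} DPI")
# The 3600 x 2400 figure rounds the frame up to 1.5" x 1", which is why it
# comes out a little higher than the exact conversion.
```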

Anamorphics introduce an asymmetric width/height factor, but the result is still limited by the grain size.
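To make that concrete, here’s a sketch of the common 2x squeeze case (the ~1.19:1 camera-aperture ratio is approximate):

```python
# A 2x anamorphic lens compresses the scene horizontally onto a nearly
# square 35mm frame; projection desqueezes it back to full width.
squeeze = 2.0
camera_aperture_ratio = 1.19    # approx. width/height of the anamorphic frame

projected_ratio = camera_aperture_ratio * squeeze
print(f"projected ratio: ~{projected_ratio:.2f}:1")   # ~2.38:1, i.e. 'Scope

# No extra detail is created: horizontal resolution is still capped by the
# same film grain, just stretched across a wider projected image.
```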

Some movies, even going back decades, were shot on 70mm, so those numbers would have to be doubled. (Doesn’t IMAX run 70mm film sideways for an even bigger frame?) In the future, if movies are shot digitally in the first place, the limit will be whatever the camera allows. It’s certainly natural that the trend would be toward more pixels, not fewer.