High-end Blu-ray disc displayed on full-sized theatre screen. How does it look?

What would the highest-end Blu-ray disc for home entertainment systems look like displayed on a full-sized movie theatre screen? Acceptable or awful?

Depends upon what you mean by highest-end Blu-ray. Normal Blu-ray discs contain video at 1080p, whilst the disc format can manage 2160p (aka 4k), and there are now players just coming to market that support that too. Discs with 2160p content are like hen’s teeth, and it remains to be seen whether the format takes off.

A full 1080p image projected on a screen at accepted cinema viewing angles is remarkably good. “Acceptable” would be understating the quality. A 2160p image would be considered cinema quality, even for quite critical viewing.

However, another benefit of high-resolution Blu-ray is a wider colour gamut. Newer screens can provide a wider gamut, and now the discs are going to be able to take advantage of that. That may actually be a bigger visual improvement than the resolution.

I’ll take a stab at this…

1080p is 1920 x 1080, or a bit over 2 Mpixels. Most theaters showed 35 mm film when they were using film. A few showed 70 mm. Let’s consider 35 mm.

A decent 35 mm positive has a grain size equivalent to around 20 Mpixels. So the Blu-ray image would look very soft and nowhere near as sharp as 35 mm film. This would be obvious on even a modestly sized theater screen.

You are assuming that the film is shown at its maximum possible magnification.

Comparing film grain size to digital pixels is not very accurate. Grain size gives film a maximum theoretical resolution that is not always met: grains blend into clumps, and the original equipment (film stock, lighting, lenses) usually does not allow maximum resolution. The chemical and optical processing can reduce the resolution further, and then the projection equipment reduces it again. Actual resolution can be ten times less than what the film is theoretically capable of.

70 mm IMAX can be equivalent to 8k digital. Most digital theaters use 2k or 4k. 2k is roughly twice the resolution of the 1080p at which you usually watch your Blu-ray movies at home. So, considering the distance you sit from the movie screen and the fuzziness of the projection system, you might find a 1080p movie to be pretty good in a theater. But if you could see a split-screen comparison of 1080p, 2k, and 4k, you would notice the difference. I can increase the resolution quite a bit by cleaning my glasses, too.

2k is 2048×1080, isn’t it? 1080p is 1920×1080, so there’s very little difference in resolution.

It would probably be tough to spot, even side by side. They probably just boosted the horizontal resolution for the really widescreen movie formats, like Panavision.

Yes, 2k/4k is a confusing pair of terms. 1080p is essentially 2k. The 2k refers to the long edge, not the short edge, whilst 1080/2160 refers to the short edge. 2k is about 2 Mpixels, 4k about 8 Mpixels.
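
For anyone who wants to check the arithmetic, here’s a quick Python back-of-the-envelope; the format names and dimensions are just the ones discussed above:

```python
# Pixel counts for the formats discussed in this thread.
formats = {
    "1080p (Blu-ray)":     (1920, 1080),
    "DCI 2k":              (2048, 1080),
    "UHD 4k (4k Blu-ray)": (3840, 2160),
    "DCI 4k":              (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name:21s} {w} x {h} = {w * h / 1e6:.2f} Mpixels")
```

That gives roughly 2.07, 2.21, 8.29, and 8.85 Mpixels respectively, so “2k is 2 Mpixels, 4k is 8 Mpixels” holds up.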

A still 35mm image might get you close to 20 Mpixels on a very, very good shot. A 35mm still frame is 24 x 36 mm. A 35mm motion-picture frame is about half that area: they run the film the other way around and usually allocate space for the soundtrack, leaving anything from 22 x 19 mm to 24 x 18.6 mm (the latter only if an anamorphic lens is used). So you are down to half already. In reality, even a very carefully set-up shot on the film stock you would actually use most of the time (i.e. not a slow, super-fine-grain stock) gets you closer to half the maximum resolution you could hope for. So halve it again: down to 5 Mpixels, which is less than a 4k video image (8 Mpixels).
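
Here’s that halving argument as a quick Python sketch; the 20 Mpixel starting point and both halving factors are the estimates from the paragraph above, not measured values:

```python
# Rough effective-resolution estimate for 35 mm motion-picture film,
# following the halving argument above. All factors are estimates.
still_35mm_mpix = 20.0                   # very good 35 mm still image

movie_frame   = still_35mm_mpix / 2      # movie frame is ~half the still frame area
typical_stock = movie_frame / 2          # everyday stock, not slow fine-grain film

print(f"Estimated effective 35 mm movie resolution: {typical_stock:.0f} Mpixels")
print(f"4k video for comparison: {3840 * 2160 / 1e6:.1f} Mpixels")
```

About 5 Mpixels versus about 8.3 Mpixels: 4k video edges out typical 35 mm in practice.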

However it gets worse. A motion picture is not a still image. The amount of motion blur in motion pictures is quite high. The only way you get the high resolutions with a still camera is with either very fast shutter speeds to reduce movement blur (and then pay the price of a reduction of depth of field) or lock the camera solidly on a tripod, and often lock the mirror up. None of this is an option with a movie camera.

In a theatre the biggest determinant of your useful resolution is going to be the angle subtended by the screen; THX makes specific recommendations about the useful range. In the end, 2k is actually pretty good, and most people are quite happy with it. 4k is noticeably better, but not so much that you would really worry about it.
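
To put rough numbers on the viewing-angle point, here’s a sketch assuming a 36-degree horizontal viewing angle (a commonly cited THX figure) and the usual ~60 pixels-per-degree limit of 20/20 vision; both figures are assumptions for illustration:

```python
# Pixels per degree for a given horizontal resolution and viewing angle.
# Assumptions: 20/20 vision resolves roughly 60 pixels per degree
# (1 arcminute per pixel); 36 degrees is a commonly cited THX viewing angle.
ACUITY_PPD = 60       # approximate limit of 20/20 vision
VIEW_ANGLE = 36.0     # degrees subtended horizontally by the screen

for name, h_pixels in [("DCI 2k", 2048), ("DCI 4k", 4096)]:
    ppd = h_pixels / VIEW_ANGLE
    verdict = "at or below" if ppd <= ACUITY_PPD else "beyond"
    print(f"{name}: {ppd:.0f} pixels/degree ({verdict} the ~{ACUITY_PPD} ppd acuity limit)")
```

At that angle 2k sits right at the acuity limit (about 57 pixels/degree) and 4k lands well beyond it, which lines up with “2k is pretty good, 4k noticeably better but not worth worrying about.”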

I have a 1080p projector (a venerable old Barco 808s) that projects a 100" diagonal image. It produces a very cinematic experience, and with a well-authored Blu-ray it is more than satisfactory. Can I see resolution limitations? Yes, but one needs to actually go looking for them. Just sitting back and enjoying the movie, you never notice.

As someone who’s seen a few films played from either DVD or Blu-ray players in theaters: the DVDs looked absolutely awful, but the Blu-rays were as crisp as you could get for normal viewing. I didn’t see anything that made it obvious it wasn’t film, except that when the film ended it went straight to a Blu-ray main menu, much to the amusement of everyone in the audience.

As a frat boy 15-20 years ago, we used to use some Philips device (480p?) to put on a kind of cinema on a neighbouring block of flats (a socialist-era block with one windowless side). Looked just fine to me.

Doesn’t it have a lot to do with the screen size and the viewing distance? ISTR that there are various charts that give the appropriate viewing distances for HDTVs at various resolutions, so I’d think the same kind of thing would apply to cinema screens.
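
Those charts mostly boil down to one-pixel-per-arcminute geometry. Here’s a minimal sketch of the same calculation; the 1-arcminute acuity figure and the 12 m screen width are assumptions for illustration:

```python
import math

def max_sharp_distance(screen_width_m, horiz_pixels, arcmin_per_pixel=1.0):
    """Farthest viewing distance at which adjacent pixels can still be
    resolved, assuming 20/20 vision resolves about 1 arcminute."""
    pixel_m = screen_width_m / horiz_pixels
    return pixel_m / math.tan(math.radians(arcmin_per_pixel / 60.0))

# Example: a 12 m wide cinema screen (an assumed size, for illustration).
for name, px in [("480p", 854), ("720p", 1280), ("1080p", 1920), ("UHD 4k", 3840)]:
    print(f"{name:7s} stays pixel-sharp out to about {max_sharp_distance(12, px):.0f} m")
```

On that assumed screen, 1080p holds up to roughly 21 m back while 4k only matters within about 11 m, which is exactly why the answer depends on where you sit.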

By “highest end” Blu-ray, that technically means UHD 4k (3840 x 2160). There are lots of 4k Blu-ray titles, but most theaters don’t even have 4k projection equipment.

A consumer 4k player can probably downscale to 2k, so in theory you could feed that into the theater’s 2k projector. 4k Blu-ray runs at 64 to 128 megabits/sec, encoded using H.265/HEVC at 10-bit color depth and 4:2:0 chroma sampling: Chroma subsampling - Wikipedia

Most theaters receive DCI 2k (2048 x 1080) Digital Cinema Packages encoded using JPEG 2000 (essentially a collection of stills): Digital Cinema Package - Wikipedia

This is much less compressed than H.265 or H.264. I think the DCI 2k spec is 12-bit color depth and 4:4:4 chroma sampling.
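
To put numbers on “much less compressed,” here’s a rough comparison of the raw data rates behind each format. The 250 Mbit/s DCI ceiling is a commonly cited spec maximum, and 24 fps for both streams is also an assumption here:

```python
# Raw (uncompressed) data rates for the two delivery formats above, to show
# how much harder the Blu-ray is compressed. Figures are illustrative.
def raw_mbps(width, height, bits_per_sample, samples_per_pixel, fps=24):
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e6

# 4:2:0 carries 1.5 samples per pixel (full-res luma, quarter-res chroma);
# 4:4:4 carries 3 samples per pixel (full-res luma and chroma).
uhd_raw = raw_mbps(3840, 2160, 10, 1.5)   # UHD 4k Blu-ray: 10-bit, 4:2:0
dci_raw = raw_mbps(2048, 1080, 12, 3.0)   # DCI 2k DCP: 12-bit, 4:4:4

print(f"UHD 4k raw: {uhd_raw:,.0f} Mbit/s -> ~{uhd_raw/128:.0f}:1 at 128 Mbit/s (HEVC)")
print(f"DCI 2k raw: {dci_raw:,.0f} Mbit/s -> ~{dci_raw/250:.0f}:1 at 250 Mbit/s (JPEG 2000)")
```

That works out to roughly 23:1 for the Blu-ray at its peak bitrate versus roughly 8:1 for the DCP, so the Blu-ray is compressed about three times harder even in the best case.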

So your question is actually more complicated than it first appears. A 4k Blu-ray can definitely supply more resolution than most theater projectors can handle, and if you could switch back and forth between a 4k Blu-ray projected at 4k and a DCI 2k source projected at 2k, the 4k Blu-ray would probably look better to most people despite DCI’s greater color depth, if they noticed a difference at all.

For a 2k DCI projector, if the 4k Blu-ray were downsampled to 2k, in theory the DCI package might look better due to having 12 bits per channel of color depth and 4:4:4 chroma. However, I doubt that in reality most people could see the difference.

For a 4k theater using DCI 4k (4096 x 2160) content, in theory that should look better than a 4k Blu-ray fed to the same projector, since the resolution is a little higher and the color depth and chroma sampling are better. However, I doubt that at typical viewing distances many (or any) viewers could see the difference.

Some codec-stressful scenes, such as rapid motion or flash strobes going off, might conceivably reveal imperfections due to Blu-ray’s H.265 encoding. E.g., it might momentarily show pixelation or macro-blocking, especially on a single-color gradient. By contrast, DCI uses JPEG 2000, which is purely intra-frame (it doesn’t compress across frames), so there is virtually no chance of these artifacts.

I believe that the first digital cinemas were 720p. Parts of Star Wars: The Phantom Menace were shot at 720p. Attack of the Clones was shot at 1080p, but I think most digital theaters projected it at 720p.

Everything shot and broadcast by ABC, Fox, and ESPN is in 720p, and I doubt most people notice the difference between that and the 1080i content from CBS and NBC.

Re 35mm cinema film resolution: this varies widely, since various film stocks are used. E.g., for the indoor scenes of Barry Lyndon, Stanley Kubrick used the fastest available 35mm film (I think ASA 400 at the time) and push-processed it to 800. The resolution wasn’t that great; I doubt it was close to 2k. But it was the first film with scenes shot using only candlelight, and doing that required a NASA/Zeiss lens costing millions of dollars: Carl Zeiss Planar 50mm f/0.7 - Wikipedia

Besides the varying film stocks, cinema film is color negative, so it must be printed for projection, which degrades resolution. By contrast, 35mm slide film is color positive (aka color reversal, or transparency) film: the exposed film itself becomes the image, without any generational loss. Kodachrome 25 slide film might have produced 3500 x 2300 resolution, but that does not apply to color negative cinema film.

I heard that the Olympics are being broadcast in Japan in 8k.