So how large is an uncompressed film?

Films on DVD and Blu-Ray are compressed. So, how many DVDs or Blu-Rays would an uncompressed film (say 90 mins) need? I can’t seem to find this information.

I’m not sure if anybody’s ever designed a completely uncompressed video format, but let’s work it out from first principles:

[ul]
[li]There’s 24 bits worth of color information in each pixel of each frame of DVD video[/li]
[li]A DVD frame is 720 pixels across, and 480 high.[/li]
[li]The US DVD frame rate is approximately 24 frames per second (Region 2 DVDs are closer to 30?)[/li]
[li]Multiply that by 60 seconds in a minute, and 90 minutes, and you get…[/li]
[/ul]

…a little over 125 gigabytes for the video alone. I’m not sure how much the uncompressed audio would take, but I think it would be fairly small in comparison.

For blu-ray, you’ll need to find the appropriate values for frame rate, pixel color depth, frame size in pixels, and substitute them in.
(Remember that there are 8 * 1024 * 1024 * 1024 bits in a gigabyte)
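For anyone who wants to check it, here’s the arithmetic above as a quick Python sketch (using the 24 fps, 720x480, 24-bit assumptions from the list):

```python
# Back-of-the-envelope size of 90 minutes of uncompressed DVD-resolution video.
BITS_PER_PIXEL = 24          # 8 bits each for red, green, blue
WIDTH, HEIGHT = 720, 480     # DVD (NTSC) frame size
FPS = 24                     # film frame rate
MINUTES = 90

total_bits = BITS_PER_PIXEL * WIDTH * HEIGHT * FPS * 60 * MINUTES
gigabytes = total_bits / (8 * 1024**3)   # 8 * 1024^3 bits per gigabyte, as above

print(f"{gigabytes:.1f} GB")   # prints 125.1 GB, i.e. a little over 125 gigabytes
```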

In the production diaries of the new Hobbit movie, it’s mentioned that they’re filming at a HUGE resolution, in 3D, at 60 frames per second.

I once multiplied all those numbers together, and the result was that one SECOND of uncompressed footage would be THREE gigabytes.

Does 3d mean that they actually need to store hundreds of different pixels along a Z axis for each frame? Or maybe just that every pixel needs extra information regarding its ‘Z depth’, but there’s still only one relevant pixel at each X-Y co-ordinate? The second would be much less data-expensive. I have no idea how the new 3D technologies work in terms of this stuff.

Filming in 3D simply means they have two cameras, one for the image for the left eye, the other for the right. So you end up with double the amount of data.

Of course, this means that the lenses of the cameras should be exactly as far apart as the eyes in a person’s head. Because that isn’t really feasible with today’s cameras, they use an intricate system with mirrors to make sure it works out anyway.

D1 and D5 videotapes are completely uncompressed, standard definition component digital formats. D2 is a completely uncompressed standard definition composite video format.

But as far as formats a consumer is going to find, yeah, there’s been no digital video format that didn’t have lossy compression.

[quote=“chrisk, post:2, topic:611566”]

[li]The US DVD frame rate is approximately 24 frames per second (Region 2 DVDs are closer to 30?)[/li][/quote]

US movie DVDs are actually encoded at 23.976 (24000/1001) frames per second, i.e. film rate. The conversion to 29.97 fps output (3:2 pulldown) is accomplished in the electronics of the player. And PAL format (Region 2) discs are 25 fps.

An uncompressed 24-bit image at 720 x 480 is .98 megabytes.
An uncompressed 24-bit image at 1920 x 1080 is 5.93 megabytes.

An uncompressed 2-hour DVD would be 169,344 megabytes, or about 169 gigs.
An uncompressed 2-hour BluRay would be 1,024,704 megabytes, or just over a terabyte.
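Same numbers in Python (binary megabytes, 24 fps, 24-bit color; the 2-hour totals come out slightly higher than the figures above because those multiplied the already-rounded per-frame sizes):

```python
def frame_mb(width, height, bits_per_pixel=24):
    """Uncompressed frame size in (binary) megabytes."""
    return width * height * bits_per_pixel / 8 / 1024**2

def film_mb(width, height, hours=2, fps=24):
    """Uncompressed film size in megabytes."""
    return frame_mb(width, height) * fps * 3600 * hours

print(f"DVD frame:     {frame_mb(720, 480):.2f} MB")      # 0.99 MB
print(f"HD frame:      {frame_mb(1920, 1080):.2f} MB")    # 5.93 MB
print(f"2-hour DVD:    {film_mb(720, 480):,.0f} MB")      # 170,859 MB (~167 gigs)
print(f"2-hour BluRay: {film_mb(1920, 1080):,.0f} MB")    # 1,025,156 MB
```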

But the thing is, movies shoot in a much higher color depth than 24 bit, at much higher resolution.

Now I’m wondering how you get the kind of resolution you need to fill a full-size movie screen and have it look good.

Virtually all the theaters I go to these days use Sony “4K” digital projectors. They use a number of different aspect ratios, but the maximum resolution is 4096 × 2160.

Ok, well I did some research and yer standard IMAX screen is 22 meters, or about 72.2 feet, wide. A little more math reveals that at 4096 pixels across, each pixel must be roughly one fifth of an inch wide (actually about 0.21 inches, so very close). On my home computer screen that would be pretty freaking bad resolution, but I guess at the distance you sit from the IMAX screen, it works out just fine. So they use a fairly high but not ungodly high resolution, and just bump up the size of the pixels. By comparison, getting standard Web resolution of 72 pixels per inch on an IMAX screen would require about 62,400 pixels across.
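The napkin math, for anyone who wants to fiddle with screen sizes (assuming a 22-meter-wide screen and a 4096-pixel-wide projector):

```python
SCREEN_WIDTH_M = 22.0     # standard IMAX screen width, per the post above
INCHES_PER_M = 39.3701
H_PIXELS = 4096           # 4K projector, horizontal pixels

width_in = SCREEN_WIDTH_M * INCHES_PER_M   # screen width in inches (~866)
pixel_in = width_in / H_PIXELS             # width of one projected pixel
pixels_at_72ppi = width_in * 72            # pixels needed for 72 ppi "Web resolution"

print(f"{pixel_in:.3f} in/pixel; {pixels_at_72ppi:,.0f} pixels at 72 ppi")
# prints 0.211 in/pixel; 62,362 pixels at 72 ppi
```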

Interesting coincidence, that the standard size of an IMAX screen is 72 feet, and standard Web resolution is 72 dots per inch. Probably the Trilateral Commission … AGAIN!

DVDs are stored at 24hz? Is there any way to play them back at this rate? My PS3 will play Blurays at 24hz, but DVDs only come out at 60hz.

It’s a moot point, since I find I prefer 60hz anyway when watching Blurays, but I’m just curious from a technical standpoint.

hmm… 72 = 3 squared times 2 cubed…

Just a guess: It could be done now by a DVD player with appropriate firmware and hardware, since we can now upconvert and use digital outputs, but there’s not a lot of interest in doing so, since the people who are really picky about quality would be using blu-rays anyway. DVD was done that way so as not to waste space with redundant information, not to allow devices to display film at 48, 72, or 120 hz, since such devices didn’t exist at the time.

Yep. I use a computer to do so, setting the scan rate to 72 hz and showing each frame three times. Works great. If my display could handle it, I’d go to 120 hz, which is nice because it is a multiple of both 24 hz and 30 hz. Presumably there are stand-alone players that handle this automatically when hooked up to a display that will let the player know via the HDMI connection that it can accept a 120 hz display rate.
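The “nice multiple” reasoning is easy to check: 120 is the least common multiple of 24 and 30, so both film and NTSC video divide into it evenly. A two-line sketch:

```python
from math import gcd

def lcm(a, b):
    """Least common multiple: smallest rate both frame rates divide evenly."""
    return a * b // gcd(a, b)

# A display rate that shows both film (24 fps) and NTSC video (30 fps)
# with a whole number of repeats per frame:
print(lcm(24, 30))   # prints 120 (film frames shown 5x each, video frames 4x)
```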

Exactly! How can it not be a conspiracy! Numbers don’t lie!

Actually it’s being filmed at 48fps.

Well, RED is compressed “visually lossless”, so it’s quite a bit less than that, but if it’s uncompressed at 5k (I believe that’s 5120x2700) @ 14bit, my napkin calculations yield 14bits * 3 * 5120 * 2700 = 580mb/frame or 48fps * 580mb * 2 cameras = ~55gb/sec uncompressed. Not sure if that’s accurate, but it seems in the ballpark.

And just think: RED plans to offer a 28k sensor (Epic 617) that will do 28000 x 9334, which my napkin calculator yields as 10gb/frame. I’ll leave the gb/sec as an exercise for the student.

Yep. Although calling this “3D” makes me crazy - it’s stereoscopic, like a pair of binoculars, not “3D”.

Looking back at this, I see I screwed up bits for bytes using 14 * 3 as a number of bytes per pixel where it’s really 14 * 3 bits per pixel. So to get the bytes per second the whole thing needs to be divided by 8. My napkin then yields 72.5mb/frame, or 6.9gb/sec for shooting 2-camera 48fps in the Hobbit film. and for single camera RED Epic 617, the napkin now says 1.3gb/frame.

After some googling, I see that last number is nearly but not quite right, with the Red Epic 617 quoted as 1.46 GB per frame. Apparently my new number is a little low because RED puts those 42 bits per pixel into 6 bytes, wasting 6 bits per pixel. My napkin backs me up: 6 bytes/pixel * 28000 * 9334 does indeed equal 1.46 GB/frame. If we apply that correction back to the Hobbit numbers, that’s 6 * 5120 * 2700 = 79mb/frame, and 79mb * 48fps * 2 cameras gives a total bandwidth of about 7.4gb/sec.
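To keep the napkin honest, here’s that corrected arithmetic in Python (assuming, per the figures above, 42 bits of raw sensor data padded out to 6 bytes per pixel):

```python
BYTES_PER_PIXEL = 6   # 42 bits of sensor data stored in 48 bits (6 bits wasted)

def frame_gb(width, height):
    """Uncompressed raw frame size in (binary) gigabytes."""
    return width * height * BYTES_PER_PIXEL / 1024**3

epic_617 = frame_gb(28000, 9334)        # ~1.46 GB/frame for the 28k sensor
hobbit = frame_gb(5120, 2700)           # ~0.077 GB/frame, i.e. ~79 MB
hobbit_bandwidth = hobbit * 48 * 2      # two cameras at 48 fps

print(f"{epic_617:.2f} GB/frame, {hobbit_bandwidth:.1f} GB/s")
# prints 1.46 GB/frame, 7.4 GB/s
```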

Phew.

Please enlighten me as to the difference. Isn’t a stereoscopic image interpreted in the brain as 3D?

Sure, if you want to look at it that way. But real life is more “3D” than that. If a film were truly “3D”, I could stand at different points in the theater and see past objects that other people in the theater could not, because the viewpoint is different. I could choose what to focus my eyes on, and things in front of or behind that object would be out of focus. I’m not a film “3D” detractor by any means, I mostly like it. I just object to the nomenclature.

It would be negligible. If 80 minutes of uncompressed, CD-quality, stereo audio can fit on a 700MB CD, you’ve got less than 1 gig for 90 minutes, unless there is significant overhead combining it with video. So less than 1% of the video storage is needed for audio, and no more than 4% for 7 channels of surround sound.
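CD-quality audio is 44,100 samples per second at 16 bits per channel, so the comparison is easy to run (the 7-channel figure here just scales CD-quality stereo up, as a rough upper bound):

```python
SAMPLE_RATE = 44100   # samples per second, CD quality
BIT_DEPTH = 16        # bits per sample per channel
MINUTES = 90

def audio_mb(channels):
    """Uncompressed PCM audio size in (binary) megabytes for the film length."""
    bits = SAMPLE_RATE * BIT_DEPTH * channels * 60 * MINUTES
    return bits / 8 / 1024**2

print(f"stereo:    {audio_mb(2):.0f} MB")   # ~908 MB, under a gig
print(f"7-channel: {audio_mb(7):.0f} MB")   # ~3,180 MB, a few percent of the video
```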

Video resolution is increasing faster than audio. Since our ears aren’t getting any better, about the only improvement left in audio would be more channels, not more bits per channel.

Since I haven’t been in a movie theater for 25 years, tell me – is that the way 3D movies work nowadays? Or are they effectively only stereoscopic?