Will 60 fps movies fit on Blu-ray?

What about 48 fps? If not, how are they going to release ‘The Hobbit’?

Depending on compression, up to 9 hours of 1080 HD footage can fit on a dual-layer Blu-ray. By tweaking the settings (which are adjustable over a wide range), they can fill it up to the brim, which would also accommodate the increase in frame rate.
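For a rough sense of the arithmetic (the ~50 GB figure is the dual-layer capacity; the bitrates are just assumed examples, not numbers from any spec):

    # Rough running-time arithmetic for a dual-layer Blu-ray (~50 GB).
    # The average bitrates below are assumptions for illustration only.
    capacity_bits = 50e9 * 8                     # ~50 GB in bits

    for avg_mbps in (12, 20, 30):                # assumed average video bitrates
        hours = capacity_bits / (avg_mbps * 1e6) / 3600
        print(f"{avg_mbps} Mbit/s -> about {hours:.1f} hours")
    # ~12 Mbit/s gives roughly 9 hours; a higher frame rate needs more bits per
    # second at the same quality, so it trades off against running time.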

Not sure if they’ll be releasing it at 48 fps on Blu-ray anyway, as it’s not a standard rate that players and TVs can necessarily cope with. Or maybe they can; I’m not sure.

The disc itself can probably hold the data (as GuanoLad said), but the software format (and thus the players) has limits that are supposed to adhere to the official specification. Wikipedia lists the supported video formats (citing the Blu-ray Disc Association):



Resolution	Frame rate	Aspect ratio
1920×1080	29.97i	16:9
1920×1080	25i	16:9
1920×1080	24p	16:9
1920×1080	23.976p	16:9
1440×1080	29.97i	4:3
1440×1080	25i	4:3
1440×1080	24p	4:3
1440×1080	23.976p	4:3
1280×720	59.94p	16:9
1280×720	50p	16:9
1280×720	24p	16:9
1280×720	23.976p	16:9
720×480	29.97i	4:3 or 16:9
720×576	25i	4:3 or 16:9


So 60 fps is OK if you downconvert to 720p. 48p probably has to become 24p or 60p. Realistically, they’ll probably release The Hobbit at 1080p/24 fps and nobody but a niche audience will ever notice.
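As a rough illustration of checking a combination against that table (a Python sketch using only the HD rows; not an authoritative copy of the spec):

    # A subset of the Blu-ray video modes from the table above, as
    # (width, height, fps, scan) tuples. Sketch only, not the full spec.
    BD_MODES = {
        (1920, 1080, 29.97, "i"), (1920, 1080, 25, "i"),
        (1920, 1080, 24, "p"),    (1920, 1080, 23.976, "p"),
        (1280, 720, 59.94, "p"),  (1280, 720, 50, "p"),
        (1280, 720, 24, "p"),     (1280, 720, 23.976, "p"),
    }

    def supported(width, height, fps, scan):
        return (width, height, fps, scan) in BD_MODES

    print(supported(1920, 1080, 48, "p"))    # False: no 1080p/48 mode
    print(supported(1280, 720, 59.94, "p"))  # True: ~60 fps works if you drop to 720p
    print(supported(1920, 1080, 24, "p"))    # True: the likely release format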

As for TVs, most of them run at 50/60 Hz (Europe/U.S.), with newer models capable of 120 Hz and even higher. Of course, the frame rate should be some submultiple of the refresh rate to avoid interference (tearing) between the two. As I understand it, movies that run at 24 fps in the theater are altered to run at 30 fps* on TV, at least on older TVs; newer TVs may not have any problem, like a computer monitor that can run at multiple resolutions and refresh rates.

*fps and refresh rate are different things; older TVs had a refresh rate of 60 Hz but a frame rate of 30 Hz because they interlaced two fields, a holdover from early TVs and transmitters which couldn’t handle the higher video frequencies/bandwidth needed for non-interlaced/progressive scanning.
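For reference, the usual way that 24 → 30 fps conversion was done for NTSC is 3:2 pulldown, where alternate film frames are held for two and three fields. A minimal sketch of the cadence (just illustrative):

    # 3:2 pulldown sketch: 24 film frames/s -> 60 interlaced fields/s (30 fps TV).
    # Alternate film frames are held for 2 and 3 fields, so 4 film frames -> 10 fields.
    def pulldown_32(frames):
        fields = []
        for i, frame in enumerate(frames):
            for _ in range(2 if i % 2 == 0 else 3):   # cadence: 2, 3, 2, 3, ...
                fields.append(frame)
        return fields

    one_second_of_film = [f"film frame {n}" for n in range(24)]
    fields = pulldown_32(one_second_of_film)
    print(len(fields))        # 60 fields, i.e. 30 interlaced TV frames per second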

The next generation of displays is 4K (3,840 × 2,160 resolution). Sony is soon releasing a $25,000 84-inch 4K display, and a hands-on demo showed that The Amazing Spider-Man was a 56.4 GB file.

The Hobbit is going to be released at this resolution.

No matter how it’s done, some form of pulldown will have to happen to get it to play on an HD set. Personally, I would have taken that into account and filmed at 50 or 60 Hz. As for the Blu-ray specifications, they can be changed. The beauty of Blu-ray is that you can do firmware updates on the fly, rather than having to get new hardware. If he’s going to insist theaters upgrade to be able to play the video, then it should be even easier on Blu-ray.

Sure, just to be safe, you’d go ahead and release a 1080p version at 25/30 Hz and possibly 720p at 50/60 Hz, but there should definitely be a better version. Heck, it wouldn’t surprise me if the 4K SuperHD videos get released on Blu-ray.

Maybe he did it so that the 24 Hz downconversion would be easier. Or maybe it’s just a recording equipment/camera issue.

How does BD handle 3D? Can’t they just encode it as a 3D file and watch it without glasses?

You don’t need to do pulldown anymore. A lot of mid-range to high-end HDTVs or projectors will accept a 1080p/24 signal and multiply it by 5 to display it without pulldown. The 120 Hz that decent sets do isn’t a number they picked out of a hat; it’s the least common multiple of 24 and 30, so you don’t need to monkey with the frame rate whether it’s TV or Blu-ray. That works out well for The Hobbit too, since you can just drop every other frame and get a format that’s displayable on home equipment without reintroducing the motion judder we’ve finally ditched.
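A quick sanity check of that arithmetic (just illustrative):

    # 120 Hz as the least common multiple of 24 and 30, and what that buys you.
    import math

    refresh = math.lcm(24, 30)
    print(refresh)                         # 120
    print(refresh // 24, refresh // 30)    # each 24 fps frame shown 5x, each 30 fps frame 4x
    print(refresh // 60)                   # a 60 fps source fits too: each frame shown 2x

    # And 48 fps -> 24 fps by dropping every other frame, as described above:
    frames_48 = list(range(48))            # one second of hypothetical 48 fps footage
    frames_24 = frames_48[::2]
    print(len(frames_24))                  # 24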

As to the original question, if The Hobbit won’t “fit” on a single Blu-ray, they’ll just put it on two, like they did with the extended LOTR.

IIRC the original 480i was interlaced because the phosphor persistence was too short for a moving picture. Doing 480p at 30 fps would make the screen flicker noticeably: as the lower part of the screen is being drawn, the top would be fading out, and vice versa. Instead, they drew the full height of the screen, but only every other line, every 60th of a second.

By all means, fight my ignorance if I’m wrong, but isn’t a regular DVD 480p at 30fps?

That may be true, but it is effectively displayed at 60 fps (showing the same frame twice), since TVs, unlike computer monitors, are fixed-frequency, or at least older CRT TVs were. You need a bunch of fancy circuitry for multi-frequency operation, and no monitor I know of went down to a 30 Hz refresh, which would look really bad (I still use a CRT monitor on my computer set to 85 Hz; at 60 Hz, flicker is noticeable at times). Note that LCD monitors don’t have any issue with persistence or different scan frequencies (though they still have H and V sync oscillators that must lock to the signal, with some lock range, say ±10%), so they can display 30 fps natively, but they still refresh at 60 Hz (or higher) too.

Incidentally, movie projectors do something similar so they can get by with a relatively low frame rate: each film frame is flashed two or three times, so the effective refresh rate is 48 or 72 Hz even though there are only 24 unique frames per second.

The P in 480p stands for “progressive scan”, which means every single row of pixels is in the image data.

If it were interlaced, it would be 480i, the “i” standing for interlaced. That holds for all the HD resolutions: 1080p and 720p are progressive, but 1080i is interlaced.

Anyway, my understanding is that standard DVDs are always 480i; if your DVD player is outputting 480p, that’s because it’s “upscaling” (meaning, it’s filling in the missing lines with an algorithm of some sort). I could be wrong.

480i and 480p have the same number of lines per frame. 480i can be made into 480p by storing the first field (half the frame’s lines) and displaying its lines interleaved with the lines of the second field. This does not require generating any interpolated lines, and it avoids the comb effect when both fields come from the same original frame.
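A minimal sketch of that weave (array sizes are just illustrative; it assumes both fields come from the same original frame, as with film-sourced material):

    import numpy as np

    # Weave deinterlacing sketch: interleave two 240-line fields into one
    # 480-line progressive frame. No interpolated lines are generated.
    top_field = np.zeros((240, 720))       # holds frame lines 0, 2, 4, ...
    bottom_field = np.ones((240, 720))     # holds frame lines 1, 3, 5, ...

    frame = np.empty((480, 720))
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    print(frame.shape)                     # (480, 720): a full 480p frame
    # If the two fields were captured at different instants (true interlaced
    # video), motion between them shows up as comb artifacts instead.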

I think you might have misread my post. Whether interlaced or progressive, the refresh rate of a TV is always the same, 60 Hz. Interlaced scan displays half a frame in every 1/60 second interval, while progressive scan displays one full frame in the same period, with two identical frames in 1/30 of a second, so the actual frames-per-second rate in either case is 30. (A 120 Hz TV would just double these numbers, but the actual fps stays at 30; advanced TVs can interpolate entire frames to create smoother motion.)

Anyone remember slow-scan ham radio TV? The frame time was 8 seconds (not 1/8 of a second) IIRC. You could see a very bright line work its way down the screen. Even at low resolution, the bandwidth only permitted that slow a transmission frame rate.

The original North American transmissions were 30 fps 480i (NTSC). The station transmitted the odd lines, then the even lines, because the engineers found the old CRT glass TV tubes of the ’50s would flicker visibly at 30 fps full-frame, so that was the standard decided on. Until the days of fancy electronics and flat-screen TVs, that’s how most TV worked.

Today, we have cheap electronics and can store a frame or several, so players and HD stations send 480p, 720p, 1080i/p - whatever format the TV will take. LCD/plasma sets can handle several formats and display them as required, even bump them up to 1080p if set to do so.

There’s a whole body of technique for converting old 24 fps films into 30 fps television and figuring out what works best. But that’s another topic.

The main reason for this was the limitations of vacuum tubes; as CRT computer monitors showed, there isn’t any reason you can’t use far higher resolutions. Those early TVs were limited to video bandwidths of a few megahertz (compared to 200 MHz or more for high-resolution CRT monitors; the highest-definition displays today even exceed the capabilities of a single video cable, necessitating multiple cables), and progressive scan requires twice the video bandwidth, which would also have needed higher transmission frequencies. Also, the refresh rate was tied to the local power-line frequency, because early power supplies weren’t regulated much, if at all, and had significant power-line ripple (that’s also the reason for overscan, so low line voltage wouldn’t shrink the picture past the edges of the screen). Making the refresh rate the same as the line frequency (actually not quite the same) reduces beat interference between the two: hum bars moving down the screen are much less distracting if they drift slowly.
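A back-of-the-envelope version of that bandwidth comparison (the horizontal-resolution figure is an assumption; one cycle of analog video can carry two alternating pixels):

    # Rough video-bandwidth estimate: interlaced 480-line/30 fps vs. progressive/60 fps.
    lines = 480                   # visible lines per frame
    pixels_per_line = 440         # assumed resolvable horizontal pixels
    fps = 30                      # frames per second (interlaced, 60 fields)

    pixel_rate = lines * pixels_per_line * fps      # pixels per second
    bandwidth_mhz = pixel_rate / 2 / 1e6            # two pixels per cycle
    print(f"interlaced, 30 fps:  ~{bandwidth_mhz:.1f} MHz")
    print(f"progressive, 60 fps: ~{2 * bandwidth_mhz:.1f} MHz")   # double the bandwidth
    # Blanking intervals are ignored, but this lands in the right ballpark:
    # NTSC broadcast luma was limited to about 4.2 MHz.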