Cinematographers, help! (HD Format Question)

I’m shooting a short film next month for a class project. The film will be shot in HD. I put an ad out, and the DPs who responded are asking me what HD format I want to shoot in.

I went to film school decades ago, and I’ve only shot film. I haven’t really kept up with the video technology. When the DPs ask me about the HD format, what exactly are they asking me? Are there benefits or drawbacks for specific formats?

For my film, we’re shooting all interior locations. There are some day shots and some night shots. There won’t really be any ECUs. There are two shots which will use a zoom lens. The screenings will be done on video projectors. I would like the option to print out to film, though.

Any suggestions as to what format works best? Or what the benefits or drawbacks of the different formats are? Thanks for your help.

I’m going for a sleep study soon (my doctors think I have sleep apnea), so if I don’t respond right away to a post, please be patient. Thanks.

HD video was defined by a committee, so there are several variations.

720P (1280 x 720, 60 progressive frames per second)
1080i (1920 x 1080, 60 interlaced fields per second)

…are the two main ones. But the camera makers introduced variations. The main one was to allow 24-frame-per-second video. This means you can shoot on HD and have an exact frame match for 24-frame-per-second film. This is meaningless if the project is never going to be shown in a theater. Some people like the look even if the end market is TV or DVD…they wrongly claim it “looks like film”. No, it looks like “film shown on TV” and introduces motion artifacts like “judder”. Canon also has a 1080 mode at 30 progressive frames per second.
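If it helps to see those variants side by side, here’s a rough back-of-envelope sketch (nominal numbers only, and only the variants mentioned above; real cameras differ):

```python
# Nominal specs for the variants mentioned above, and how cleanly each
# maps onto 24 fps film. Back-of-envelope only; real cameras vary.

FILM_FPS = 24

variants = {
    # name: (width, height, pictures per second, "frames" or "fields")
    "720p60":  (1280, 720, 60, "frames"),
    "1080i60": (1920, 1080, 60, "fields"),   # 60 fields = 30 full frames
    "1080p24": (1920, 1080, 24, "frames"),   # the 24p variation
    "1080p30": (1920, 1080, 30, "frames"),   # the Canon mode mentioned above
}

for name, (w, h, rate, unit) in variants.items():
    per_film_frame = rate / FILM_FPS
    note = "exact frame match" if per_film_frame.is_integer() else "needs pulldown"
    print(f"{name:8s} {w}x{h:<5d} {per_film_frame:4.2f} {unit}/film frame -> {note}")
```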

I know you’d like to preserve the option to go to film, but 24 fps introduces a lot more complexity with little benefit. If you shoot 1080i or 720P you’ll have a much wider range of editing and display options, and the end result can be shown on DVD or Blu-ray. And honestly, most festivals are showing short films on video these days. If you shoot 24 fps, it will be converted to 60 fields for those screenings anyway, so it makes more sense to shoot in that format in the first place.
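For what it’s worth, here’s a toy sketch of what “converted to 60 fields” actually means: the standard 2:3 pulldown cadence that spreads 24 film frames over 60 video fields each second (purely illustrative, not anything a real deck runs):

```python
# Toy illustration of 2:3 ("3:2") pulldown: each group of 4 film frames
# is spread across 10 video fields, so 24 frames/s becomes 60 fields/s.
# The uneven 2-3-2-3 repetition is the source of the "judder" mentioned above.

def pulldown_2_3(film_frames):
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(film_frames):
        for _ in range(cadence[i % 4]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# 4 film frames -> 10 fields; scaled up, 24 frames -> 60 fields per second.
```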

IANADC (Digital cinematographer), but I probably know just enough about the subject to be dangerous.

As usual, Wikipedia has a very good entry on the subject.

Your candidates are probably referring to the distinction between 1080 (HDTV) and 2K digital cinema. (They may also be interested in knowing what recording medium – tape or hard disk – you intend to use, although this is generally less critical for their purposes than the camera type.)

In terms of pixel count, the difference between HDTV (1920x1080 pixels) and 2K (2048x1152) is relatively small. But the designs and features of most HDTV cameras derive from the television production world, and 2K cameras are designed more like film cameras. I could be wrong about this, but I believe that it is more common for 1080 cameras to record onto tape, and 2K cameras to record to hard disk. It is almost certainly possible to record either format on either kind of medium.
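To put a number on “relatively small” (using the 16:9 2K figure above; other 2K flavors differ):

```python
# Quick pixel-count comparison for the two rasters mentioned above.
hdtv  = 1920 * 1080   # 2,073,600 pixels
two_k = 2048 * 1152   # 2,359,296 pixels (one common 16:9 "2K" raster)

print(f"2K has about {100 * (two_k / hdtv - 1):.0f}% more pixels than 1080 HDTV")
# -> roughly 14% more, which is why the difference is described as small.
```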

The 2K cameras typically shoot at 24 fps (vs. 30 fps for 1080), which will help when you make your film prints. 2K cameras are also designed to create images with more of a “film” look than 1080: they seem “softer” than video, and have a wider dynamic range.

In short, if your DP comes from the film world, he will almost certainly feel more comfortable shooting with a 2K camera than 1080.

OTOH, I believe that it is cheaper and easier to edit 1080 than 2K: video production houses that can handle 1080 are more common and the programs and hardware needed are somewhat less expensive. Since 2K is mostly used for cinema, it is less common and more expensive.

When it comes to exhibiting your finished film, many conventional multiplex theaters now have digital cinema projectors that are capable of showing a 2K digital cinema package (DCP). I believe that most are also capable of showing 1080 program material. But digital cinema projectors are relatively rare in non-theater auditoriums. AFAIK, no conventional 1080 projectors can show 2K.

So if you’re going to show the film in regular theaters, you could use a 2K or 1080 master (2K will look better), but if you’re going to show it non-theatrically, you’ll definitely need a 1080 master. It’s possible to shoot in one format and create a master in the other without too much difficulty, but if you’re going to exhibit primarily in one format, it makes most sense to shoot with that format.

BTW, are you committed to shooting digitally? It might not be as expensive as you think to shoot film, either 35mm or Super16. And you can still edit and master digitally if you want.

Which do you think looks better/has fewer artifacts, 720P or 1080i? If I did end up printing to film, would 720P be better than 1080i? Thanks.

That’s good to know. I’ll talk to my editor about it, and see what he’s set up for.

Would a conventional 1080 projector be able to show 720p? Or if I’m projecting, then do I want to shoot 1080i?

My primary exhibition format will be DVD (possibly Blu-ray) and theater video projection. Going out to film is a remote possibility. So, from your point of view, 1080i, 30 frames/sec would be the best?

I’ve never shot a digital film before, and I’d like to do one. Plus, I want to avoid telecine costs, and I don’t have time to coordinate all the movement (film stock to set, shot film to telecine house, telecine material to editor).

If you plan to have much in the way of fast movement, the interlacing on 1080i might cause problems. Absent that, I would go 1080i (or 1080p if you can).

You can also ask the DPs who are responding to your ad, explaining just what you told us. I know a few of these kinds of guys and they should be happy to advise you, or even do all the worrying about resolution themselves so you can concentrate on framing and other directing schtuff.

(…can you afford a Red One?)

On its face, it would seem as though 1080i should be better than 720P, but the limited reading I’ve done (including here on the SDMB) suggests that because 1080i is interlaced and 720P is not, the amount of information in the picture, and its apparent resolution, is about the same. But I’ve never done an A-B comparison personally, nor have I read deeply on the subject. 1080p should be noticeably better than either, though. (I haven’t compared it, either.)

In answer to the rest of your questions, I would recommend getting in touch with someone who really knows what he’s talking about, like your editor. I’ve offered what I think I know, based on some reading, attending conference sessions about film and digital cinematography, and having hung around with some cinematographers. Going much further than what I’ve already said would be irresponsible and unhelpful.

Ok, thanks. This thread has been helpful, since now I understand the basics. I’ll work it out with the DPs who’ve responded to my ad and with my editor.

As garygnu mentioned, if you have a lot of fast motion, the interlacing of 1080i might be a factor. Panasonic is the principal proponent of 720P, and they have cameras that will allow variable frame rates if that is an element you need. One thing to consider is that, even though 1080i would appear to have higher resolution in a still frame, it has less resolution temporally. In any given 60th of a second, 1080i only records 540 lines of information while 720P records 720 lines, so for anything that’s moving, a 720P recording actually captures more vertical detail in each instant.
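To put rough numbers on that (nominal rasters only, ignoring compression and sensor details):

```python
# Back-of-envelope pixel arithmetic for the temporal-resolution point above.

px_per_field_1080i = 1920 * 540      # one 540-line field every 1/60 s
px_per_frame_720p  = 1280 * 720      # one full 720-line frame every 1/60 s

per_sec_1080i60 = px_per_field_1080i * 60      # ~62.2 million px/s
per_sec_720p60  = px_per_frame_720p * 60       # ~55.3 million px/s
per_sec_1080p30 = 1920 * 1080 * 30             # ~62.2 million px/s

print(per_sec_1080i60, per_sec_720p60, per_sec_1080p30)
# 720p60 moves slightly fewer raw pixels per second, but each 1/60 s
# snapshot is a full-height picture, which is what matters for motion.
```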

Whatever you do, rent a high-end monitor and have a good low-light viewing area. A waveform monitor and a vectorscope are as essential to capturing good digital video as a light meter is to properly exposing film. Adobe’s OnLocation is excellent monitoring software.

This last statement is only correct for material recorded at 720p @ 50/60 fps. The 720p spec also has a 24 fps rate for movies, in which case it is noticeably lower resolution than 1080i, because 1080i allows for full resolution at 30 or 24 fps. Also, the higher temporal resolution of 720p is most noticeable in scenes with lots of fast motion, like sports. In normal static or slow-moving shots (most movies and TV), 720p does not appear to be as high-res as 1080i.

Well technically, in a given second 1080i60 delivers the same number of pixels as 1080p30, and 720p60 comes in only slightly lower. It’s all just a different way of optimizing roughly the same pixel budget.

Anyway, why does 1080i even continue to exist? I understand it was designed for CRT displays, which work very well with interlaced material. But doesn’t 1080i60 always have to be converted [poorly] to 1080p30 under the hood to work on an LCD or other modern technology? Or are there projectors/TVs that can do the full ‘interlacing’ thing just like a CRT? Is it just that 1080i has momentum, and 1080p was never properly included in the HD scheme of things?

Or… here’s my other idea… 1080i isn’t really interlaced anymore, and consecutive fields come from the same point in time, so it ends up without artifacts after deinterlacing. I.e., 1080i made to be just like 1080p?

Because TV is still transmitted over the air to receivers with antennas.

My understanding is that interlacing doesn’t have as much to do with CRTs (at least not modern ones) as it does with transmission rates. As you say (and Wikipedia agrees), interlacing was originally invented to overcome limitations in the scanning rate of early picture tubes. But I think that by the time TV transmissions were becoming common in the U.S., at least, interlacing’s main benefit was that it narrowed the bandwidth required to transmit a TV channel.

And that appears to have been a factor in the development of HD standards, too, since as Wiki points out,

It goes on to say,

So it seems we are stuck with the 1080i/720p split because, before it was obvious that ultimately everyone would have broadband in their homes, offices, and mobile phones, standards were needed that would fit within the bandwidths that TV broadcasters were using.

Just as newspapers seem to be falling into obsolescence, TV broadcast stations may one day fade away as the EM spectrum is used mostly for two-way net traffic instead of one-way broadcasts.

Well obviously 1080i60 needs half the bandwidth of 1080p60. But I’m asking why i60 instead of p30. I think the deal originally was that CRTs handle interlaced signals very naturally, and i60 is a very clever optimization over p30 for them. LCDs, however, simply can’t do interlaced. They have to convert i60 to p30, losing the framerate advantage and even introducing artifacts, because you have to essentially meld two images from different points in time into one.

At least that’s how it seems to work, but I have the feeling there’s something more going on. Either i60 isn’t interlaced the way it’s “supposed” to be anymore and it’s actually more like p30 split across two fields (i.e., when you put the images back together they just fit, with no trickery). Or someone’s thought of a way to let LCDs do true interlacing so that the interlacing optimization is as effective for them as it was for CRTs.
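As far as I understand it, the deinterlacer in the display has two basic options, which is where the artifacts (or lack of them, for segmented-frame material) come from. A minimal sketch, purely to illustrate the idea:

```python
# Minimal sketch of the two basic deinterlacing strategies. Each "field"
# is just a list of scanlines here; real deinterlacers are far smarter.

def weave(top_field, bottom_field):
    """Interleave two fields into one full-height frame.
    Clean if both fields were captured at the same instant (the
    "p30 split across two fields" case); shows combing artifacts
    if the scene moved between the two fields."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

def bob(field):
    """Double each field to full height on its own.
    Keeps the 60 Hz motion of true interlaced material, but halves
    the vertical resolution (real displays interpolate rather than
    simply repeating lines)."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame
```

If I understand it right, 1080i material that was really shot progressive weaves back together cleanly, while true interlaced material forces the display to choose between those two compromises.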

Anyway, if the problem hasn’t been solved then it’d be pretty fn dumb to record anything in true 1080i at this point, I would think.