HDTV looks like video

We watched To Have And Have Not again last night, on the new HDTV. The new DVR has a functioning HDMI port (unlike the previous one, which had a faulty port). On the big screen (i.e., projected film) you see grain, which is what makes film look better than video (IMO). The 70-year-old film we watched last night had no detectable grain. It looked like it was shot on B&W video.

A friend is of the opinion that film has a much higher resolution than video, and that HD video just shows off that superior resolution. That may or may not be true nowadays, but the one thing film does have is grain. Slow stock has finer grain and fast stock has coarser grain, but there’s still grain. Given that THAHN (in this instance) was shot on film, why was there no grain in the image, making it look like video?

It is possible to remove grain when digitizing old films. If the film was in particularly bad condition when it was transferred, it may have been filtered in such a way that the visible grain was minimized.

However, since you said it is a new HDTV and that the image looked like video, I suspect your TV may have come with motion interpolation turned on. This leads to the so-called Soap Opera Effect, in which a movie originally shot at 24 fps is displayed at 60 Hz or more using frames interpolated by the TV. For a variety of reasons, this ruins the “look” of many movies and TV shows and causes people to complain that they look like they were filmed on a cheap camcorder.
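If it helps to picture where those extra frames come from, here’s a toy sketch in Python with made-up frame data. Real TVs estimate motion vectors rather than just cross-fading between frames, so treat this as the rough idea only:

```python
# Toy sketch of frame interpolation: turning 24 fps source frames into
# 60 Hz output by blending the two nearest source frames. Real TVs use
# motion-vector estimation rather than a plain cross-fade; this only
# shows where the extra frames come from. Frame data here is made up.
import numpy as np

SOURCE_FPS = 24
OUTPUT_HZ = 60

def interpolate(frames):
    """frames: list of 2-D numpy arrays captured at 24 fps."""
    n_out = int(len(frames) / SOURCE_FPS * OUTPUT_HZ)
    out = []
    for k in range(n_out):
        t = k * SOURCE_FPS / OUTPUT_HZ        # position on the 24 fps timeline
        i = int(t)
        j = min(i + 1, len(frames) - 1)
        frac = t - i
        out.append((1 - frac) * frames[i] + frac * frames[j])  # linear blend
    return out

fake_second = [np.random.rand(8, 8) for _ in range(24)]   # one second of fake frames
print(len(interpolate(fake_second)))                      # -> 60
```

Every output frame that doesn’t land exactly on a source frame is a synthetic in-between, which is exactly what gives the too-smooth “video” motion.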

This seems likely. I’ve tried altering the motion interpolation, and I did get a more cinematic picture when we watched the first movie on it (Alien, FWIW). I can never seem to find that option again though.

Thanks for the explanation. It answers the ‘why’ to the ‘what’ I kinda/sorta knew.

It’s normally called something like “motion enhancement” or motion smoothing or something obscure in the menus. Basic rule: turn off any frame interpolation. If you have a 120 Hz TV and a Blu-ray player you can make it more authentic too by disabling pulldown, if it doesn’t do so automatically.

Upon further reflection, it’s quite possible that the motion interpolation is responsible for removing the grain as well, since the random noise due to the grain will be uncorrelated between frames and the interpolation may result in it being averaged out.
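To put a rough number on that: grain that is uncorrelated from frame to frame partially cancels when frames are blended. A quick sketch, with random noise standing in for grain:

```python
# Uncorrelated "grain" partially cancels when two frames are blended:
# averaging two independent noise fields cuts the standard deviation
# by about 1/sqrt(2) (~0.71x), i.e. visibly less grain.
import numpy as np

rng = np.random.default_rng(0)
grain_a = rng.normal(0.0, 1.0, size=(1080, 1920))   # grain on frame A
grain_b = rng.normal(0.0, 1.0, size=(1080, 1920))   # independent grain on frame B

blended = 0.5 * (grain_a + grain_b)                  # roughly what interpolation does
print(round(grain_a.std(), 3))    # ~1.0
print(round(blended.std(), 3))    # ~0.707
```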

HD is nowhere near film resolution - assuming we’re talking about Hollywood-grade film in good condition.

When film is digitized for post-processing and effects work, as nearly all movie content these days is (shoot on film, digitize, post, and print back to film for theaters that aren’t yet digital), it’s done at resolutions about 4 to 16 times higher than current HD.

What’s called 2K is almost the same as HD and is considered basic or quick resolution these days. 4K is four times HD in pixel count; 8K, used for most Hollywood productions, is 16x; and 16K is coming into more common use. For older films, the optimum transfer resolution is around 8K. For newer films shot on large-frame film, 16K is necessary for maximum resolution.
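For what it’s worth, those multipliers are just pixel-count arithmetic. Assuming the DCI scanning widths (2048/4096/8192) here; consumer UHD (3840/7680) lands on exactly 4x and 16x:

```python
# Pixel-count arithmetic behind the "4x / 16x HD" figures, assuming the
# DCI widths 2048/4096/8192; consumer UHD (3840/7680) gives exactly
# 4x and 16x the pixels of 1080p HD.
resolutions = {
    "HD (1080p)": (1920, 1080),
    "2K (DCI)":   (2048, 1080),
    "4K (DCI)":   (4096, 2160),
    "8K":         (8192, 4320),
}

hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:11s} {pixels / 1e6:5.1f} Mpx  {pixels / hd_pixels:4.1f}x HD")
```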

So the lack of grain in your film might have been from filtering, or it might be because pristine B&W films have almost no visible grain - you are thinking of older prints, 16mm prints and television viewing. A first-generation print from a 1940s film would probably not have discernible grain on a normal theater screen. (At least, not in daylight or full-light studio scenes. Darker scenes used faster film, which will show some grain, the same way digitally-shot night scenes show speckling and artifacts now.)

It’s called something different on every TV.

Google [Your TV Make] + Soap Opera Effect and from there you should be able to find the name of it. Either you’ll find a forum where someone’s complaining about it on their TV or a website that lists what it’s called on all the different brands.

Given how many people hate it and how few movies it actually helps, I’m surprised it’s on by default. I almost returned my TV when I got it because of that, until I found out what it was. Watching Modern Family just about made me sick.

It should be noted that when a movie is converted for display on a TV, the frame rate has to be changed anyway, since TVs don’t run at 24 fps. Even older TVs (in the U.S.) used a 60 Hz refresh rate divided into two interlaced fields per frame, so movies were converted for display at 30 fps (this is mentioned in the article you link to). With 120 Hz TVs this is a bit simpler, since all you need to do is display each film frame 5 times in a row. But since such TVs are not universal yet, video is still transmitted at the older rates (30 fps, shown 4 times each for 120 Hz display), hence the circuitry inside the TV to interpolate frames, which does add yet another layer of processing.
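Here’s a rough sketch of the two cadences described above, ignoring the NTSC 1000/1001 rate adjustment:

```python
# 3:2 pulldown spreads 24 film frames per second over 60 interlaced
# fields, while a 120 Hz panel can simply show each film frame 5 times.
# (The NTSC 1000/1001 rate adjustment is ignored here.)

def pulldown_3_2(film_frames):
    """Map one second of film frames onto interlaced fields (3:2 cadence)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))  # alternate 3 fields, 2 fields
    return fields

def repeat_for_120hz(film_frames):
    """120 / 24 = 5, so each film frame is simply displayed 5 times."""
    return [f for f in film_frames for _ in range(5)]

one_second = list(range(24))                  # stand-ins for 24 film frames
print(len(pulldown_3_2(one_second)))          # -> 60 fields
print(len(repeat_for_120hz(one_second)))      # -> 120 refreshes
```

The 3:2 cadence is why 24 fps film judders slightly on 60 Hz displays, while the even 5:5 repetition on a 120 Hz panel preserves the original motion without interpolation.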

Yeah, you’re right, I overlooked the portion where the OP mentioned a DVR. Blu-ray players support native 24 Hz playback, though, and most of my knowledge of this sort of stuff comes from setting up my Blu-ray player with my HDTV.

In terms of the difference between the ‘film look’ and the ‘video look’ (or ‘soap opera effect’), the single most significant deciding factor is not film grain, nor resolution, nor lighting contrast; the biggest factor is frame rate. Because of a quirk in technological advancements, standards decisions and human visual physiology, so-called ‘motion blur’ disappears somewhere between 24fps and 30fps. And it is this lack of motion blur which blurs the line (no pun intended) between fantasy and reality, between fiction and fact, between a movie and a documentary.

It’s worth adding that there is a disparity between what’s perceived as motion-image quality and what is higher quality on a technical plane. Even as a reasonably informed amateur filmmaker, I clung to the notion that 24fps was inherently superior for a long time. I think a lot of people, including those who should know better, hold onto that notion.

24fps produces a specific kind of blurring when there is motion. 30fps produces far less. There is no question that higher frame rates (up to 48 and 60 fps) produce more and more detailed recordings, with less and less blur even during fast motion, and are thus technically superior to the blur of film frame rates.
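One way to quantify that blur difference: assuming a conventional 180-degree shutter (an assumption for the sake of illustration), each frame’s exposure time is (180/360)/fps, so higher frame rates smear less motion into each frame:

```python
# Per-frame exposure time with a conventional 180-degree shutter:
# exposure = (shutter_angle / 360) / fps. Higher frame rates expose
# each frame for less time, so less motion is smeared into it.

def exposure_ms(fps, shutter_angle=180.0):
    return (shutter_angle / 360.0) / fps * 1000.0

for fps in (24, 30, 48, 60):
    print(f"{fps} fps -> {exposure_ms(fps):.1f} ms per frame")
# 24 fps -> 20.8 ms; 60 fps -> 8.3 ms: roughly 2.5x less blur per frame.
```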

But we have been conditioned, mostly because of the former resolution difference between film and video, to see 24fps as the superior image, and 30fps as cheap, bright and grainy. We look for the blur as a hallmark of quality. Now that video can match or exceed film’s resolution, we’re going to have to learn that sharp and bright is the real quality, and let filmmakers dial it up and down as they do any other aspect of film technology.

I was thinking while watching The Hobbit at 48fps that the technology has its place, but films might benefit from selective use. Just as filmmakers adjust 3D depth (and whether to use 3D at all), or switch from regular widescreen to IMAX (as in the Nolan Batman films), they could shoot at 48fps but dial most scenes down to 24, going to 48 when needed to enhance sharpness, effects, motion, etc. It’s another dimension of audience perception to manipulate.

Imagine the new Star Trek films, for example, where all the ordinary interiors are 24fps, but the space scenes get kicked to the hyper-reality of 48fps. (Think how sharp the lens flares would be! :) )