Watching HD, looked like videotape ... what gives?

I saw the LOTR films in the theater, and have watched them repeatedly on DVD.

Saturday night, I saw part of Return of the King on a large flatscreen (probably in excess of 48") and couldn’t help but notice (as did others present) that parts of the image looked especially artificial. The way I would describe it is that it looked like videotape rather than film. Oddly, it seemed to be the live-action parts of the image more so than the CGI bits.

What causes this? Was some setting on the TV adjusted poorly, or is this an artifact of HDTV compression?

Saw most of I Am Legend afterward, and didn’t notice this effect nearly as much.

Yup. Motion interpolation, aka “the soap opera effect”. It’s a “feature”, not a bug. Feel free to turn it off. (Though it is nice for sporting events.)
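If you’re curious what the TV is actually doing, here’s a minimal Python sketch (function names are mine, and real sets estimate per-block motion vectors rather than naively blending, but the idea is the same: the set invents frames that were never shot):

```python
import numpy as np

def interpolate_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize a frame halfway between two real frames by averaging them."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(np.uint8)

def double_framerate(frames):
    """Turn a 24 fps sequence into a pseudo-48 fps 'soap opera' sequence."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate_midframe(a, b))  # the invented in-between frame
    out.append(frames[-1])
    return out
```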

Yes, it has something to do with the TV’s settings. I don’t have a thread or a cite handy, but it’s been mentioned a few times on the board before (though maybe not in a whole thread dedicated to it). My Sony Bravia does this and I haven’t really been able to tune it away. I just live with it now and pretend I have special live-action screeners of all my favorite shows.

Not all the shows I watch come down like this (I watch AVI or MP4 files of TV shows, not discs or broadcast), but the worst is Parks & Recreation.

Helpful to know, and the sporting events comment explains why it would be enabled on a big-screen TV in a restaurant: that TV probably spends more time tuned to ESPN than to TNT.

Of course, right after this I read a news report about how The Hobbit is going to be released to be projected at 48 fps, creating a look that some describe as “too realistic.”

The preliminary response to several 48fps tests has been less than stellar. My guess is it will be downconverted to ye olde 24fps for exhibition.

If Jackson is to be believed, you need to get used to it – the 10 min demo they did wasn’t enough. I’m wary, but I also despise the nasty stutter & blur you get in panning shots, both film and HD. I’m hoping Hobbit does come out in 48 so we can get a feel for it as audiences. I’m very curious.

What was the program source?

Cable and satellite providers tend to compress the hell out of channels so they can squeeze more into the available bandwidth. Also, for broadcast the (presumably 1080p/24) source has to be converted to 720p/60 or 1080i/30, typically via 3:2 pulldown. This reduces the quality considerably and can cause motion judder.
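To make the judder concrete, here’s a rough Python sketch of the 3:2 pulldown cadence that maps 24 film frames onto 60 interlaced fields per second (illustrative only; real pulldown splits each frame into actual odd/even fields):

```python
def pulldown_3_2(frames):
    """Repeat 24 fps film frames in the classic 3, 2, 3, 2, ... cadence
    so that every second of film (24 frames) fills 60 interlaced fields.
    The uneven repetition is one source of motion judder on pans."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

# 4 film frames become 10 fields, preserving the 24:60 ratio (4 * 2.5 = 10).
assert len(pulldown_3_2(["A", "B", "C", "D"])) == 10
```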

As mentioned before, some TVs also do “creative” things to try to improve the picture.

I thought The Hobbit was done in 48fps so that the 3D would be extremely sharp when you split it in half to 24fps.

I just got a new Samsung 55" slim LED smart TV, and nearly everything I watch on it looks like videotape instead of film, which I actually like. It makes everything look so real and “right in front of me,” compared to regular film, which comes off as odd to me now.

But I agree not everyone is a fan. I had a friend over last week to watch an episode of Modern Family on Hulu +, and he hated the way it looked.

Along with motion processing, another important factor is ‘upconversion’: without it, a standard DVD would fill only about a quarter of the HDTV screen (or less). Hulu and other streamed sources can be worse.

So, since there are too few pixels to fill the screen, upconversion just magnifies the picture, guessing at how to fill in the missing pixels. This can look good… or really bad.
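Here’s a toy Python version of that “guessing” (nearest-neighbor is the crudest possible guess; real scalers use bilinear or bicubic filtering, or fancier edge-adaptive tricks):

```python
import numpy as np

def upscale_nearest(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbor upconversion: each output pixel simply copies the
    closest source pixel. Fast, but blocky on a big screen."""
    in_h, in_w = img.shape[:2]
    rows = np.arange(out_h) * in_h // out_h   # source row for each output row
    cols = np.arange(out_w) * in_w // out_w   # source column for each output column
    return img[rows[:, None], cols]

# e.g. blowing a 480-line DVD frame up to a 1080-line panel:
# big = upscale_nearest(dvd_frame, 1080, 1920)
```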

The best image you can see on an HDTV is probably Blu-ray, but streaming will probably catch up.

I do believe that a source shot at an actual 48+ fps will look “better” than a 24/30 fps source that has extra in-between frames interpolated in.

IMAX HD, the 48 fps variant, is an example. It definitely makes the scene look much more “realistic” than 24 fps film, and it never bothered me. Of course, the IMAX screen, which takes up almost my entire field of vision here in New Jersey, might also affect my experience.

Personally, I hate film judder, so good riddance.

If you were watching a satellite or cable channel in a restaurant or bar - maybe instead of the HD channel they typically use for sports, they were tuned to a standard-definition channel for the movie.

I was shocked by all the people who were dissing the 48 fps showings as “too realistic.” Yes, it looks different, that is because it is BETTER. Do you really want an inferior product just because you are used to it? Get over it.

Based on what criteria?

No judder. :wink:

Anamorphic DVDs are designed to be used on widescreen (including HD) TVs and use the entire screen, assuming you set your player (and I’m talking about a standard DVD player here, with no “upscaling”) to 16:9 and your TV to widescreen (you know, the setting on the TV that most people misuse to stretch regular 4:3 SD broadcasts to fullscreen). It won’t be HD, of course, but it will be 720x480p.
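The arithmetic, as a quick Python sketch (simplified; it ignores the messy details of NTSC active picture width, so the exact pixel aspect ratio quoted elsewhere differs slightly):

```python
# An anamorphic NTSC DVD stores 720x480 pixels; the 16:9 flag tells the
# player to display them with wide, non-square pixels.
stored_w, stored_h = 720, 480
display_w = stored_h * 16 // 9            # ~853 "square" pixels of width
pixel_aspect = (stored_h * 16 / 9) / stored_w
print(display_w, round(pixel_aspect, 3))  # 853 1.185 (each pixel ~19% wider than tall)
```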

I think he’s talking about the fact that it would be 720x480, and that isn’t going to fill up your TV screen, so upscaling is required and it’s done either by your DVD player or your TV.

A higher framerate is better in that you have more information from the scene recorded. You might more reasonably ask by what metric some people judge 24fps to be “better” - it can’t be articulated, because it’s only that its shortcomings are familiar.

This is the same phenomenon that resulted in many people complaining that DVD didn’t look as good as VHS, and that Blu-ray (and HD DVD) looked distractingly “wrong,” “cold,” or artificial. Some people are simply neophobic and get hung up on anything that varies from their prior experience.

I don’t remember anyone complaining that VHS looked better than DVD, but some still claim *today* that vinyl is better than CD, so…

High definition doesn’t specify a frame rate, just a screen resolution (i.e., so many pixels wide by so many high); consequently, still cameras can take HD pictures. And because HD is all-digital, the frame rate can easily be varied at will; many consumer camcorders can shoot in both 30 and 24 fps. Shooting HD at a lower frame rate isn’t really an oxymoron, either, because frame rate controls motion blur. Even though 24 fps technically delivers fewer pixels per second than 30 fps HD, that blur has a more noticeable emotional and aesthetic effect: it’s what we intrinsically associate with film vs. video, i.e., with higher quality.
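A quick back-of-the-envelope in Python, assuming the conventional 180-degree shutter (exposing for half of each frame interval, a filmmaking convention rather than anything the HD spec requires):

```python
# 180-degree shutter rule of thumb: exposure time = half the frame interval.
for fps in (24, 30, 48, 60):
    exposure_ms = 1000 / (2 * fps)
    print(f"{fps:>2} fps -> ~{exposure_ms:.1f} ms of motion blur per frame")
```

Double the frame rate under the same convention and the per-frame blur halves, which is a big part of why higher frame rates read as “video” to us.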

It was something like this - except buddy left off “It’s too crisp!”, “Certain types of scenes show compression artifacts - you can really notice them when it’s paused!”, and “Panning and motion looks artificial and computery!”

I am not worried about a 48fps Hobbit looking “soap opera.” Peter Jackson knows how to light a scene and stick the right people in front of a camera - it’s going to look awesome.