So I was in the doctor’s office the other day and they were playing Miracle on 34th Street (the '94 version) on what appeared to be a 40" LCD TV. In terms of picture quality, it looked much crappier than I remember the movie looking. The motion seemed a teensy bit faster than normal, which made the movie look like a cheap, made-for-TV production. That was strange, as I didn’t remember it looking like that when I watched it years ago.
In the waiting rooms they had small, older CRT TVs playing the same thing, and on those screens the movie looked, well, more movie-like, with the motion speed seeming “correct.” So what was it about the movie playing on the LCD TV that made it look so fake, like a soap opera or a made-for-TV movie? Is this what the 60 Hz vs. 120 Hz argument is about?
I think so. IME, TVs with 120/240 Hz processing and frame interpolation tend to make film look like video, just as you describe. It’s great for sports and gaming, but for watching movies from a film source I turn it off.
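Roughly what the interpolation does, as I understand it: the set synthesizes new in-between frames from pairs of real film frames. Here’s a naive sketch in Python (purely illustrative; real sets use motion-compensated algorithms, not a simple blend):

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Blend two frames: t=0 gives frame_a, t=1 gives frame_b."""
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Two tiny fake grayscale "frames" standing in for consecutive film frames.
a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 200, dtype=np.uint8)

# A 120 Hz set with interpolation on inserts synthesized frames between
# the real ones -- which is why motion looks smoother (and more
# video-like) than the 24 fps the director shot.
midpoint = interpolate(a, b, 0.5)
print(midpoint)  # halfway between the two frames
```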
This all has to do with the fact that, due to the way it’s been marketed and implemented, high-definition video “feels” like lower quality to most people. The hi-def picture you’re complaining about has more frames per second than the standard TV shows we’re used to seeing. It’s actually higher quality than regular “low-def” cameras provide. We’re just not used to it.
I hope this stigma goes away soon (that hi-def is only appropriate for sports, soap operas, and porn). It seems silly to deliberately hobble a medium just because it’s not what we’re used to.
Hi-def usually refers to the resolution of the frame, not the frame rate.
A standard-definition video source will look pretty bad on an HD screen, especially a large one (one of the many reasons I make fun of so-called “HD” consoles). That’s probably why the video looked bad to the OP. It was probably a lower-resolution video, possibly even stretched out by an incorrect aspect-ratio conversion (see the arithmetic sketch below).
But, yes, the de-judder and frame-interpolation features of some TVs can make film look more like a soap opera. I don’t like the effect either.
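On the aspect-ratio point, here’s some back-of-envelope arithmetic (a toy Python sketch; the 720×480 and 1920×1080 figures are just typical examples, and it ignores non-square SD pixels) showing why stretching 4:3 to fill a 16:9 panel distorts the image:

```python
# Toy illustration: stretching a 4:3 SD frame to fill a 16:9 panel
# uses unequal horizontal and vertical scale factors.

sd_w, sd_h = 720, 480          # typical SD frame
panel_w, panel_h = 1920, 1080  # typical HD panel

scale_x = panel_w / sd_w  # ~2.67x
scale_y = panel_h / sd_h  # 2.25x
print(f"horizontal stretch: {scale_x:.2f}x, vertical: {scale_y:.2f}x")

# Unequal factors -> circles become ovals, faces look wide.
# A correct 4:3 presentation would pillarbox instead:
correct_w = int(panel_h * 4 / 3)  # 1440 px wide, black bars on the sides
print(f"correct 4:3 width on this panel: {correct_w} px")
```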
No, it’s definitely the refresh rate. I recently bought a new TV with the higher refresh rate (LG TruMotion), and even when watching an older movie through Netflix (The Ghost and the Darkness), there was a ‘soap opera’ feel to it. I turned it off and it works fine now.
Agree with the above, it’s the refresh rate. I noticed it when I got my 120 Hz LCD TV about six months ago. Movies, whether on Blu-ray or just broadcast HD, looked eerie and “unreal.” I especially noticed that a lot of them looked like soap operas (which, IIRC, were shot on videotape, not film, and at a different frame rate).
That said, after a month or two of noticing this oddity, I’ve stopped noticing it at all. So for me at least, the refresh rate took a bit of getting used to, but now everything looks fine and dandy.
I guess I don’t understand why 120 Hz would make the picture seem faster/faker. Movies are shot at 24 fps, right? So when they’re played on 60 Hz TVs, some frames have to be held for more refresh cycles than others (the 3:2 pulldown), resulting in a slightly uneven, stuttering look. Why can’t a 120 Hz TV just play each frame 5 times and not have it look soap-opera-y?
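To make the arithmetic concrete, here’s a toy sketch (Python, purely illustrative; no real TV works this way internally) of the two repeat patterns:

```python
# How 24 fps film frames map onto 60 Hz vs. 120 Hz refresh cycles.

def pulldown_schedule(num_film_frames, cadence):
    """Return, per film frame, how many refresh cycles it is held for."""
    return [cadence[i % len(cadence)] for i in range(num_film_frames)]

film_frames = 24  # one second of film

# 60 Hz: classic 3:2 pulldown -- frames alternate between being shown
# for 2 and 3 refresh cycles, so motion is slightly uneven (judder).
sixty = pulldown_schedule(film_frames, cadence=[2, 3])
assert sum(sixty) == 60

# 120 Hz: every frame can be held for exactly 5 cycles -- a perfectly
# even cadence, with no pulldown judder and no interpolation needed.
one_twenty = pulldown_schedule(film_frames, cadence=[5])
assert sum(one_twenty) == 120

print("60 Hz repeats: ", sixty[:8], "...")
print("120 Hz repeats:", one_twenty[:8], "...")
```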
And it was so extremely obvious to me that I’m not sure I could ever stop noticing it. Do 120 Hz TVs typically have a way to disable the higher refresh rate (if that’s indeed the culprit)?