Went to a friend’s recently and watched some movies. He has a gigantic new top-of-the-line flatscreen TV. We watched Fargo. It looked like it was filmed with a home video camera. We watched Ferris Bueller’s Day Off. It looked like it was filmed with a home video camera. We watched Rushmore. It looked like it was filmed with a home video camera.
This is not the first time I’ve witnessed this. I can remember being at someone’s house where Game of Thrones was on and even THAT somehow looked like it was filmed with a home video camera.
My own TV, an LG 42" flatscreen, doesn’t do this, for whatever reason. But others that I see do.
It has something to do with the frame rate, from what I can discern from reading about this phenomenon elsewhere. But I can’t understand why that is… maybe I’m just dense (not out of the question by any means). Can anyone here break it down for me?
It’s a feature called motion smoothing. It’s intended to make live, fast-moving content like HD sports less blurry. Unfortunately, it has a bad tendency to mess up movies that were shot at 24 fps.
However, I do feel the need to point out that a major factor in the “video camera” look is that video cameras shoot at a higher frame rate than movies do. So part of the effect is people being conditioned to believe that a higher frame rate (which, all else being equal, means a higher-quality picture) is indicative of lower quality.
Perhaps your friend just tinkered with the settings and turned the sharpness up too high? Or did what you saw definitely look like a problem with the refresh rate? Did it make a difference what the video’s frame rate was (24 vs 48 vs 60…)?
If the HD sports were shot with a fast camera at a fast frame rate, wouldn’t they be less blurry to begin with?
What was the source of the videos you were watching? Cable (which company?), DVD, Blu-ray, internet stream?
Generally, what I think you are describing can be a scaling artifact from playing a low-resolution video on a high-resolution TV. The frame-rate issues from the other replies can contribute too…
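To give a rough idea of what a scaling artifact is: the TV has to stretch, say, a 480-line picture across a panel with far more pixels, and a cheap scaler just repeats or crudely blends pixels to do it. Here’s a toy Python sketch of the crudest case, nearest-neighbour scaling of a single row of pixels (my own illustration, not what any particular TV actually does):

def nearest_neighbor_upscale(row, target_width):
    # Upscale one row of pixels by repeating the nearest source pixel.
    # On a real panel this is what turns a low-res source into soft, blocky video.
    src_width = len(row)
    return [row[int(i * src_width / target_width)] for i in range(target_width)]

print(nearest_neighbor_upscale([10, 20, 30], 9))
# -> [10, 10, 10, 20, 20, 20, 30, 30, 30]  (each source pixel becomes a block)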
And as I learned a few years ago when I complained about it at a friend’s house, not all people “see” it. My friend had no idea what I was talking about.
Your friend needs to turn off “Motion Interpolation” on his TV’s setup menu. Don’t know what kind of TV your friend has, but depending on the brand, it may be named something other than “Motion Interpolation.”
On this Sony TV it’s called MotionFlow, and it should be set to Cinema View to effectively turn it off, not to settings like Smooth.
If your TV is 120 Hz and the recording is 24 fps, there are 4 missing frames for every real frame of video. The TV can simply quintuple each image, which looks natural but wastes the potential of your TV. The other option is to run some interpolation algorithm, and some algorithms work better for some purposes than others.
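For anyone who wants those two options spelled out, here’s a small Python sketch (my own illustration; real sets use motion-compensated interpolation, which is far fancier than the simple cross-fade below):

def repeat_frames(frames, factor=5):
    # Fill a 120 Hz display from a 24 fps source by showing each frame 5 times.
    # Motion stays exactly as it was shot; nothing is invented.
    out = []
    for frame in frames:
        out.extend([frame] * factor)
    return out

def blend_interpolate(frames, factor=5):
    # Naive interpolation: synthesize the 4 missing frames between each pair of
    # real frames by cross-fading their pixel values. The set invents in-between
    # images, which is what creates the "shot on video" look.
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for i in range(factor):
            t = i / factor                    # 0.0, 0.2, 0.4, 0.6, 0.8
            out.append((1 - t) * a + t * b)   # weighted mix of the two frames
    out.extend([frames[-1]] * factor)         # last real frame, repeated
    return out

# Tiny demo with one-pixel "frames" so the numbers are easy to read:
src = [0.0, 10.0, 20.0]
print(repeat_frames(src))       # 0.0 five times, 10.0 five times, 20.0 five times
print(blend_interpolate(src))   # 0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, ... then 20.0 five times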
jasg is asking the right question and mentions a very common issue.
Probably not the OP’s situation given the programs, but one situation we have with our cable is that there are SD and HD versions of many channels. Some people tune to the SD one and just flat out don’t see the obvious downgrade issues!
If there’s a DVD/Blu-ray player involved, it could be connected to the TV via a lower-res path than it really should. E.g., good old RCA cables from a DVD player to the TV.
Even with Blu-ray and HDMI cables there are all sorts of ways non-tech people can set things up to downgrade some things.
I am honestly unsure. It would not surprise me in the slightest to learn that cable and satellite TV standards are tethered to the assumption of a 24 fps frame rate. Stupid things like that have a way of getting themselves embedded in standards and getting rid of them can be difficult when you have a number of different technologies in the distribution chain that all need to be updated to accommodate a better standard (in this case, the HD cameras, the infrastructure to get the signal from the game to the provider, the provider’s encoding mechanism and the decoder at the consumer end).
When you have a number of entities that all need to cooperate to improve the technology, it can be difficult to get any of them to make the first (and potentially expensive) step to upgrade if they don’t have any assurance that the rest of the chain will upgrade. It does happen, of course, or else we wouldn’t have HD in the first place, but there often needs to be a clear and compelling consumer demand to make it happen.
In the US, broadcast, cable, and satellite are at 29.97 fps. Yes, it’s stupid, especially when almost everything is shot at a different frame rate: 24 is the standard now, because it retains a “non-video/soap opera” look, even when forced through some kind of tortuous conversion.
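For the curious, that “tortuous conversion” is usually 2:3 pulldown, where each group of four 24 fps film frames is spread over ten interlaced video fields to hit roughly 30 fps (the extra 0.1% slowdown to 29.97 is ignored here). A rough Python sketch of the cadence, purely for illustration:

def two_three_pulldown(film_frames):
    # Map 4 film frames (24 fps) onto 10 video fields (5 interlaced frames at
    # ~30 fps) using the classic 2-3 cadence: A A | B B B | C C | D D D.
    # Some video frames end up mixing fields from two different film frames,
    # which is part of why converted film can judder on video.
    cadence = [2, 3, 2, 3]
    fields = []
    for frame, count in zip(film_frames, cadence * (len(film_frames) // 4 + 1)):
        fields.extend([frame] * count)
    # Pair the fields up into interlaced video frames (top field, bottom field):
    return list(zip(fields[0::2], fields[1::2]))

print(two_three_pulldown(list("ABCD")))
# -> [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]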
I don’t know what frame rate is used for streaming platforms or if it varies.
It varies. YouTube, for example, accepts video uploads at any frame rate—according to their website, “24, 25, 30, 48, 50, 60 frames per second (other frame rates are also acceptable)”—and can presumably stream them to viewers at the same frame rate. The technically difficult thing about streaming is that, when done right, it should adapt the flow of data to the user’s connection, which means automatically lowering the resolution, frame rate, and/or compression for slower connections. So even if you choose to stream a 60fps video from YouTube or some other service, you might end up getting a version of the video at a much lower frame rate.
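To make the “adapt to the user’s connection” part concrete: the service pre-encodes a ladder of renditions and the player keeps picking whichever one fits the bandwidth it’s currently measuring. Here’s a toy Python sketch of that selection step; the ladder and numbers are made up for illustration, and real players (DASH/HLS) also weigh things like buffer level and screen size:

# Hypothetical rendition ladder: (height, fps, bitrate in kbit/s) -- invented numbers.
RENDITIONS = [
    (2160, 60, 20000),
    (1080, 60, 8000),
    (1080, 30, 5000),
    (720,  30, 3000),
    (480,  30, 1500),
    (360,  30, 700),
]

def pick_rendition(measured_kbps, headroom=0.8):
    # Pick the best rendition whose bitrate fits within a fraction of the
    # measured bandwidth; fall back to the lowest rung if nothing fits.
    budget = measured_kbps * headroom
    for height, fps, bitrate in RENDITIONS:   # listed best-first
        if bitrate <= budget:
            return height, fps, bitrate
    return RENDITIONS[-1]

print(pick_rendition(12000))   # (1080, 60, 8000): your 60 fps survives
print(pick_rendition(2500))    # (480, 30, 1500): the frame rate drops to 30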