I thought it was a little simpler and less collusion-y: studios used to sell their new release videos for high prices (like a hundred bucks) for the first few weeks, and then dramatically drop the price later when they wanted to sell to the general public.
I feel that way myself, even though I’m only a few years under 50. That’s why I mostly rent Blu-rays from Netflix, which are HD, and also why I hate that their names for the disc side of the business are “DVD Netflix” and “DVD.com”. Not smart marketing/branding.
[quote=“bump, post:60, topic:815950”]
I’m a little over fifty and watch movies on a 46" set from about ten feet. Honestly, I don’t see much difference between DVD and Blu-ray quality.
[/quote]
I’d just like to point out that, as an Olde Farte, youngsters have a right (nay, an obligation!) to laugh at us. It’s how the world works.
So go ahead, ya whippersnappers, make fun of me loading mp3s onto my iPod Mini and watching Harvey on VHS. At least I’m not wearing my hipster scarf and lumberjack shirt and skinny wool pants when it’s 80° out…
That makes sense. According to this TV engineer who helped develop HDTV, at nine feet away from the screen you need a 69” TV to see the detail found in Blu-ray or any other 1080p picture:
https://hdguru.com/lechner-distance-the-number-you-need-to-know-before-buying-an-hdtv/
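For the curious, that 69"-at-nine-feet figure falls straight out of the usual 20/20 acuity rule (a viewer resolves detail down to about one arcminute of visual angle). A quick sketch, assuming a 16:9 screen; the function name is just illustrative:

```python
import math

# One assumption drives everything: a viewer with 20/20 vision resolves
# detail down to about one arcminute (1/60 degree) of visual angle.
ARCMIN = math.radians(1 / 60)

def lechner_distance_ft(diagonal_in, rows=1080, aspect=16 / 9):
    """Farthest distance (feet) at which one pixel row still spans a
    full arcminute -- beyond this, the extra resolution goes unseen."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # 16:9 screen height
    pixel_in = height_in / rows                           # height of one pixel
    return pixel_in / math.tan(ARCMIN) / 12               # inches -> feet

print(round(lechner_distance_ft(69), 1))  # ~9.0 ft for a 69" 1080p set
print(round(lechner_distance_ft(46), 1))  # ~6.0 ft for the 46" set above
```

By the same math, the 46" set in question would only show full 1080p detail from about six feet, closer than the ten-foot couch distance mentioned above.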
…and we in turn get to laugh at them for drinking the Kool-Aid, every time, without fail… whether it’s Blu-ray, iPhones, streaming music, etc. etc. they always go for it, hook, line, and sinker. Just as if they were programmed.
[quote=“SlackerInc, post:65, topic:815950”]
I’m strongly tempted to take that as a reason to upgrade to a newer, bigger set.
That makes sense. According to this TV engineer who helped develop HDTV, at nine feet away from the screen you need a 69” TV to see the detail found in Blu-ray or any other 1080p picture:
https://hdguru.com/lechner-distance-the-number-you-need-to-know-before-buying-an-hdtv/
[/quote]
I’ve always found this to be a stupid argument. And I’ve worked in video for 27 years. Having “too much” resolution is a feature, not a problem. Printers print at insanely high pixel densities for best quality. Photography the same. Can you see each pixel? Who cares!
> I’ve always found this to be a stupid argument. And I’ve worked in video for 27 years. Having “too much” resolution is a feature, not a problem. Printers print at insanely high pixel densities for best quality. Photography the same. Can you see each pixel? Who cares!
I think the point is that, at normal viewing distance and a 46" screen, you’re not going to be able to see a fly’s balls from your couch so why pay extra for a TV that shows the fly’s balls? It may be “a feature” but it’s a feature you’re not actually getting any benefit from for the cost.
> I think the point is that, at normal viewing distance and a 46" screen, you’re not going to be able to see a fly’s balls from your couch so why pay extra for a TV that shows the fly’s balls? It may be “a feature” but it’s a feature you’re not actually getting any benefit from for the cost.
Because all those pixels still help make the picture look better. Look up anti-aliasing. Mobile phones have resolutions from 300-450ish ppi. There’s no way in hell you can see all those pixels, but I never hear stupid arguments about there being too many of them when it comes to phones, just pseudo-sciency arguments about televisions having too many pixels. It’s a crock.
I think where the rubber hits the road on the resolution argument is whether people can reliably see a difference (one they prefer) in blind “taste tests”. Squeegee, can you point to any data of this kind to back up your argument?
> …and we in turn get to laugh at them for drinking the Kool-Aid, every time, without fail… whether it’s Blu-ray, iPhones, streaming music, etc. etc. they always go for it, hook, line, and sinker. Just as if they were programmed.
I love my Blu-ray player, iPhone and streaming music app (Apple Music)!
> Squeegee, can you point to any data of this kind to back up your argument?
Cite. Every unit in that chart would need a Lechner distance of something like 1 foot. Yet people like them better than HD.
And seriously, the 9 foot thing is the real crock. I move around my viewing room with the television on. Why shouldn’t I get more detail as I get closer? That’s how non-video-life works, why shouldn’t it be that way for video?
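That “something like 1 foot” claim checks out under the same one-arcminute acuity rule, applied per pixel of density rather than per screen; a quick sketch (function name mine):

```python
import math

ARCMIN = math.radians(1 / 60)  # 20/20 acuity: ~one arcminute per pixel

def phone_lechner_in(ppi):
    """Distance (inches) at which a pixel of the given density subtends
    one arcminute; closer than this, pixels are (in theory) resolvable."""
    return (1 / ppi) / math.tan(ARCMIN)

# The 300-450 ppi densities mentioned above all work out to roughly
# a foot or less -- far closer than anyone holds a phone for video:
for ppi in (300, 326, 450):
    print(ppi, "ppi ->", round(phone_lechner_in(ppi), 1), "in")
```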
> Because all those pixels still help make the picture look better. Look up anti-aliasing.
I’m aware of anti-aliasing. I’m also aware, from personal experience, that there are diminishing returns from a viewing perspective. Phone screens aren’t really a comparison since people often have their nose in their screen, unlike how people usually watch the family television.
But hey, if it’s all bullshit then the companies actually making these televisions would want to let us know, right? So we can all buy 4K televisions from them and enjoy their glorious images?
> How close to the TV must I sit to appreciate 4K?
> The short answer is that between 5 and 6 ft. is the ideal viewing distance for a 55” or 65” Sony 4K Ultra HD TV. However, on a 55”, you can now sit as close as 3.6 ft. and enjoy a visibly smoother and more detailed picture (e.g. you won’t see the individual pixels). On a 65” TV, you can sit as close as 4.2 ft. to appreciate 4K.
> Since the pixels on a 4K Ultra HD TV are four times smaller than on an HDTV, it becomes harder to see them. In fact, you can sit closer to a 65” Sony 4K Ultra HDTV than you can to a 35” HDTV. When you sit closer to the screen, you feel like you are immersed in the action. You get the same visual impact as you would sitting in the best seat at a movie theater.
That’s taken from Sony’s FAQ regarding one of their 4K televisions.
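Notably, Sony’s 3.6 ft and 4.2 ft figures are consistent with the same one-arcminute acuity arithmetic behind the Lechner distance, just with 2160 rows instead of 1080. A sketch assuming a 16:9 panel and 20/20 vision (function name mine):

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute of visual angle (20/20 vision)

def pixel_blend_distance_ft(diagonal_in, rows):
    """Closest distance (feet) at which individual pixels stop being
    resolvable on a 16:9 panel of the given diagonal and row count."""
    height_in = diagonal_in / math.sqrt(1 + (16 / 9) ** 2)
    return (height_in / rows) / math.tan(ARCMIN) / 12

print(round(pixel_blend_distance_ft(55, 2160), 1))  # ~3.6 ft (Sony: 3.6)
print(round(pixel_blend_distance_ft(65, 2160), 1))  # ~4.2 ft (Sony: 4.2)
```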
> Cite. Every unit in that chart would need a Lechner distance of something like 1 foot. Yet people like them better than HD.
Yeah, that doesn’t remotely back you up. People naturally assume that bigger numbers are better and most new televisions are 4K these days so you’re likely to buy one regardless of whether or not you’re using it to its full potential.
> Nonetheless, for the overwhelming majority of consumers with a home television viewing distance of six feet or greater, the benefits of increased screen resolution will be recognized when viewing 4K content on a display 55 inches or larger.
> […]
> On a 50-inch 1080p HD display, most consumers can begin to distinguish individual pixels only when standing within six feet of the screen. Therefore if your viewing distance is 10 feet or greater, an Ultra HD 50-inch display will likely have little perceived benefit in terms of image clarity and sharpness – which can be attributed directly to the increase in pixel count. On the other hand, if you work in a field such as graphic design and sit approximately two feet from your 32-inch display, even at a screen size of just 32 inches, the benefits of 4K resolution will likely be noticeable as a result of the narrow viewing distance.
Again, at typical viewing distances the extra resolution adds little to an average-sized television, but it’s still useful for nose-in-the-screen applications or viewing.
> in a blind test hosted by enthusiast website HDTVTest.co.uk, a far-from-gigantic 55-inch HD TV was pitched against a 55-inch 4K TV, and 48 out of 49 people correctly identified the Ultra HD TV from its picture quality alone from a perfectly normal viewing distance of nine feet.
“Television enthusiast blog hosts experiment at high end television showroom” sounds less like “empirical evidence” and more like a chance to sell televisions. Given that this actual scientific paper (PDF) found that people identified UHD content 55% of the time at a viewing distance of nine feet, a 98% score in the TV showroom seems rather suspicious. Doesn’t have to be intentionally rigged; could just be a poorly done experiment that missed something.
> Under these conditions and over this set of sequences, on average, on 54.8% of the sequences (17 out of 31), 4K UHD resolution content could be identified as being sharper compared to its HD down and upsampled alternative. The probabilities in which 4K UHD could be differentiated from downscaled/upscaled HD range from 83.3% for the easiest to assess sequence down to 39.7% for the most difficult sequence. Although significance tests demonstrate there is a positive sharpness difference from camera quality 4K UHD content compared to the HD downscaled/upscaled variations, it is very content dependent and all circumstances have been chosen in favor of the 4K UHD representation.
If you don’t like 4k (or HD for that matter), don’t get it. But why waste breath telling someone else they shouldn’t like something, because Science! ? :dubious:
“Seriously, that thing you like? You shouldn’t like because it’s too capable!”
I don’t own a 4k TV, yet I think this is elitist nonsense. It was a tired argument 10 years ago. Let it go.
My mistake above: the scientific paper actually states that their viewing distance was 1.5x the height of the screen, so considerably closer than 9’. Amusingly, I mistook a section where they reference the TV blog’s test and say it was at nine feet as part of the test’s methodology.
> If you don’t like 4k (or HD for that matter), don’t get it. But why waste breath telling someone else they shouldn’t like something, because Science! ? :dubious:
I don’t care what you buy. As I mentioned above, it’s largely a moot point with new TV purchases anyway because 4K is becoming the standard (if it isn’t already). But saying “Science is wrong because nuh-UH!!!” doesn’t do anyone any good.
> I don’t care what you buy. As I mentioned above, it’s largely a moot point with new TV purchases anyway because 4K is becoming the standard (if it isn’t already). But saying “Science is wrong because nuh-UH!!!” doesn’t do anyone any good.
It’s not science. How people perceive what they see is not science, it’s perception, and is not entirely understood. Using sciency arguments to tell people what they’re perceiving isn’t science.