HDR for games... way more important than 4K

I just managed to get my PS4 Pro (they’ve been on back order where I am) and hook it up to my Bravia x800d 4K HDR TV.

So some of the games I have support 4K but not HDR, e.g. Shadow of Mordor and Abzu. They’re very pretty but not mind-blowing. Then you start up something with well-done HDR, like Resogun, the Last of Us Remastered DLC, or Infamous Second Son, and it’s jaw-dropping. The way bright colors bleed and overlap into other areas and seem to wrap around objects is unreal (or rather, it’s closer to what actually happens in our eyes).

Unless you have a massive 70 inch plus TV you’re hard pressed to see the difference between 1080p and 4K at typical couch distance, but HDR vs normal is very very obvious on all screen sizes. Definitely check it out if you have a chance to see a good setup somewhere.

(And for those who don’t know, HDR is high dynamic range: brightness levels much higher than a normal display, more detail in dark areas, and the ability to display certain colors that just weren’t possible before, i.e. wide gamut color.)

Yeah, I really don’t see the point of 4K for a typical TV room. I have a 65" TV and view it from around 9’. I’m probably not going to see a difference with 4K, and I feel the TV size is fine; I don’t want more screen in front of me. There is also the issue of a lack of content, but that will change. The ability of your eye to see the increased resolution won’t, though.
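
For what it's worth, here's the rough back-of-the-envelope check I'm basing that on, just a sketch using the commonly quoted ~60 pixels per degree for 20/20 visual acuity (a rule of thumb, not gospel) and my own screen size and distance:

```cpp
#include <cmath>
#include <cstdio>

// Pixels per degree a 16:9 TV delivers at a given distance, compared against
// the roughly 60 px/deg usually quoted for 20/20 vision.
double pixelsPerDegree(double diagonalInches, double distanceFeet, int horizontalPixels) {
    const double kPi = 3.14159265358979;
    double widthInches = diagonalInches * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    double distanceInches = distanceFeet * 12.0;
    // Horizontal angle the screen subtends, in degrees.
    double fovDegrees = 2.0 * std::atan(widthInches / (2.0 * distanceInches)) * 180.0 / kPi;
    return horizontalPixels / fovDegrees;
}

int main() {
    // 65" TV viewed from 9 feet.
    std::printf("1080p: %.0f px/deg\n", pixelsPerDegree(65, 9, 1920));  // ~65 px/deg
    std::printf("4K:    %.0f px/deg\n", pixelsPerDegree(65, 9, 3840));  // ~131 px/deg
    // 1080p is already at or above the ~60 px/deg acuity figure at this distance,
    // so most of the extra 4K pixels fall below what the eye can resolve.
}
```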

This is not entirely correct. Take a piece of A4 / letter sized paper with text printed at 300 dpi, 600 dpi, and 1200 dpi. You can tell that the 600 dpi text looks nicer than the 300 dpi text, and the 1200 dpi text looks nicer still, but you can’t spot the pixel difference. It’s the same between HD and 4K.

About how much more computationally expensive is HDR vs. ordinary rendering? And between two displays of similar characteristics, one HDR and one not, what will the price premium (in $ or %) be for the HDR one?

Is it the same? If you can see the difference between 300 and 1200 dpi then you are seeing the additional resolution, even if you are not conscious of the dots. Does a 65" 4K TV actually look different at 9’ to a 1080p one when you control for other factors?

Anyway, that wasn’t the point of the part I quoted. The 4K content will improve over time but our eyes will not, so if we can’t see the difference then it is pointless buying one, ever. If however we can see the difference but there is limited content, it is not necessarily pointless buying one, though it’d be better to wait for prices to come down and content to improve.

I have a 55" HD TV myself (Panasonic VT series) and have seen a 4K TV in a shop displaying full non-HDR 4K content - played from a USB stick, not reduced-quality broadcast crap - and the difference is clear. And impressive. I’ve also seen 4K content that’s been compressed to heck and back, and it’s not as good as HD.

Absolutely the source makes a difference, but at least with games rendering in 4K, compression is not a problem. I did a few more tests with Shadow of Mordor (which is 4K but not HDR) at couch distance on my 55 inch TV: I can see the difference in detail in the mountains if they’re static and I look closely. When things are moving fast or the camera is moving? Pretty hard to spot.

Toggling HDR on and off on Infamous Second Son, on the other hand, is night and day.

One of the things the industry seems to be messing up with HDR is that there is no solid spec for what wide color gamut and brightness levels a TV needs in order to call itself HDR, so a lot of the cheaper ones with an HDR badge are not much of an improvement. When I was researching an HDR TV to buy, I found the Rtings.com site by far the best for checking whether a TV was “really” HDR or not.

You’re still going to see a significant improvement in detail and sharpness on a 4K TV running native content, unless you have a very small 4K TV or are sitting way back from it.

Especially if you’re comparing 1080p to 4K content. But you’re right that it doesn’t take native 4K to get an improvement over 1080p. The PS4 Pro is clever in that way, rendering most games above 1080p but not at native 4K resolution. The bump to 1440p or higher is enough to improve the quality of the image over 1080p on a 4K TV. It’s just not going to be as detailed and sharp as native.

HDR is pretty awesome though. It’s not computationally intensive at all, and it’s easy to implement in games to boot. IIRC games are already HDR internally; you just need to do proper tone mapping to support the external APIs.

Can’t wait for HDR monitors to hit, as I probably won’t be in the market for a new HDR TV for a while.

Be careful: what is on the box isn’t necessarily what is actually being displayed.

Shadow of Mordor on PS4 Pro, for example, is NOT native 4K. It uses dynamic resolution scaling, going from as low as 60% of 4K up to 90% at times depending on what’s happening on screen (see the sketch below for the general idea). In addition, many games on the PS4 Pro will be using sparse (checkerboard) rendering, which will introduce some artifacts and softness to the frame as well. It adds up. But the end result should still be much better looking than a 1080p image upscaled to 4K.
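
For anyone wondering what dynamic resolution scaling actually does, here's a rough sketch of the general idea. This is not Monolith's or Sony's actual code; the 60-90% clamp just mirrors the range mentioned above:

```cpp
#include <algorithm>
#include <cmath>

// Sketch of dynamic resolution scaling: adjust next frame's render scale based on
// how the last frame's GPU time compared to the budget, clamped to a range like
// the ~60-90% of 4K reported for Shadow of Mordor.
struct DynamicResolution {
    double scale = 0.9;                       // fraction of output resolution, per axis
    static constexpr double kMinScale = 0.6;
    static constexpr double kMaxScale = 0.9;

    void update(double lastFrameMs, double budgetMs) {  // e.g. budgetMs = 33.3 for 30 fps
        double error = budgetMs / lastFrameMs;           // >1 = headroom, <1 = over budget
        // GPU cost scales roughly with pixel count (scale^2), so adjust the
        // per-axis scale by the square root of the error.
        scale = std::clamp(scale * std::sqrt(error), kMinScale, kMaxScale);
    }

    int renderWidth(int outputWidth) const {
        return static_cast<int>(outputWidth * scale);
    }
};

// Usage: after a 40 ms frame against a 33.3 ms budget the scale drops, and the
// next frame renders at renderWidth(3840) across (height scales the same way)
// before being upscaled or checkerboard-resolved to the full 4K output.
```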
And yes, HDR has now become a meaningless badge. HDR is a function of supporting the incoming signal, sure, BUT most importantly it’s a function of the panel. You need a panel capable of outputting the full color spectrum of the spec plus the luminosity.

Basically the only thing that comes close to true HDR10 is a VERY EXPENSIVE OLED panel. If you don’t have that, you’re not getting full HDR.

Panels of lesser quality will give you diminishing returns. Cheap HDR TVs are essentially NOT really HDR.

You can get HDR TVs now for around $650 that cover an impressive amount of the wide gamut, where you can really see a difference. If you check rtings.com you’ll see that the Samsung KS8000 and the Sony Bravia x800d/x850d both do pretty well in their wide gamut coverage.

List here:
https://www.reddit.com/r/PS4/comments/5258ac/a_compiled_list_of_4k_uhd_tvs_that_support_hdr_10/

Oof, I forgot that input lag is also an issue with HDR on TVs. Be careful and do your research before purchasing.

Happy to see better support at lower prices, but you’re still not getting full HDR10 with some of these panels. You’ll see a difference, but not anything like the full spec.

You were the guy with the oscilloscope, back in the day, proving that Beta had 220 lines of resolution while VHS could only manage 195, weren’t you?

Pretty sure if you were to compare a $650 HDR TV panel with a $5,000 OLED, you’re going to notice the difference.

So no, we’re not talking about 25 lines of resolution. Whether the difference is worth the price, or whether the $650 set provides a good enough experience, is not what I’m discussing, and it’s a very subjective thing anyway.

Not much, in principle. In fact it could be less.

Games have been doing all of their internal rendering in floating point for years now (usually 16 bits per component). And they try to use a somewhat physical lighting model which can lead to very bright objects (like the sun).

Since LDR displays are the norm, games then apply a “tone mapping” pass which reduces the HDR input data to something that can be displayed. These passes can be complicated–they have to account for exposure (dark cave vs. outdoors), contrast (don’t want to let a single bright object blow out your entire contrast range), bloom/glare effects, and even subtle physiological effects like how it tends to look bluish at night.

Some of these things don’t need to happen if you can just display floating-point data directly, and some (like exposure) are simplified. So in principle, HDR rendering could be less costly than LDR, though in practice the difference is probably small.

I’ve seen a couple of HDR displays and they do look amazing. You kinda forget how much saturation you lose with current tech. Normally, bright objects tend to just get blown out to white. This doesn’t just reduce detail, but color saturation as well–that orange and blue flame just looks like a white blob. With HDR, you can see proper colors no matter how bright the object.
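
If it helps make the tone mapping / blown-out-to-white point concrete, here's a minimal sketch (plain Reinhard with an exposure term; real engines run this in a shader with fancier filmic/ACES-style curves, bloom, eye adaptation, and so on):

```cpp
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };   // linear HDR values, can exceed 1.0

// Exposure-adjusted Reinhard curve: maps [0, inf) into [0, 1) for an LDR display.
Color tonemapReinhard(Color c, float exposure) {
    auto curve = [exposure](float x) {
        x *= exposure;
        return x / (1.0f + x);
    };
    return { curve(c.r), curve(c.g), curve(c.b) };
}

// Naive clamp for comparison: this is what "blowing out to white" looks like.
Color clampLdr(Color c) {
    return { std::min(c.r, 1.0f), std::min(c.g, 1.0f), std::min(c.b, 1.0f) };
}

int main() {
    Color flame{4.0f, 2.0f, 0.5f};                  // a bright orange flame, linear HDR
    Color clamped = clampLdr(flame);                // -> (1.00, 1.00, 1.00): a white blob
    Color mapped  = tonemapReinhard(flame, 1.0f);   // -> (0.80, 0.67, 0.33): still orange
    std::printf("clamped: %.2f %.2f %.2f\n", clamped.r, clamped.g, clamped.b);
    std::printf("mapped:  %.2f %.2f %.2f\n", mapped.r, mapped.g, mapped.b);
    // For an HDR10 output you would instead encode the wide-range values with the
    // PQ (ST 2084) transfer curve rather than squashing everything into [0, 1].
}
```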

Yes 4K is awesome, in a shop, when you are standing next to the TV. The question is how does that translate to your living room at normal viewing distance? And how much of the awesomeness is because 4K TVs, being flagship models, have better specs all round?

To really see if it is worth buying one, you need to compare 4K content and non upscaled 1080p content on the same TV at typical viewing distance.

Absolutely you will, but the point I was trying to make in this thread is that right now, with consumer 4K HDR TVs in the $600-$1000 price range, HDR looks amazing for games and is a bigger deal than 4K.

Kinthalis, I suspect you work in video… I also do. I go to the NAB trade show in Vegas every year and I’ve seen $100,000+ displays many times. I still think my $700 Sony Bravia looks pretty good.

Please. I was looking at it at a proper viewing distance.

My Panasonic VT was a flagship model in its day.

If 1080p content were not upscaled on a 4K display you’d get a little picture in the middle of the screen.

Ok, I meant upscaled but without interpolating pixels (i.e. each 1080p pixel simply becomes a 2×2 block of identical 4K pixels). All I’m getting at is that to find out whether you can see the difference in resolution, you need to control for factors other than resolution (rather than comparing a flagship model from however many years ago to a flagship model from today). In other words, you need two TVs with the same specs aside from resolution. I was just thinking that you could use the same TV if you could prevent it from interpolating a 1080p picture to make it look artificially better than 1080p.

If you can genuinely see a difference on a 65" TV at around 9’, then cool, 4K really would be worth it for me. I’m just not convinced by “I saw one in a shop and it looked great” for obvious reasons (and I’ve seen them in the shop too).

The only info I’ve found on the net is either hopelessly non-scientific or entirely theoretical.

It’s quite possible that our next TV will be 4K, but that will most likely be because 1080p TVs are no longer available in the size and feature set that I want. It will be like getting a smartphone that comes with a camera when you don’t want a camera.

Graphically intensive games on the PS4 Pro aren’t really rendering in 4K. Usually not even close. They’re just using fancy upscaling techniques to produce a 4K output image, but you could do the same thing to an Atari 2600 if you wanted to. Maybe there are some low-intensity indie pixel-art type games that actually render in 4K, but certainly no AAA titles.

4K is a really dumb thing for consoles to chase at this point anyway. It’s a marketing gimmick, trying to cash in on the “4K” term. They’d be much better served using the same power to render a lower-res image like 1080p at a better frame rate with higher quality settings. 1080p/60/ultra is a way better target than 4K/30/low, especially at typical TV viewing distances where the added detail might be minimally observable.
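
Just to put numbers on that trade-off (raw pixel throughput only; the per-pixel shading cost is where the ultra-vs-low part comes in):

```cpp
#include <cstdio>

int main() {
    // Raw pixel throughput of the two targets. Ignores per-pixel shading cost,
    // which is exactly what gets cut when a game chases 4K.
    long long px1080at60 = 1920LL * 1080 * 60;   // ~124 million pixels/s
    long long px4kat30   = 3840LL * 2160 * 30;   // ~249 million pixels/s
    std::printf("1080p@60: %lld px/s\n", px1080at60);
    std::printf("4K@30:    %lld px/s\n", px4kat30);
    // 4K@30 still pushes twice the pixels of 1080p@60, which is why the
    // quality settings usually have to drop to make it fit.
}
```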

I’ll take your word for it, I haven’t really spent time to check out the various new displays. But this sounds like good news.

I was worried good HDR would be out of my price range for a long while :slight_smile: