HDR for games... way more important than 4K

There are some games, especially last-gen ones, that run at native 4K on the Pro, and even some newer ones do too, including Rise of the Tomb Raider. The rest usually run anywhere from 1080p to 1800p, depending on the title.

They do use sparse/checkerboard rendering to get there, and yes, for many games "4K" is exactly that: quote-unquote "4K". But I think the main thing they're trying to achieve is better image quality on a 4K panel, and they definitely do that. Even an effective 1440p render with checkerboarding on a 4K panel will look better than 1080p output.
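
To put rough numbers on it, here's a back-of-the-envelope pixel-count comparison (a sketch only; checkerboarding shades roughly half of the target grid each frame and reconstructs the rest, and real engines vary):

```python
# Rough pixel counts per frame (illustrative; actual techniques vary by engine).
def pixels(w, h):
    return w * h

native_1080p    = pixels(1920, 1080)        # base PS4 target
render_1440p    = pixels(2560, 1440)        # common Pro internal resolution
checkerboard_4k = pixels(3840, 2160) // 2   # ~half the 4K grid shaded per frame
native_4k       = pixels(3840, 2160)        # true native 4K

print(f"1080p:           {native_1080p:>10,}")
print(f"1440p:           {render_1440p:>10,}")
print(f"4K checkerboard: {checkerboard_4k:>10,} (~{checkerboard_4k / native_1080p:.1f}x 1080p)")
print(f"4K native:       {native_4k:>10,} (~{native_4k / native_1080p:.1f}x 1080p)")
```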

It won’t look as pristine and detailed as native 4K of course, but it will look better than the base PS4 output.

So, is HDR tech strictly a question of getting blacker blacks, controlling light bleeding between pixels?

The desaturated brown & gray in AAA games will look great!

I can see how it might be used to great effect in VR although I don’t know if current/near-future headsets have HDR capability.

Should have added a smiley.

Some of us have just been through too many hardware wars, and I was pointing out the absurdly small stakes of some.

No–it’s about brighter brights. You’re probably thinking that’s the same thing, so let me explain.

An HDR display requires two things: higher bit depth (at least 10 bits per component, ideally 12), and a higher peak brightness (3x at least, ideally 10x). You might be thinking–my display is already bright enough. It doesn’t need to be 10x as bright. But that’s only true of the average brightness, not the peak.

Most sets are designed to output a comfortable white level. Have a scene in a white room, and it looks pretty normal–neither blinding nor too dim.

The problem is that any colors are necessarily going to be dimmer than the room, because white is the combination of RGB, and so if you just have one component, then you’re losing the brightness of the others. Blue is the worst in this regard; perceptually, it’s only like 6% of the total brightness contribution of white.
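
As a rough illustration, using the standard Rec.709 luma coefficients (my assumption about which weighting is meant here), pure blue works out to about 7% of white's perceived brightness, which is the same ballpark as the figure above:

```python
# Relative luminance from linear RGB, using the Rec.709 / sRGB coefficients.
def rel_luminance(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

white = rel_luminance(1.0, 1.0, 1.0)   # 1.0 by definition
red   = rel_luminance(1.0, 0.0, 0.0)   # ~0.21
green = rel_luminance(0.0, 1.0, 0.0)   # ~0.72
blue  = rel_luminance(0.0, 0.0, 1.0)   # ~0.07

print(f"blue is only {blue / white:.0%} as bright as white")
```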

Sometimes this isn’t a problem. A blue painted object is necessarily going to be dimmer than white surroundings. But it is a problem for anything that emits light, like a neon sign.

The answer to this is to allow color components that are brighter than their contribution to white. You make full white be (say) 10% of the possible brightness, but allow saturated colors to go higher. Or even white itself, as long as you limit it to small portions that would be dazzling in real life, like a flashlight.
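
To make that concrete with some illustrative numbers (100 nits for "paper white" and a 1,000-nit panel are assumptions here, not any particular spec):

```python
# Illustrative HDR headroom: diffuse white vs. peak highlights, in nits.
PEAK_NITS   = 1000.0   # assumed panel peak brightness; many sets are lower
PAPER_WHITE = 100.0    # comfortable diffuse white, roughly the SDR reference level

print(f"Diffuse white uses {PAPER_WHITE / PEAK_NITS:.0%} of the available range")
print(f"Highlights (sun glints, neon, flashlights) get {PEAK_NITS / PAPER_WHITE:.0f}x headroom above it")
```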

Of course, if you've made white sit at the 10% level, then you'll have a very dim screen unless you put in a very bright backlight. So that's what they do. Running the whole backlight that bright would be wasteful, though, since most of the scene won't need it, so ideally the backlight is segmented so that most zones can stay at normal levels.

You also need a higher bit depth, since you’re using a smaller portion of your range most of the time. If you stuck with 8 bits, then compressing to 10% means only 25 levels of brightness, which is not nearly enough. So you add a few bits to compensate.
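
The arithmetic behind that, in a naive linear view (real HDR signals use the PQ curve, which spreads code values non-linearly, so treat this as a sketch):

```python
# Code values available below diffuse white if white sits at ~10% of the range.
# (Naive linear view; PQ encoding allocates steps non-linearly in practice.)
for bits in (8, 10, 12):
    total = 2 ** bits
    below_white = int(total * 0.10)
    print(f"{bits}-bit: {total:>5} codes total, ~{below_white:>3} below diffuse white")
# 8-bit:   256 codes, ~ 25 below white  -> visible banding
# 10-bit: 1024 codes, ~102 below white
# 12-bit: 4096 codes, ~409 below white
```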

Back in low-dynamic-range land, designers have a choice for what to do about bright, colored objects: they can reduce the brightness (reducing their impact), or they can desaturate. If you take a solid blue object and add some partial red/green, it will get brighter and still be bluish. But it won't be the vibrant blue you wanted; it's more washed out. So you can have either dim saturated colors or bright desaturated colors. Both look pretty crappy. HDR lets you have bright and saturated at once.
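
Here's a tiny sketch of that trade-off in linear RGB, with made-up numbers for an emissive blue that "wants" to be three times brighter than SDR white:

```python
# An emissive blue brighter than SDR white (linear light, 1.0 = SDR white).
# Numbers are purely illustrative.
target = (0.2, 0.4, 3.0)

# Option 1: scale down until it fits -> correct hue, but dim.
scale = 1.0 / max(target)
dim_saturated = tuple(c * scale for c in target)          # ~(0.07, 0.13, 1.0)

# Option 2: clip each channel at 1.0 -> brighter, but the ratios collapse toward white.
bright_desaturated = tuple(min(c, 1.0) for c in target)   # (0.2, 0.4, 1.0)

print("dim but saturated:    ", dim_saturated)
print("bright but washed out:", bright_desaturated)
print("HDR: just show        ", target, "(needs the extra headroom)")
```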

As an aside, you might be thinking that with all this light available, it could actually blind people if you turned it on all at once. And that’s true–so the various HDR standards have limits on average brightness. You wouldn’t want some commercial turning on the full brightness to catch your attention. It could damage both the set and your eyes.
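
This is also roughly what the MaxCLL / MaxFALL fields in HDR10 static metadata describe: the brightest single pixel versus the brightest frame average the content will ask for. A toy illustration with made-up values (tiny four-pixel "frames" just to show the difference):

```python
# Toy illustration of peak vs. frame-average light levels, in nits (made-up data).
frames = [
    [100, 120, 90, 110],     # ordinary scene
    [100, 100, 4000, 120],   # one small, very bright highlight
]

max_cll  = max(max(f) for f in frames)           # brightest single pixel:  4000
max_fall = max(sum(f) / len(f) for f in frames)  # brightest frame average: 1080

print("MaxCLL (peak pixel):    ", max_cll)
print("MaxFALL (frame average):", max_fall)
```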

You are correct, but there is more to it than that as well. A display that can handle HDR can also do "wide gamut" color: it can display saturation levels that weren't possible on older displays. Technically, HDR and wide gamut are two separate things, but in the consumer world they're being bundled together as 'HDR'.

For those that care about the details: normal HD video uses the rec709 color space, while HDR 4K uses rec2020. You can see the difference in the ranges of color possible here:

Current displays can’t show the full rec2020 color space, but they still show considerably more than rec709.
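
For reference, these are the published primary chromaticities of the two standards in CIE 1931 xy coordinates; the rec2020 triangle covers roughly 1.9x the area of the rec709 one, though xy area is only a crude proxy for how different they actually look:

```python
# CIE 1931 xy chromaticities of the R, G, B primaries for each standard.
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(pts):
    """Shoelace formula for the area of the gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC709), triangle_area(REC2020)
print(f"rec709 area:  {a709:.3f}")
print(f"rec2020 area: {a2020:.3f} (~{a2020 / a709:.1f}x larger)")
```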

That’s not clear to me at all, though granted I don’t follow every development in the consumer space. However, looking just at Sony’s TVs, they have their X800/X900 displays, which advertise HDR and wide-gamut separately, but also their X700 lines which has HDR but (apparently) not an extended gamut.

Of course marketers thrive on confusion and invent all kinds of crazy names (like “X-tended Dynamic Range” in Sony’s case), and you never quite know what you’re getting there. But technologically speaking, HDR is distinct from wide gamut. The former requires a high bit depth and high brightness. The latter requires a different set of primary colors (whether based on LCD color filters, OLEDs, or otherwise).

You are correct again; however, the current consumer implementations of HDR (HDR10, Dolby Vision, and Hybrid Log-Gamma) all use the rec2020 color space, which is inherently wide gamut. So all TVs marked as 'HDR' can accept and "display" a wide-gamut signal, but how much wider the actual gamut is than a standard HD TV varies considerably; it depends on the panel specs. Again, Rtings.com does a very good breakdown rating various models, and anyone considering buying a 4K HDR TV should do some research.

tl;dr: It's a mess, but in practice they're combined; it's just that some are wider than others. And yes, the thread title really should be "HDR / Wide Gamut way more important than 4K for games".

A friend of mine, who is super extra into all this stuff, went out and bought a high-end 4K TV and a PS4 Pro. She very enthusiastically got some of her gaming friends (including me) over to try it out and I didn’t have the heart to tell her I wasn’t seeing any real improvement over a “standard” 1080p HD TV.

So in that case it's possible that the game she was showing you was 4K but not HDR, OR that she bought a 4K TV that doesn't have proper wide gamut support, or it's not set up correctly.* Seeing an HDR game on a proper setup is literally blindingly obvious (the whites on glows and explosions are three times brighter than "paper white", which is the max a normal 1080p TV can do).

Resogun is a $10 indie game and it makes the best use of HDR of anything I've seen so far, so it's a good test case to see if your setup is correct.

* Things which can go wrong:

  • You need an HDMI cable that can carry the full HDMI 2.0 bandwidth (usually sold as "Premium High Speed") for 4K HDR; lesser cables will silently fall back to standard 4K. They give you one with the PS4 Pro, but she might have just used the one she already had; frustratingly, it looks the same.
  • You may need to enable an option in your TV menus. On mine you have to enable "enhanced HDMI", which was NOT on by default.
  • And even worse, only two of the three HDMI ports accept HDR input. This is in the manual, but how many people read those?

On any of the HDR TVs rated 7.5 or better by rtings.com, it's a big, big difference.

I see what you’re saying but I think it’s all getting far, far too complicated for the average punter.

Basically, the average person walks into an electronics retailer and sees that a 4K TV looks pretty awesome (and they do), but when you start going on about needing special HDMI cables and meeting certain technical standards, their eyes glaze over.

Short version: From what I’ve seen, PS4 Pro + 4K TV isn’t a particularly mind-blowing combination.

I don’t have any opinion on all this stuff. I’m sure they’re all nice TVs. But I can’t help pointing out that nothing in this thread is “important” for gaming.

I’m about to dive into Stardew Valley. Last night I was playing Titan Quest Anniversary Edition. Later on, I’m going to finally wrap up the Main Quest in Skyrim. I’m also fiddling around with Fallen London and Regency Solitaire. I also have a Cities: Skylines game I want to get back to.

All fine games. 4K and HDR and 70" screens - these are not things that are "important" for gaming. Nice to have, certainly, but not in any way necessary.

It really depends on what sort of games you like. For some people, hi-res and large screens are very important.

You are absolutely correct, and as someone that works in graphics/video, it's rather frustrating that the industry has botched this so badly. The big marketing push has been "4K", which really doesn't make that much of a difference, while HDR / wide gamut was treated as an "oh yeah, we also got this" when it's actually much more noticeable on a proper setup.

Especially because in many cases, 4K is actually “too good”, for want of a better term - you can sometimes see makeup, obvious sets/props, things like that which weren’t apparent before.

To be fair, this is a temporary problem, mostly seen on older TV series that were later remastered and re-released. In theory, anything that had a theatrical release should already have been held to a standard where such things were fixed. Nowadays the post-production process happens at higher resolution throughout the workflow, so issues like this are spotted and corrected before broadcast or screening.