We Need More HDR In Our PC Games

Edited By xanadu

I recently bought a new 4K HDR TV and I have been having a lot of fun enjoying all that #content! However, one thing I have become alarmingly aware of is the almost non-existent HDR support for PC games. With my GTX 970 showing its age and tempting me to pick up a 1080 rather than wait for the new "Volta" cards in early 2018, the lack of HDR support is a huge bummer. Even in the brief time I spent testing my 970 at 4K, I could easily tell that HDR is the real reason to upgrade an older TV. 4K looks nice and crisp and makes anti-aliasing obsolete, but HDR is a stunning feature that showcases light and color in a way I've never seen before. I'm running a lot of these games at 1440p with HDR turned on and thoroughly enjoying these new visual experiences.


The shadows in Destiny 2 are so dark that the TV looks like it's off in areas shaded from the sun. The camera flashes in Hitman's fashion-show level are as stunning as they are disorienting. Under certain conditions, like nighttime and heavy rainstorms, Forza 7 looks like a window into reality. I could type up countless examples and metaphors to try to explain how HDR affects visuals in games, but like virtual reality, it's something you really have to see for yourself. But herein lies the problem with HDR gaming on PC: it's just not getting the same support we see on consoles. Right now there are a whopping 24 games that support HDR on PC. Don't worry though, the extremely popular title Chess Ultra has HDR support...


Earlier in November, Assassin's Creed Origins saw a title update for 4K and HDR. The initial announcement implied the update would be coming to all platforms, but it quickly became apparent that "all platforms" meant Xbox One and PS4. Wolfenstein 2 has come and gone with no HDR support and no explanation as to why from the developer (that I could find). Major games like Assassin's Creed and Wolfenstein missing what I would consider key next-gen features is frustrating to someone who prefers to play on PC. The lack of HDR runs counter to the philosophies of PC gaming. With the PS4 Pro and even the mighty Xbox One X, you have to make choices: 4K or frame rate? You usually have to choose which of these features you want on consoles. The PC lets us skip that choice and have the best of both worlds. I am certainly not a game developer, so I have no idea what it takes to add HDR to a game, but I wish more developers were taking HDR seriously on the PC. With Assassin's Creed this year (a game I would otherwise have skipped if not for the critical reception), I have to choose between playing on PC at 1440p/60fps or with HDR but at 30fps on consoles. That is a weird decision to have to make. Wolfenstein at least runs at 60fps on all consoles, but the only way to experience the game in 4K HDR is on the Xbox One X.

I understand HDR is still relatively new and we should see stronger support in the future. But right now it's disappointing, and a somewhat complicated issue. At the moment, all we have to look forward to for HDR on PC are ports of current console games: Final Fantasy XV and Injustice 2. At this point I would be happy even with more patches for older games that already have HDR support on consoles. Until stronger support arrives on PC, I guess I'll just stick to Destiny 2 and Forza 7 for my HDR needs.

Also, if you want to get super drunk: take a drink every time I said HDR in this blog.

deactivated-5d1d502761653

This PC Gamer article explains why HDR on PC is basically not a thing yet. It will probably take another two years before the main challenges holding it back have been addressed and you see more HDR-capable monitors at non-crazy price points.

Until then, I'd recommend checking out ReShade, which allows for some noticeable improvements to graphical fidelity.

Doom, default and with ReShade (comparison screenshots)

#2 Edited By mORTEN81

Agreed.

And in many cases when games do actually support HDR, it somehow becomes borked after a new video card driver release or a Windows Creators Update or something. I won't even try to guess how many hours I've spent trying to get Andromeda to work with HDR before I finally caved and downloaded an older driver that was said to be working. The list of games is bafflingly short and growing very slowly.

The Windows implementation is stupid too, as you're required to enable HDR in the settings every time, unless you're OK with your desktop looking like grey, washed-out goo the rest of the time.

That's not even taking into account HDR10 versus Dolby Vision and the different Windows settings: RGB limited or RGB full; 8-bit, 10-bit, or 12-bit; YCbCr 4:2:0, 4:2:2, or 4:4:4, the last of which isn't even supported at 4K/60 with current HDMI cables. It's about as user-unfriendly as it gets and only reopens that age-old, tired debate about PC gaming being complicated. It used to be, then it wasn't, and now it is again, at least if you want the best possible result.
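As a rough illustration of the arithmetic behind that 4:4:4 limitation, here is a back-of-the-envelope Python sketch. The ~14.4 Gbit/s effective HDMI 2.0 data rate and the pixel-only math (which ignores blanking intervals) are simplifying assumptions, not exact figures.

```python
# Rough, pixel-only bandwidth estimate for 4K/60 video signals.
# Assumption: HDMI 2.0 carries roughly 14.4 Gbit/s of actual video data
# (18 Gbit/s raw minus encoding overhead). Blanking intervals are ignored,
# so real requirements are somewhat higher than shown here.

HDMI_2_0_DATA_RATE_GBPS = 14.4

# Average number of sampled channels per pixel for each chroma subsampling mode:
# 4:4:4 keeps all three channels, 4:2:2 averages two, 4:2:0 averages 1.5.
CHANNELS_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def required_gbps(width, height, fps, bits_per_channel, subsampling):
    """Approximate video data rate in Gbit/s, ignoring blanking."""
    bits_per_pixel = bits_per_channel * CHANNELS_PER_PIXEL[subsampling]
    return width * height * fps * bits_per_pixel / 1e9

for bits in (8, 10, 12):
    for sub in ("4:4:4", "4:2:2", "4:2:0"):
        rate = required_gbps(3840, 2160, 60, bits, sub)
        verdict = "fits" if rate <= HDMI_2_0_DATA_RATE_GBPS else "does NOT fit"
        print(f"4K60 {sub} {bits}-bit: ~{rate:4.1f} Gbit/s ({verdict})")
```

By this estimate, 4K/60 4:4:4 fits at 8-bit but tips over the limit once you move to the 10-bit depth HDR wants, which is why the 4:2:2 and 4:2:0 modes show up in the first place.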

This stuff needs to be in a state where it just works, without tinkering for half an hour with my TV settings, wondering whether it's the TV, the PC, or the drivers that are messing things up.

Howardian

As a PC player of over 20 years, I dunno what that is and neither do I need it, but I wish you luck!

Efesell

@howardian said:

As a PC player of over 20 years, I dunno what that is and neither do I need it, but I wish you luck!

In here with the 'I don't need an HDTV my TV looks great' memories.

Howardian

@efesell said:
@howardian said:

As a PC player of over 20 years, I dunno what that is and neither do I need it, but I wish you luck!

In here with the 'I don't need an HDTV my TV looks great' memories.

I don't have a TV

Eurobum

@xanadu:

HDR is a meaningless marketing buzzword designed to sell you things that haven't improved much. Of course it's not always an outright lie, and there is tremendous value in realizing that the most devious lies are the sort of slight warping and spin that is kind of true.

Because of the openness of the PC platform, it has never been fast to adopt new standards widely. A PC can merely produce a lot of compute power, if you decide to compromise your sanity, wallet, and electricity bill. Your appeal to elitism and excellence is maybe only relevant to recent PC trends, like ROG gaming brands and similar nonsense. Also, the PC is now run by giant monopolies, which are generally slow to move and quick to overcharge. Even so, the early-adopter pain is real, but so are first-world problems.

Higher resolutions only serve to increase the size of screens, full stop (and they don't make AA obsolete; not even 16K would on a 24" screen). Monitors are smaller, which is why they don't have the segmented backlight arrays of gigantic TVs, which is how LCD TVs can make use of HDR. There are also no OLED monitor panels yet, for one (good) reason or another. Using a TV as a monitor just wasn't meant to be, because of input lag, HDMI bandwidth limits, and generally different priorities.

You can turn your PC into a glorified TV set-top box, a "Steam machine" that could put out 4K@60Hz, but that's the wrong dream to chase, and HDR support may just be another straw. If conspicuous consumption is the goal, then a TV plus the latest box is still the way to go, maybe alongside a sneaker collection and a Taco Bell wedding. J/K

To reference reality: in my country, publicly funded TV partially switched to a 1080p signal in 2017, which makes THIS a good year to buy a modestly sized Full HD TV. The 1080p Full HD logo was revealed in 2007, so there is about a decade between marketing promise and reality. Given the technical challenges with OLED, ten years for widespread something-like-HDR support is a rather bold estimate, even though LG and Samsung are probably eager to sell panels that burn out after 3-5 years instead of 10-15. What is the MTTF on those OLEDs, anyway?

mikewhy

Wow, this thread really went off the rails when it really shouldn't have.

Nvidia cards have supported HDR since the 900 series (2014), but it looks like AMD and Intel have only started to support it recently.

I wish it was in more PC games, and was disappointed when Ubisoft changed their wording and didn't deliver HDR on PC in Origins.

I recently upgraded to a 4K/HDR TV. While my shitty eyes can't really make use of the extra resolution, boy do those colours ever pop.

OurSin_360

I just hope they fix it so it works properly more often. On PC I can only get it at certain resolutions, while others shrink the picture to half size if I'm not at 4K. I haven't updated to the HDR-focused Windows 10 update yet, though, but I heard that made it even worse.

Also, HDR isn't a meaningless buzzword lol, it's just that some cheaper sets are not actually HDR but HDR-compatible, meaning they can receive the signal but not display the full brightness and color.

TuxedoCruise

Until it becomes so ubiquitous that it's a standard feature in any monitor or TV you buy, I'm mostly apathetic about HDR.

Yes, I have experienced it in person, and it does make a noticeable difference. But it doesn't improve the image or enhance my gameplay experience dramatically enough to make me demand it in every PC game. It's not as big a leap as going from SD to HD, from 60Hz to 144Hz, or even from 1080p to 4K.

I already have a 4K TV, and I have 1440p 144Hz monitors for PC gaming. At the moment, HDR isn't widespread or mature enough for developers to use it to dramatic enough effect to warrant any hardware upgrades.

applegong

Certainly came a long way from the bloom effects of Half-Life 2: Lost Coast.

#11 Edited By Eurobum
@oursin_360 said:

Also, HDR isn't a meaningless buzzword lol, it's just that some cheaper sets are not actually HDR but HDR-compatible, meaning they can receive the signal but not display the full brightness and color.

Take an example peak brightness of 1000 nits: that would actually make a monitor more eye-straining, especially when you quickly change from bright to dark. It's sort of desirable for a TV, because you'd want it to be viewable in broad daylight, but just like I don't want my eardrums to rip and bleed, I don't need my eyes to have a burnt-in afterimage when someone throws a flashbang in game. Inevitably, all monitors, like phone screens, would have to auto-adjust brightness, thus not actually using the entire brightness range.
LED LCD panels happen to be very bright now; OLEDs not so much. Is this why Sony is pushing HDR? Is this the time to shine for full-array-backlit LED LCD TVs?
I already explained "meaningless"; a "buzzword" is something that people keep repeating without being able or willing to come up with a synonym on the spot. Nothing personal, the ambiguous nature of these ambitious and vague umbrella terms is what it is; nobody knows the future or which standards or economic interests will prevail.

ShaggE

@applegong said:

Certainly came a long way from the bloom effects of Half-Life 2: Lost Coast.

I still can't follow this whole HDR trend without picturing everybody plunking down money for bloom effects. "I just played Serious Sam 2 on my new TV, and everything is solid white from the HDR! It's beautiful!"

I get my amusement where I can, I guess.

cikame

@howardian said:

As a PC player of over 20 years, I dunno what that is and neither do I need it, but I wish you luck!

Same. I don't want to keep noticing it: "oh look, there's HDR, it made that one bit look maybe better, but over there I think it makes it darker or somehow worse", and after a while I'd probably turn it off for an overall more balanced image. But that's speaking as someone who hasn't seen it in action yet; I'm the kind of guy who turns off various effects like ambient occlusion because I notice it and I don't want to.

xanadu

Well, I had no idea people were so averse to something as simple as HDR... you're all acting like you're gonna get eye cancer from looking at a TV with HDR. I guess I shouldn't be surprised by the internet...

MeierTheRed

@xanadu said:

Well, I had no idea people were so averse to something as simple as HDR... you're all acting like you're gonna get eye cancer from looking at a TV with HDR. I guess I shouldn't be surprised by the internet...

You really didn't need the last section of that comment. It's a valid opinion; I personally don't find HDR that mind-blowing. It does look nice to some extent, but it's not something that I have to have in all of my games.

xanadu

@meierthered said:
@xanadu said:

Well, I had no idea people were so averse to something as simple as HDR... you're all acting like you're gonna get eye cancer from looking at a TV with HDR. I guess I shouldn't be surprised by the internet...

You really didn't need the last section of that comment. It's a valid opinion; I personally don't find HDR that mind-blowing. It does look nice to some extent, but it's not something that I have to have in all of my games.

Sarcasm has failed to translate through the internet again.

mORTEN81

That's all subjective. To me it makes a ton of difference in certain situations - not so much in others. I want to be blown away when there's an explosion in my single player action game or squint when I walk out of a dark cave and the sun hits my eyes. It adds to the immersion for me. Not so much if it's competitive multiplayer, for example.

I don't understand why a wider range of colors, with more brightness, is considered unnecessary or a buzzword by some. Nobody's saying that it always has to burn out your eyes, just that it's possible to use those wider ranges if content creators deem it necessary. Now, you might not think it's ready for the mass market, or that it's too expensive or complicated, but that's a different discussion. Right now it's definitely too complicated on PC, so I'd recommend that most people wait.

Eurobum

@morten81: Please correct me if I'm wrong.

So, higher dynamic range extends the usual 8-bit range of 256 shades of grey or brightness to 10-bit, or 1024 shades. This is an accuracy improvement, especially for pictures and film; for games it primarily means even bigger textures and even more bandwidth spent. I don't know the relative cost or the performance hit this generates (4x?), but I do know that Nvidia was first to use aggressive color compression to mitigate the quickly ballooning memory bandwidth requirements in the previous generation, meaning video game colors are compressed down anyway.
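As a quick illustration of that 8-bit versus 10-bit arithmetic, here is a minimal Python sketch. The 4K frame size is just an example for scale, not a measurement of any particular game's memory use.

```python
# Shades per channel and raw (uncompressed) frame size at different bit depths.
# Purely illustrative arithmetic; real GPUs compress framebuffer and texture
# data, so actual memory and bandwidth costs differ.

WIDTH, HEIGHT = 3840, 2160  # example 4K frame

for bits_per_channel in (8, 10, 12):
    shades = 2 ** bits_per_channel          # 8-bit -> 256 shades, 10-bit -> 1024
    bits_per_pixel = bits_per_channel * 3   # three color channels (R, G, B)
    frame_mb = WIDTH * HEIGHT * bits_per_pixel / 8 / 1e6
    print(f"{bits_per_channel} bits/channel: {shades} shades, "
          f"{bits_per_pixel}-bit color, ~{frame_mb:.0f} MB per raw 4K frame")
```

On this raw-frame arithmetic, going from 8 to 10 bits per channel is roughly a 25% increase in bits per pixel, quadrupling the shades per channel rather than the data size.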

Just like the resolution hike, this seems like another dumb improvement that comes at unreasonable cost. In some sense it is inevitable that we move to higher resolutions and higher bit depths, but it is merely a scaling up that will take decades, especially given that hardware progress is slowing down. From pictures to movies to video games, games will be the last to adopt it because of the performance cost.

Also, given how ridiculous video games look anyway, color accuracy should matter more for photographs and screenshots. The way monitor pixels persistently glow and change color in a moving picture, they show accurate color for any given pixel for perhaps a couple of milliseconds out of a 16 ms frame and spend most of the time in transition between colors. Consequently, a strobing-backlight monitor should do more for the color accuracy of moving pictures than 10-bit brightness mapping.

FacelessVixen

I like using ENB, ReShade and SweetFX. It makes me feel smart when I get it working. But, it would be nice if more games included a contrast slider.

#20 Edited By mORTEN81

@eurobum: I honestly don't know enough about it to say anything about bandwidth usage. I think I read somewhere it doesn't matter that much (x4), but I could be mistaken.

I've played a few games where it didn't matter from a performance perspective (or I didn't notice), and one that I can remember where it did. I think Hitman ran a bit slower on my machine with HDR on.

But I would argue that it doesn't matter, seeing as you could just turn off HDR if you wanted to.

Now if HDR ever becomes the standard I can definitely see the arguments against it, if it is indeed very resource intensive. But then I would take HDR over, for example, improved graphics, seeing as that's not that important to me at this point. It used to be, but now I think we're seeing diminishing returns on that front. At least the "return" is smaller than the improvements HDR makes, in my opinion anyway.

OurSin_360

HDR shouldn't have any performance impact, as it's all hardware in the monitor/TV. It may increase latency, but that should be it.

OpusOfTheMagnum

@tuxedocruise: Most decent TVs are true HDR; monitors are always slow to catch up.

Eurobum

@oursin_360 said:

HDR shouldn't have any performance impact, as it's all hardware in the monitor/TV. It may increase latency, but that should be it.

Hmm. TVs have various filters that basically stretch 8-bit color onto a 10-bit HDR scale, but that is fakery. Ideally you would want a source that was recorded or rendered with the 1024 shades of grey HDR offers. Most people are aware that the height x width resolution of a display affects performance; even though games are drawn in polygonal 3D, it is the 3D-to-2D projection, or pixel mapping, and the pixel post-processing that require a lot of GPU power. The same should be true for bit depth, which is quite literally the third dimension for a flat monitor. Every pixel consists of at least three color channels, so 8 bits x 3 is how we get 24-bit color; the same goes for 30-bit color, except HDR isn't just 30-bit color. The thing is, people can't distinguish colors all that well anyway, so HDR merely adds a stretched brightness scale, and how this is done is anyone's guess. My hunch is that it happens just by taking 10-bit colors and compressing them a bunch; this way we don't gain much in terms of color accuracy, but the scale covers more of the very dark and very bright. It still makes sense to render a thing in 30-bit color even if it is compressed afterwards. At the same time, HDR really just means more eye strain at that point.
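To illustrate the "stretching" point, here is a tiny Python sketch contrasting an 8-bit source expanded onto a 10-bit scale with a true 10-bit source. The bit-replication expansion shown is one common technique and an assumption here, not necessarily what any given TV or driver actually does.

```python
# An 8-bit channel value "stretched" onto a 10-bit scale by bit replication
# can only ever land on 256 of the 1024 possible 10-bit codes; a source that
# was actually rendered in 10 bits per channel can use all of them.

def expand_8_to_10(value_8bit: int) -> int:
    """Map an 8-bit value (0-255) onto the 10-bit range (0-1023)."""
    return (value_8bit << 2) | (value_8bit >> 6)  # replicate the top two bits

reachable = {expand_8_to_10(v) for v in range(256)}
print(f"10-bit codes reachable from an 8-bit source: {len(reachable)} of 1024")
```

The missing in-between codes are exactly the extra shades a genuine 10-bit render would provide, which is the difference between a stretched signal and a true HDR source.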

You'd have to be a game engine designer, rather than a TV reviewer, to explain how games can use HDR, which is why information is sparse.

nuttyjawa

I recently chose between upgrading the TV in my games room to a 4K/HDR set for my One X or going with a new monitor for my gaming rig.

I went to local TV retailers and HDR really didn't jump out at me. I'm not sure if it's a case of them having terrible blurry demos or what, but I ended up going with the monitor, as I really didn't see much difference.