The sad, misleading, and embarrassing state of HDR in PC gaming

HDR is ridiculously bad for PC gaming in most cases, and we all know that to be true.

But that might surprise you if you only consider how gaming monitors are advertised. After all, on paper, HDR is technically supported by your monitor, games, and graphics card. Heck, even Windows supports HDR relatively smoothly these days.

So who’s to blame then? Well, when I dug deep for an answer, I found three main culprits that explain our current situation. Even with light at the end of the tunnel, this multifaceted problem will not go away on its own.

The game problem

I need to start by laying the groundwork for why HDR is a problem in PC gaming in particular. It’s a highly variable experience depending on what type of screen you have and what game you’re playing, making this whole HDR mess even more confusing on PC. A big reason why is static metadata.


There are three main HDR standards: HDR10, HDR10+, and Dolby Vision. The latter two support dynamic metadata, which means they can adjust display information on the fly based on what the monitor is capable of and the scene being shown (even the individual frame currently on screen). HDR10, on the other hand, only has static metadata.
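To make that difference concrete, here’s a rough sketch in Python of the kind of information each approach carries. The field names and figures are illustrative, not pulled from any spec, and real tone mapping is far more involved than this.

```python
from dataclasses import dataclass

# Static metadata (HDR10): one set of values covers the entire title.
# Field names are loosely modeled on HDR10's metadata, for illustration only.
@dataclass
class StaticHDRMetadata:
    max_cll: int         # brightest single pixel anywhere in the content (nits)
    max_fall: int        # brightest average frame anywhere in the content (nits)
    mastering_peak: int  # peak luminance of the mastering display (nits)

# Dynamic metadata (HDR10+ / Dolby Vision): values travel with each scene
# or frame, so the display can re-tone-map dark and bright scenes differently.
@dataclass
class DynamicHDRMetadata:
    scene_peak: int      # brightest pixel in *this* scene (nits)
    scene_average: int   # average luminance of *this* scene (nits)

def tone_map(display_peak: int, content_peak: int) -> float:
    """Toy example: how much the display must compress highlights."""
    return min(1.0, display_peak / content_peak)

# With static metadata, every scene is mapped against one worst-case value...
static = StaticHDRMetadata(max_cll=4000, max_fall=400, mastering_peak=1000)
print(tone_map(display_peak=600, content_peak=static.max_cll))

# ...while dynamic metadata lets the display treat each scene individually.
for scene in (DynamicHDRMetadata(300, 120), DynamicHDRMetadata(4000, 900)):
    print(tone_map(display_peak=600, content_peak=scene.scene_peak))
```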

Dynamic metadata is one of the main reasons console HDR is so much better than PC HDR.

Only a select few monitors support Dolby Vision, such as Apple’s Pro Display XDR, and none of them is a gaming monitor. There are a few HDR10+ monitors out there, but they’re exclusively Samsung’s most expensive displays. The vast majority of monitors work with static metadata. However, TVs and consoles heavily support Dolby Vision, which is one of the main reasons console HDR is so much better than PC HDR.

As former game developer and product manager for Dolby Vision Gaming Alexander Mejia points out, static metadata creates a huge problem for game developers: “There are more HDR TVs, monitors and laptops on the market than ever before, but if you buy a pair at your local big box retailer, your game will look drastically different on each one… How do you know the look you set up in your studio will be the same as what the gamer sees?”


HDR comparison in Devil May Cry 5.

On my Samsung Odyssey G7, for example, Tiny Tina’s Wonderlands looks dark and unnatural with HDR on, but Devil May Cry 5 looks natural and vibrant. Search for user experiences with these two games and you’ll find reports ranging from the best HDR gameplay ever to absolutely terrible image quality.

It doesn’t help that HDR is generally an afterthought for game developers. Mejia writes that developers “have yet to deliver a standard version of their game’s dynamic range, and creating a separate version for HDR means double the amount of mastering, testing, and QA. Good luck signing that.”

There are numerous examples of developer apathy toward HDR. The recently released Elden Ring, for example, shows terrible judder in complex scenes with HDR and motion blur enabled. Turn off HDR and the problem goes away (even with motion blur still on). And in Destiny 2, HDR calibration was broken for four years. HDTVTest discovered that the slider wasn’t mapping brightness correctly back in 2018, and the issue wasn’t fixed until February 2022 with the release of The Witch Queen expansion.

Games are one source of PC HDR’s problems, but it’s a downstream problem: one stemming from a gaming monitor market that seems frozen in time.

The monitor problem

Alienware QD-OLED monitor in front of the window.

Even with the many HDR bugs Windows has caused in recent years, monitors are the main source of HDR problems. Anyone familiar with display technology can rattle off the issues without a second thought, and that’s the point: after years of HDR monitors flooding the market, displays are largely in the same place they were when HDR first came to Windows.


Conventional wisdom says that good HDR requires at least 1,000 nits of maximum brightness, which is only partially true. Brighter screens help, but only because they enable higher contrast. For example, the Samsung Odyssey Neo G9 has twice the maximum brightness of the Alienware 34 QD-OLED, but the Alienware display offers much better HDR thanks to its vastly higher contrast ratio.
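A quick back-of-the-envelope calculation shows why contrast, not raw brightness, does the heavy lifting. The figures below are illustrative, not measurements of either monitor:

```python
# Contrast ratio is simply peak brightness divided by black level.
# Illustrative numbers only, not measured values for the monitors named above.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# A very bright LCD with a mediocre black level...
bright_lcd = contrast_ratio(peak_nits=2000, black_nits=0.5)       # 4,000:1

# ...versus an OLED that is half as bright but can switch pixels nearly off
# (a tiny non-zero black level is used here to avoid dividing by zero).
dimmer_oled = contrast_ratio(peak_nits=1000, black_nits=0.0005)   # 2,000,000:1

print(f"{bright_lcd:,.0f}:1 vs {dimmer_oled:,.0f}:1")
```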

There are three things a display needs to achieve good HDR performance:

  1. High contrast ratio (10,000:1 or more)
  2. Dynamic HDR metadata
  3. Extended color gamut (more than 100% of sRGB)

TVs like the LG C2 OLED are so desirable for console gaming because OLED panels provide a great contrast ratio (1,000,000:1 or higher). Most LED monitors max out at 3000:1, which isn’t enough for solid HDR. Instead, monitors use local dimming, independently controlling the light in specific parts of the screen, to increase contrast.
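Roughly speaking, a local dimming controller splits the frame into a grid of zones and drives each zone’s backlight based on the brightest content inside it. Here’s a toy sketch of that idea (real controllers are far more sophisticated), which also hints at why a handful of zones isn’t enough:

```python
# A toy model of full-array local dimming: split the frame into zones and
# set each zone's backlight from the brightest pixel it contains.
# Real dimming algorithms are far more sophisticated; this only illustrates
# why having just 8 or 16 zones limits contrast so badly.

def zone_backlight(frame, rows, cols):
    """frame: 2D list of pixel brightness values in the range 0.0 to 1.0."""
    h, w = len(frame), len(frame[0])
    zones = [[0.0] * cols for _ in range(rows)]
    for y in range(h):
        for x in range(w):
            zy, zx = y * rows // h, x * cols // w
            zones[zy][zx] = max(zones[zy][zx], frame[y][x])
    return zones

# A mostly black frame with one bright highlight.
frame = [[0.0] * 64 for _ in range(64)]
frame[10][10] = 1.0

few  = zone_backlight(frame, rows=2, cols=4)     # ~8 zones, Odyssey G7 territory
many = zone_backlight(frame, rows=32, cols=32)   # ~1,000 zones, mini-LED territory

lit = lambda zones: sum(v > 0 for row in zones for v in row)
print(lit(few), "of 8 zones lit")      # that one zone covers 1/8 of the screen
print(lit(many), "of 1024 zones lit")  # that one zone covers 1/1024 of the screen
```

With only eight zones, a single bright highlight forces the backlight up across an eighth of the screen, washing out the surrounding blacks; with a thousand zones, the glow stays local.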

A colorful image on the OLED screen of the LG C2.

However, even high-end gaming monitors ($800+) don’t come with enough zones. The LG 27GP950-B has only 16, while the Samsung Odyssey G7 has an embarrassing eight. To get a really high contrast ratio, you need far more zones, like the Asus ROG Swift PG32UQX and its more than 1,000 local dimming zones, a monitor that costs more than building an entire new PC.

The vast majority of HDR monitors do not even reach the minimum. At Newegg, for example, 502 of the 671 HDR gaming monitors currently available only meet VESA DisplayHDR 400 certification, which does not require local dimming, extended color gamut, or dynamic metadata.

An example of local dimming on a Vizio TV.

Paying up for the best experience isn’t new, but the market has been stuck this way for four years. Instead of high-end features trickling down to the mainstream, the market is awash with monitors that can advertise HDR without offering any of the features that make HDR worthwhile in the first place. And the sub-$1,000 monitors that tick the HDR box usually get away with it by offering a handful of local dimming zones and poor color coverage.

There are exceptions, such as the Asus ROG Swift PG27UQ, which offers an excellent HDR gaming experience. But the point is that the vast majority of monitors available today aren’t much different from the monitors available four years ago, at least in terms of HDR.


Light at the end of the tunnel

Ultra-wide, curved QD-OLED monitor.

The PC HDR experience has been largely static for four years, but that’s changing thanks to a fancy new display technology: QD-OLED. As the Alienware 34 QD-OLED shows, this is the panel technology that will really push HDR forward in PC gaming. And, good news for gamers, they won’t have to spend upwards of $2,500 to get their hands on it.

MSI has just announced its first QD-OLED monitor with specs identical to the Alienware’s, and I assume it will use the exact same panel. If that’s the case, we should see a wave of 21:9 QD-OLED monitors early next year.

We’re also seeing more OLED monitors, like the recently announced 48-inch LG 48GQ900. Sure, these are TVs marketed as gaming monitors, but display manufacturers are clearly tuning in to gamers’ demand for OLED panels. Here’s hoping we see one in the size of an actual monitor.

There are other display technologies that promise better HDR performance, such as mini-LED. But QD-OLED is a seismic shift that will hopefully make HDR a reality for PC gaming at last.
