
GundamGuru


What is 4K HDR?

There's been a lot of talk lately about the 4K HDR capabilities of Sony's new PS4 Pro and PS4 Slim, and whether you'll need a new television to go with your new console. My goal with this, my first blog, is to cut through the marketing jargon and lay out what this new HDR is and how it relates to 4K, your TV, and your console.

HDR - High Dynamic Range

Many photography enthusiasts and PC gamers are already familiar with the term HDR. It's been an option in both high-end cameras and PC games for years. Why is it new in TVs now?

[Image: Half-Life 2: Lost Coast, from 2004]

The New Technology

[Image: Comparison of HD, HDR10, and Dolby Vision color spaces]

The new TV technology isn't HDR in this traditional sense. Rather, it's a set of standards called HDR10 that defines a minimum contrast ratio for your TV, requires that the TV accept 10-bit color (with a new, wider color space), and calls for 4K resolution. As I'm sure video guys like @drewbert and @vinny will know, the regular old High Definition standard has an associated color space called Rec. 709 that defines the range of colors it can describe. Color data in HD is typically encoded at 8 bits per channel, but it's interesting to note that the HD standard actually allows 10-bit color as an option. Older HDTVs supporting 10-bit color were marketed under names like xvYCC or Deep Color depending on the brand, and the PS3 actually had support for Sony's xvYCC. The additional colors are mostly at the red and green ends of the spectrum, so that's where you'll notice the biggest difference.
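To put that bit-depth difference in concrete numbers, here's a quick back-of-the-envelope calculation. This little Python snippet isn't anything HDR-specific, just arithmetic showing how many shades each encoding can represent per channel and in total:

    # Shades per channel and total colors for 8-bit vs. 10-bit encoding.
    for bits in (8, 10):
        shades_per_channel = 2 ** bits          # 256 vs. 1024
        total_colors = shades_per_channel ** 3  # three channels: R, G, B
        print(f"{bits}-bit: {shades_per_channel} shades per channel, "
              f"{total_colors:,} colors total")
    # Prints roughly 16.8 million colors for 8-bit and 1.07 billion for 10-bit.

That jump from about 16.8 million to over a billion possible colors is why 10-bit panels can show much smoother gradients without visible banding.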

So the new HDR10 spec makes mandatory a technology we've already had, but it also sets requirements for minimum black and white levels. Speaking of which, the white level of your TV is just a measurement of how bright it can get in cd/m² (commonly referred to as nits). The average smartphone or computer monitor reaches the mid-300s, but the HDR10 standard targets 1000 nits as a minimum and allows a peak white level of up to 4000 nits. Black levels are measured the same way, but you want them as low as possible, with a good LCD HDTV getting below 0.05 nits. Contrast ratio, then, is just the ratio between these two numbers (300 / 0.05 = 6000:1). This is that "pop" you'll hear a lot of people talk about when they see HDR in person, and if you're coming from a particularly poor TV, or an older technology like a CCFL-backlit panel, you'll really notice the difference in contrast and total brightness.
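If you want to play with those figures yourself, here's a tiny Python sketch (the luminance values are just the illustrative numbers from above, not measurements of any particular TV) that turns white and black levels into a contrast ratio:

    # Contrast ratio is peak white luminance divided by black level,
    # both in cd/m² (nits). Example values only.
    def contrast_ratio(white_nits, black_nits):
        return white_nits / black_nits

    print(f"Typical monitor: {contrast_ratio(300, 0.05):,.0f}:1")   # 6,000:1
    print(f"HDR10 target:    {contrast_ratio(1000, 0.05):,.0f}:1")  # 20,000:1

Notice that lowering the black level raises the ratio just as much as boosting peak brightness, which is why both numbers matter.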

Finally, it's important to emphasize that while 4K support doesn't require or imply HDR10, the reverse is true: you're not going to find HDR-compliant TVs that aren't 4K. There's also a competing standard out there called Dolby Vision. It's essentially the same idea as HDR10, but it sets a higher bar and requires TVs that pass Dolby's own certification tests.

What is the bottom line?

HDR10 is really all about what your TV is guaranteed to be able to do, and having your consoles and devices support HDR10 just means they can speak the same language when those higher capabilities are available. You'll see higher overall brightness, darker blacks, and more shades of color. None of these effects require much more processing power from your device, which is why Sony can simply patch HDR support into all the PS4s it has already shipped. Bottom line: when you buy a new TV, you'll want support for the HDR10 standard, but is it worth throwing out your existing 1080p set? That's for you to decide.
