Is HDR Worth It? – We Take An In-Depth Look

HDR has swiftly become gaming’s new hotness — every developer and hardware manufacturer is plastering it all over their latest projects, and tech outlets are screaming its praises from any soapbox they can find. All of this hype, paired with a lack of quantifiable information, has led to HDR becoming more of a point of confusion than a must-have feature for many consumers. To make matters worse, when it comes to monitors, the technology becomes even harder to define concisely.

That’s what we’re aiming to set straight with this write-up — below, you will find all the need-to-know information about HDR, including what it is, what sets it apart from standard displays, and most importantly, whether or not you should have an HDR-compatible monitor in your checkout cart in an hour or so. Let’s get started.

 

HDR: What It Is, And Why It’s So Popular

HDR is shorthand for high dynamic range, and it’s quickly become one of the digital media industry’s favorite buzzwords. For good reason, too: unlike many examples of “fad” technology, like 3D TVs, smartphone VR, and augmented reality, HDR’s benefits are both tangible and practical. The technique has been a staple of professional-grade photography for years, and it has recently been adopted by Apple on a remarkably broad scale as well.

HDR changes so much about an image that support has to be built in at every step of the chain: the media being played, the display, and, on PCs, the operating system all need to handle HDR at once for the picture to be shown correctly. The result, however, is transformative, and below, we’ll explain why.
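As a very rough illustration of that “everything must agree” rule, here is a minimal sketch – the function and flag names are purely hypothetical, not any real API:

# Minimal sketch of the "everything must agree" rule for HDR playback.
# All names here are hypothetical illustrations, not a real API.

def hdr_output_possible(media_is_hdr: bool,
                        display_supports_hdr: bool,
                        os_hdr_enabled: bool) -> bool:
    """HDR only reaches your eyes when the content, the display,
    and (on a PC) the operating system all support and enable it."""
    return media_is_hdr and display_supports_hdr and os_hdr_enabled

# Example: an HDR10 film on an HDR monitor, but with HDR switched off in the OS.
print(hdr_output_possible(True, True, False))  # False -> you only get SDR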

 

SDR Vs. HDR

You’re probably reading this on an SDR (standard dynamic range) display, and that’s why HDR is such a difficult concept to grasp for a lot of people: you can’t actually see it unless you’re already using an HDR-compatible display! In an attempt to remedy this, advertisers often use image-editing software to tweak contrast and color into a rough approximation of what an image would look like in HDR, but it rarely does the technology justice. To get an idea of what HDR is actually like, head down to a nearby department store – they will almost certainly have a giant 4K TV with HDR running a fancy-looking demo.

If you aren’t able to do so, or you’re just lazy (I’m with ya), you can also see HDR in action on newer Apple devices – if you own an Apple device from the last couple of years, chances are you have an HDR-ready screen waiting in the wings.

We’re going to get into this in more detail later on, but the key to HDR’s magic lies in how drastically it affects your display’s lighting and, by extension, its shadows. Better lighting means better contrast, and better contrast makes colors ‘pop’ more. Deeper shadows and black levels also let smaller details show up more clearly, which, when paired with a 4K resolution, produces a truly breathtaking image.

 

Where Is HDR Most Noticeable?

Most HDR demos show off colorful, energetic landscapes and vistas with lots of shadows and deep tones, and for good reason – scenes like these are where HDR gets to shine. Darker colors and shadow-heavy content are where you’re going to see most of HDR’s benefits. When paired with a 4K resolution, the added clarity and level of detail can make the alternatives seem drab in comparison.

Video games, in particular, stand to gain a lot from HDR technology. Developers can create darker areas with more intensive shading when HDR is available. In SDR, darker areas in video games can appear dull, lifeless, and difficult to navigate; with HDR, dark scenes can be just as visually captivating as their more vibrant counterparts.

Vibrant scenes will benefit from HDR as well, of course. HDR has the capacity to produce colors with range and depth that come very close to being photorealistic. Because of this, scenes with warm hues, like sunsets and lush forests, stand to be drastically improved by HDR.

Of course, movies and TV shows stand to gain a lot from HDR as well. Netflix and other streaming services have already begun developing original content with the specific purpose of showing off HDR’s potential for realism and immersion.

 

Which Types of Content Are Compatible with HDR?

Most of the digital media industry is already working to make HDR a standard. Its Achilles’ heel, ironically, lies in just how new the technology is – older media can’t support HDR in any capacity. Display manufacturers have begun to address this with a sort of emulated HDR that lets non-compatible content achieve a picture quality much closer to true HDR than to SDR. It isn’t the real thing, however, and it won’t match genuine HDR’s visual capabilities.
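To give a feel for what that kind of emulation involves, here is a toy sketch that stretches a normalized SDR value onto a wider brightness range. It is not any manufacturer’s actual algorithm, and the peak-brightness and exponent figures are arbitrary assumptions chosen for illustration:

# Toy "pseudo-HDR" expansion: stretch a normalized SDR value (0.0-1.0) onto a
# wider brightness range. The peak and exponent below are arbitrary examples;
# real displays use far more sophisticated processing.

def expand_sdr_to_nits(sdr_value: float,
                       peak_nits: float = 600.0,
                       exponent: float = 1.5) -> float:
    clamped = max(0.0, min(1.0, sdr_value))
    # An exponent above 1 keeps mid-tones restrained while letting highlights
    # stretch toward the panel's peak brightness.
    return peak_nits * (clamped ** exponent)

for value in (0.1, 0.5, 0.9, 1.0):
    print(f"SDR {value:.1f} -> {expand_sdr_to_nits(value):6.1f} nits")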

As mentioned, video games are a tailor-made fit for HDR technology. Both Sony and Microsoft have already integrated the tech into their flagship home consoles; the PS4 Pro and Xbox One X support HDR10 natively. Because of this swift integration, home consoles are likely where you’ll get the most consistent HDR experience.

PC gaming, however, is a different story. Windows 10, at the time of writing, doesn’t offer seamless native HDR10 support. That means that while HDR10 is certainly possible on a Windows PC, it isn’t perfect, and it’s prone to frequent glitches and visual hiccups. In-game, however, is another matter entirely. Almost every triple-A game on the market now includes a setting for HDR10, which makes turning it on a quick and easy process.

To put it more simply: HDR10 isn’t fully integrated into Windows itself, but it is well supported by the games that run on Windows. Once you’re in a game, your compatibility troubles largely disappear. It also means you won’t be able to enjoy other kinds of digital media in HDR on your PC with the same ease you’d have with a 4K television or home entertainment setup.

 

How Does HDR Improve Image Quality?

We’ve done enough talking about how good HDR looks; it’s time to break down why it looks so good.

Luminance and Chrominance

Luminance and chrominance are by far the most important factors in separating “true HDR” from the rest. Luminance is the intensity of light emitted from a given surface, measured per unit of area (in nits). Because brightness is crucial to color accuracy, higher peak luminance almost always results in a better picture. Chrominance is the color information itself – the colorimetric difference between a displayed color and a neutral reference of the same luminance.

A display with better chrominance reproduction shows colors that are truer to their intended values at any given brightness. HDR brings vastly improved luminance and chrominance compared with SDR displays, which is why the difference in quality is so striking.
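If you want the two terms in concrete form, here is a small sketch that splits an RGB pixel into a luminance value and two rough chrominance (color-difference) values, using the standard BT.709 luma weights; leaving the color differences unscaled is a simplification:

# Split an RGB pixel (each channel 0.0-1.0) into luminance and chrominance.
# 0.2126 / 0.7152 / 0.0722 are the standard BT.709 luma weights; the unscaled
# color-difference values are a simplification for the sake of clarity.

def luma_chroma(r: float, g: float, b: float):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luminance: how bright the pixel is
    cb = b - y                                # chrominance: how "blue" it is vs. that brightness
    cr = r - y                                # chrominance: how "red" it is vs. that brightness
    return y, cb, cr

print(luma_chroma(1.0, 0.5, 0.2))  # a warm orange: fairly bright, red-leaning, blue-negative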

Color Space and Color Gamut

A color space is the system a display uses to map out colors in terms of bits; put simply, a better color space results in more precisely defined colors. Since the image on your screen is millions of individual points of color combined like a puzzle, if each of those individual pieces can be specified more precisely, then so too can the overall image. HDR displays take this concept to a new level with a much wider color gamut – “gamut” literally means “the complete range or scope of something,” which gives you an idea of how much the HDR color spectrum improves on SDR.
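The practical payoff shows up in simple arithmetic: SDR content is typically encoded with 8 bits per color channel, while HDR10 uses 10 bits per channel, and the jump in the number of distinct colors is enormous:

# How much extra color information 10-bit HDR10 carries over 8-bit SDR.

def distinct_colors(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel   # shades per red/green/blue channel
    return shades ** 3               # every combination of the three channels

print(f"8-bit  (typical SDR): {distinct_colors(8):>15,} colors")   # 16,777,216
print(f"10-bit (HDR10):       {distinct_colors(10):>15,} colors")  # 1,073,741,824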

Shadows and Black Levels

Black levels are often cited as HDR’s biggest selling point. They are a notorious problem area for SDR displays, where darker areas tend to appear as a muddled, unclear, blueish wash. HDR tackles this longstanding issue head-on, offering black levels and shadows that deliver striking improvements in overall picture quality. Well-defined shadows, paired with proper black levels, let small details and visual information come through with far greater clarity.
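The relationship between black level and picture quality is easy to put into numbers as a contrast ratio – peak brightness divided by the brightness of “black.” The figures below are illustrative guesses, not measurements of any particular panel:

# Contrast ratio: peak brightness divided by black level, both in nits.
# The example figures are illustrative, not measurements of a specific panel.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"Typical SDR monitor: {contrast_ratio(300, 0.30):>8,.0f}:1")   #  1,000:1
print(f"HDR-capable display: {contrast_ratio(1000, 0.05):>8,.0f}:1")  # 20,000:1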

Nits, Brightness, and Stops

A nit is the unit manufacturers use to measure brightness; one nit is one candela per square meter. The more nits a display can reach, the brighter it can get, and high peak brightness lets bright colors and highlights be displayed with more accuracy and detail – the usual HDR benefit. Monitors need to be careful with brightness, however: because users sit much closer to the screen, a monitor can’t comfortably run at the same brightness as a television.

That’s where stops come in. A stop, a term borrowed from photography, represents a doubling (or halving) of light, and counting the stops between a display’s deepest black and its brightest white is a way of measuring its dynamic range. In other words, a monitor needs both a high nit value and a wide range of stops to employ HDR effectively and comfortably. As a rough benchmark, a monitor is generally expected to reach around 1,000 nits of peak brightness and cover a wide dynamic range before it can credibly be sold as a true HDR10 display.
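Expressed as a quick calculation – again with illustrative brightness figures rather than measurements of any specific monitor – the stop count falls straight out of the peak-to-black ratio:

import math

# A "stop" is a doubling (or halving) of light, so the number of stops a
# display spans is log2(peak brightness / black level). Illustrative figures only.

def dynamic_range_in_stops(peak_nits: float, black_nits: float) -> float:
    return math.log2(peak_nits / black_nits)

print(f"SDR-ish panel (300 nits, 0.30 nit black):  {dynamic_range_in_stops(300, 0.30):.1f} stops")   # ~10.0
print(f"HDR panel (1000 nits, 0.05 nit black):     {dynamic_range_in_stops(1000, 0.05):.1f} stops")  # ~14.3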

 

HDR Software Requirements: Games, Movies, and More

This section is geared toward PC gamers – if you’re not gaming on a PC, a display listed as HDR10-compatible is all you need, and there’s nothing else to worry about in terms of software requirements.

In terms of integrating HDR with Windows, there are a few things unique to the technology. First, HDR, unlike many other kinds of cutting-edge visual enhancements, places little to no strain on your graphics card, so it shouldn’t meaningfully affect your game’s performance, framerate, or CPU usage when in use. It does, however, need to be activated within Windows. As mentioned earlier, pairing Windows with HDR can cause some unwanted glitches and visual noise, so it might be best to enable it on a game-by-game basis – a sketch of that habit follows below. If you’re set on using it system-wide, all you need to do is turn it on in the Windows display settings. That’s it: it’s as easy as changing your wallpaper.
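For what it’s worth, here is a hypothetical sketch of that game-by-game habit. The helper functions are stand-ins for whatever HDR toggle you actually use (the Windows settings switch, a hotkey tool, and so on) – they are not real Windows APIs:

# Hypothetical sketch of the game-by-game habit described above. The helpers
# are stand-ins for whatever HDR toggle you actually use (the Windows settings
# switch, a hotkey tool, etc.) -- they are not real Windows APIs.

def enable_windows_hdr():  print("HDR on")
def disable_windows_hdr(): print("HDR off")
def launch(game):          print(f"playing {game} ...")

HDR_CAPABLE = {"ExampleHDRGame"}   # whichever of your games support HDR10

def play(game: str) -> None:
    if game in HDR_CAPABLE:
        enable_windows_hdr()       # flip the switch only when it will be used
        try:
            launch(game)
        finally:
            disable_windows_hdr()  # back to SDR for the desktop afterwards
    else:
        launch(game)

play("ExampleHDRGame")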

Second, you need a current-generation GPU from either of the two major manufacturers, AMD or Nvidia. With AMD, your card needs to be from the RX line; with Nvidia, you’ll need either a GTX 10-series card or the most recently released Titan.

 

HDR Formats: What You Need To Know

When it comes to formats, your options are limited. HDR10 has been accepted as the standard, and it will be listed as such on any hardware that supports it. On the hardware side, you’ll want an IPS panel; TN panels are generally incapable of delivering HDR, unfortunately. On the plus side, IPS panels are almost universally superior to their TN counterparts anyway. TN panels do have the advantage of reaching 1 ms response times, but even at the highest level of esports, the difference between that and the 4–6 ms response times found on IPS panels is almost nonexistent.

 

So — Is HDR Worth It?

Yes. HDR is already becoming the standard across every digital-media-focused industry, and you’re going to miss out on a lot of amazing experiences if you ignore the technology for much longer. That said, there is one reason you might not be keen on hopping aboard the bandwagon straight away: price. You’re going to have a hard time finding an HDR10 display that isn’t also sporting 4K, which translates to a considerably higher price point than the alternatives. If you’re sorely in need of an upgrade, however, we highly suggest making the investment. Spend the money on a non-HDR display instead and you’ll just find yourself wishing you had upgraded once the gaming industry adopts the tech on a broader scale.

In other words: Do it, you won’t regret it.