
How to Sell High Dynamic Range or Not

The human visual system has evolved to detect an immense range of brightness in our environment, from the dimness of starlight to the bright intensity of sunlight, thanks to a complex adaptation system. High dynamic range (HDR) is a technology that enables displays to produce a broader and more finely graded range of brightness levels, colors, and contrasts than standard dynamic range (SDR) displays. HDR achieves this by leveraging source content that carries more data about brightness and color than standard content. When viewed on an HDR-compatible display, this content can show brighter highlights, more detailed shadows, and a wider spectrum of colors, providing a more lifelike and immersive viewing experience. HDR technology is often characterized by displays that can achieve higher peak brightness levels (measured in nits) and support wider color gamuts, such as BT.2020, along with greater bit depths, like 10-bit or 12-bit.

That sounds like a display pitch, but does it really have much meaning as a comparison of two TVs or two gaming monitors? Nope. It’s just the display industry navel-gazing and talking to itself.

Brightness levels are typically measured in cd/m² (candelas per square meter), also known as nits. This metric quantifies the luminance of a display, the amount of light emitted per unit area. In psychology, there is a term, just noticeable difference (JND), meaning the smallest change in a stimulus that a person is capable of detecting. It is used in audio engineering, and it is used in display technology. In theory, by understanding how many JND levels a display can produce, developers can ensure that the display covers the full range of perceivable brightness steps without unnecessary redundancy.
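
To connect nits and JND to something concrete, here is a minimal Python sketch of the SMPTE ST 2084 (PQ) transfer function, the curve most HDR content uses to map code values to absolute luminance. PQ was designed around a JND model so that the step between adjacent code values stays near or below one just noticeable difference across its 0 to 10,000-nit range. The constants come from the ST 2084 specification; the step calculation at the end is purely illustrative, not a formal JND measurement.

```python
# Minimal sketch: the SMPTE ST 2084 (PQ) EOTF, which maps a normalized code
# value (0.0 to 1.0) to absolute luminance in nits. The constants come from
# the ST 2084 specification; the curve was derived from a perceptual model so
# that steps between adjacent code values stay near or below one JND across
# the 0 to 10,000-nit range.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Convert a normalized PQ code value to luminance in cd/m^2 (nits)."""
    e = code ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# Illustrative only: the luminance step between two adjacent 10-bit code
# values near mid-range is a small fraction of the luminance itself, which is
# the point of a perceptually based curve.
lo, hi = pq_eotf(512 / 1023), pq_eotf(513 / 1023)
print(f"code 512 -> {lo:.2f} nits, code 513 -> {hi:.2f} nits")
print(f"relative step: {(hi - lo) / lo:.2%}")
```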

Put those two things together and you kind of arrive at a subjective understanding of brightness: you know there is a scientific, quantifiable measure, nits, and you know there is a scientific but subjective measure of what the viewer may, or may not, perceive, JND. So, there’s your first disconnect with real people.

Next, we move on to the notion that if you just have the highest number of nits, you must have the brightest display. Wouldn’t that be nice.

In recent years, advancements in display technology have led to TVs with higher peak luminance values, and some premium models indeed claim to reach 3,000 nits, 5,000 nits, or even more. These high levels of brightness are especially touted in the context of HDR content, which is itself a subjective solution: the display is driven by the content, and the content may have a language with which to speak to the display, but who knows how eloquent the content can truly be.

Nevertheless, TVs are still marketed with the notion that the primary advantage of having higher peak luminance (more nits) is the ability to better showcase HDR content. HDR content contains more information about brightness and contrast than standard content. A TV that can achieve higher nits can display brighter highlights and more detail in bright areas of the image, making HDR content look more dynamic and realistic.

High nit values, especially when combined with good black levels (as seen in OLED TVs), can lead to better contrast ratios. The contrast ratio is the ratio between the brightest white and the darkest black that the TV can produce. A higher peak brightness can also make a TV more suitable for viewing in bright rooms or during daylight, as the screen can remain visible and vibrant even in the presence of ambient light.
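
For the arithmetic itself, here is a small illustrative sketch; the peak-white and black-level figures are invented for the example and do not describe any particular display.

```python
# Illustrative arithmetic only: contrast ratio is peak white luminance divided
# by black-level luminance. The numbers below are invented for the example and
# do not describe any particular display.

def contrast_ratio(peak_white_nits: float, black_level_nits: float) -> float:
    """Ratio between the brightest white and the darkest black a display produces."""
    return peak_white_nits / black_level_nits

# A hypothetical bright LCD with an elevated black level.
lcd = contrast_ratio(peak_white_nits=1500.0, black_level_nits=0.05)

# A hypothetical OLED: lower peak, but a near-zero black level
# (0.0005 nits stands in for "effectively black" to avoid dividing by zero).
oled = contrast_ratio(peak_white_nits=800.0, black_level_nits=0.0005)

print(f"LCD:  {lcd:,.0f}:1")   # 30,000:1
print(f"OLED: {oled:,.0f}:1")  # 1,600,000:1
```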

So, higher nit values should be a selling point for manufacturers, as consumers may perceive TVs with higher brightness levels as superior, especially when comparing specifications side by side. However, while peak brightness is an important factor, it’s just one aspect of picture quality. Other factors, such as contrast ratio, color accuracy, and black levels, are equally, if not more, important for an overall excellent viewing experience.

It’s also worth noting that while a TV might be capable of reaching such high nit values, that doesn’t mean it will always operate at that level. Often, these peak brightness levels are achieved only in specific modes or situations (like when displaying a small bright object against a dark background). In many real-world viewing scenarios, the average brightness might be lower.
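
To put a number on that, here is a bit of illustrative arithmetic; all values are invented for the example, not measurements of any real TV.

```python
# Illustrative arithmetic only: why a "3,000-nit" TV rarely runs at 3,000 nits.
# Imagine a frame where a 10% window hits the advertised peak while the rest of
# the picture sits at a modest level. All values are invented for the example.

peak_nits = 3000.0       # advertised peak, typically reachable only on small windows
highlight_area = 0.10    # the bright window covers 10% of the screen
surround_nits = 50.0     # the rest of the image

average_nits = highlight_area * peak_nits + (1 - highlight_area) * surround_nits
print(f"Frame-average luminance: {average_nits:.0f} nits")  # 345 nits
```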

So, you’re pretty much left with some interesting theoretical reasoning about your display’s performance, very subjective means of figuring out how it compares to other displays, and equally subjective means of figuring out whether it actually fits you, your content preferences, and your viewing space.

HDR is a bit of a mess as a measure of anything, although it may be ubiquitous. A lot depends on the combination of hardware and software in your TV. There are competing HDR standards, which could be discussed ad nauseam (HDR10, Dolby Vision, HLG, etc.). There is also the issue of the totality of HDR performance—not just brightness, but color accuracy and color palette. It all adds to an actual viewing experience that may be too intense or too vivid for some people.

Probably why the best thing to do is to just dump a giant warehouse of TVs in front of people, have them all running the same content, and let people look at the prices and judge the quality with their own naked eyes. Because you probably can’t convince people on specs alone, and the specs are not reliable in purely objective terms anyhow. Maybe HDR isn’t any good as a measure of anything meaningful to the people who don’t work in the display industry—you know, the ones who have to pay for the extra hardware and software. It’s a bit elitist. Gamut rings, anyone?