
Are SDR TVs Superior to HDR TVs in Bright Environments?


Some recent testing of 4K HDR TVs by HDTVTest suggests that as the ambient light in the room increases, the benefit of HDR technology diminishes rapidly (http://tinyurl.com/hnlfmwr). Reviewers came to this conclusion after comparing 2K and 4K Blu-ray content on 4K TVs calibrated for SDR and HDR, with the room lights on.

Specifically, they set up a Samsung KS9000 (fed by an OPPO BDP-103 BD deck) beside a Sony XD93 (fed from a Samsung UBD-K8500 Ultra HD Blu-ray player). Both TVs were calibrated for SDR and HDR (D65 white point and their respective EOTF), and periodically they would swap sources to make sure the difference they were seeing was not caused by the displays (it wasn’t).

They played the same UHD and regular Blu-ray discs in the two players and paused on the same frame on both. Leaving the backlight and contrast untouched (at their maximum values) on the television showing 4K HDR, they then adjusted the backlight and contrast settings on the TV showing 1080p SDR to match the average brightness and contrast of the HDR TV. They then switched inputs, allowing a signal generator to deliver a peak white signal so they could measure the luminance produced by the adjusted settings on the SDR TV.

The team analyzed five 4K BD films and obtained a similar result for all five: the SDR mode had to be set somewhere in the range of 120-220 nits to match the average picture level (APL) of the TV in HDR mode, which was set to maximum brightness.
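For readers unfamiliar with the metric, APL is essentially the mean luminance of a frame. The sketch below illustrates the idea expressed in nits rather than the percentage-of-peak-white form APL is sometimes quoted in; the function name and the frame values are our own illustration, not HDTVTest data.

```python
import numpy as np

def average_picture_level(frame_nits):
    """Average picture level (APL): the mean luminance of a frame,
    here expressed in nits rather than as a percentage of peak white."""
    return float(np.mean(frame_nits))

# Hypothetical UHD frame: a mostly dark scene with one small bright highlight
frame = np.full((2160, 3840), 5.0)   # 5 nit background
frame[:100, :100] = 1000.0           # small 1,000 nit highlight
print(round(average_picture_level(frame), 1))  # about 6.2 nits
```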

The problem they found is that as the light in the room increases, it competes with the light from the TV: reflected room light raises the effective black level and decreases contrast. This can be overcome to a degree by increasing the luminance of the display.
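To see why reflected room light hurts contrast, here is a rough back-of-the-envelope sketch. The screen reflectance and the luminance and illuminance figures are assumptions chosen for illustration, not measurements from the HDTVTest article.

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance=0.05):
    """Approximate on-screen contrast once room light reflects off the panel.

    Assumes the screen reflects roughly `reflectance` of the incident light
    diffusely, so the reflected luminance (nits) is about
    illuminance (lux) * reflectance / pi.
    """
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# Illustrative numbers only: a panel with 0.05 nit blacks and a 400 nit peak
print(effective_contrast(400, 0.05, ambient_lux=0))    # ~8000:1 in a dark room
print(effective_contrast(400, 0.05, ambient_lux=200))  # ~125:1 with the lights on
```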

Now here’s the rub. The SDR mode TV still had headroom to go brighter, but the HDR mode TV was already at its maximum brightness. As a result, detail that had been visible in the dark regions of the HDR mode TV when the room was dim was no longer visible, while raising the light output of the SDR mode TV actually made more detail visible in the darker regions.

In HDR mode, the TV is forced to its maximum luminance setting and content is played back at the exact level it was mastered to. If a part of a scene was mastered at 50 nits by the colorist, it will play back at 50 nits on the TV in HDR mode. There is no way to make this brighter while in HDR mode.
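This is because HDR10 content is encoded with the SMPTE ST 2084 (PQ) EOTF, which maps each code value to an absolute luminance. A minimal sketch of that transfer function is below; the constants come from the ST 2084 specification, and the example signal value is only illustrative.

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: map a normalized signal value (0-1)
    to an absolute luminance in nits (cd/m^2)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000 * y  # PQ is defined up to 10,000 nits

# A mid-range code value decodes to the same absolute luminance on every
# compliant HDR display; there is no backlight-style scaling to apply.
print(round(pq_eotf(0.5), 1))  # roughly 92 nits
```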

HDTVTest comparison image (see the original HDTVTest article for the full-size version)

The other complication is that mastering environments are supposed to be dim (perhaps as dim as 5 nits), which means the viewing environment should also be dim to get the intended effect. In practice, though, lots of content is not viewed in a dim environment, so the HDR mode may actually look worse than an SDR mode display with the backlight pumped up. That’s a really troubling conclusion.

The bottom line is that HDR content is mastered in a very dim room and is best viewed in the same environment. Once you turn the lights on, the advantages of HDR can quickly diminish. – CC

Analyst Comment

The effect of ambient light on contrast and dynamic range is not a new one. Ray Soneira of DisplayMate has highlighted it, especially for mobile devices, and the projection community has talked a lot about the effect of light from around the auditorium (for example from emergency signs) on HDR. However, this is the first time I have heard of this issue for TVs, although the logic is sound. Of course, display measurement is almost always performed in black rooms and few ever talk about performance in higher ambient light.

The other significant question from this is whether retailers will be able to show the clear difference between SDR & HDR sets in store. (BR).