IFA was, well, IFA. There was not much that was radically new or different this year, apart from the huge new CityCube hall that Samsung occupied. In terms of technology, there was little that was unexpected, but the show was well attended. Key points from the show were that:
* OLED is almost dead in the water for TV
* Samsung is determined to drive curved LCD TV in an attempt to distance itself further from its rivals (although I remain unconvinced)
* Wearables are the “product du jour” for all CE brands, but nobody yet has the “killer product” (and the threat of an Apple launch at the end of the last day of the show was the proverbial “elephant in the room”)
* HDR and wide colour gamut will be attractive for viewers if content and delivery can be managed
* UltraHD is pretty well “job done” for large LCD TVs
* Intel is keen to seriously fight back in mobile devices
I’m sure there are more things we learned (I think so, as there are 20,000 or so words that we have put together!), but I wanted to use the space between IFA and IBC to look at the bigger TV picture.
Over the last year, there has been a lot of talk about “better pixels” not just “more pixels” in TV. Three key technologies really constitute what everyone means by better pixels:
* High Dynamic Range (HDR – more extreme contrast between light and dark on the screen)
* Wide Colour Gamuts (WCG – so that movie content is better reproduced and images of the world are more lifelike)
* Higher Frame Rates (HFR – for better motion imaging).
and of course, UltraHD (3840 x 2160 or beyond) is the “more pixels” component.
Each of these new technologies adds cost, complicates content creation and uses more bandwidth, so there has been a lot of discussion about which is important and why, and you can hear the “axes grinding”. For example, the QD camp acknowledges HDR and HFR, but believes that colour is the key feature (and might even be threatened by HDR if backlight intensities go up or, in the case of QD Vision, if everybody jumps to direct backlighting to support HDR, as its technology works best on edge-lit displays).
At the conclusion of IFA, I have had a bit of a ponder about which of these features is going to be widely adopted and why. After some thought, I realised that the good news is that all of them will be adopted! No single group cares about all of them, but there is a significant technical and economic constituency behind each. So, who cares?
UltraHD is a Done Deal
Let’s get the easy one out of the way. As Fabrice Estornel of Panasonic said at the DisplaySearch conference on Monday, “UltraHD is done for large sized LCD”. Effectively, maybe by next year, you simply won’t be able to buy a big consumer TV (say 55″ and up) from an A brand that is not UltraHD. Of course, for low cost B & C brands and professional applications such as digital signage, FullHD will continue. Some A brand “fighter” models might also be marketed, but not with any conviction. The reality is that, for the LCD makers, making UltraHD panels is relatively easy compared to making mobile and tablet panels, and it will increase ASPs, which panel makers desperately want and need. So, UltraHD is a done deal.
Better Colour and HDR
So what about the others? First let’s consider HDR and gamut together. These have tremendous advantages for owners of movie content. There is a lot of content from movie studios that is available in archives with a DCI P3 or wider gamut for better colours, and Hollywood movies with their fantastic special effects and wide dynamic range can look spectacular when properly re-processed for HDR and wide gamut (although the jury is still out on which technology will be used to support them).
The key point is that studios will want to resell their archives on the next Blu-ray format, or via VOD through Netflix and Amazon, with HDR and WCG. At the DisplaySearch conference, Joe Kane, the veteran visual quality guru, said that although studios have used DCI 4K in production and in processing, they haven’t, typically, archived it. The archives are usually in 1080p. This content can be re-graded for HDR and WCG and then upscaled by TVs and will look fantastic – better than it did when you first saw it at the cinema. (And if you are sceptical, try to get along to IBC to see Dolby’s HDR demos. I’m sure there will be others, but I’ve been watching Dolby’s demos for around seven years and they are compelling.)
The set makers will also like WCG and HDR because they will make sets look much better in store. In the past, in the days of CRT, there was a big difference between the best and worst sets. However, with LCDs all having much the same brightness and colour, the visual differences are less obvious to consumers in store. With HDR and WCG, there will be a big difference between the best and the worst again, so you can expect set makers to be strongly behind this. More differentiation means more margin opportunities, and there are real cost and technical differences between the good ones and the rest. Users will understand the advantages much better than they did with, for example, high frame rates for LCDs.
Furthermore, WCG (probably using quantum dots) allows the LCD makers to compete with OLED in colour terms. Even better for them, in the high ambient light of a TV store, an LCD TV with something between 700 cd/m²