
NAB Provides an Eye-full (and an Ear-full)…

I recently attended the Future of Cinema Conference and NAB to learn more about UHD, HDR, WCG, VR and more. And did I ever get an eyeful and an earful…

One of the highlights was the Lytro Cinema camera. This is a monster at nearly 7 feet long (over 2 m) that can capture 755 megapixels at 120 fps. As a light field camera, it also captures the angle of the light arriving at each pixel, enabling a post-production process that can change the focus, depth of field, point of view, frame rate, shutter angle and more.
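To put that capture rate in perspective, here is a back-of-the-envelope calculation of the raw sensor throughput. The 10-bits-per-sample figure is my assumption for illustration, not a published Lytro specification.

```python
# Back-of-the-envelope raw data rate for 755 megapixels at 120 fps.
# BITS_PER_SAMPLE is an assumed figure, not a published Lytro spec.

MEGAPIXELS = 755
FPS = 120
BITS_PER_SAMPLE = 10  # assumption for illustration

bits_per_second = MEGAPIXELS * 1e6 * BITS_PER_SAMPLE * FPS
gigabytes_per_second = bits_per_second / 8 / 1e9

print(f"~{gigabytes_per_second:,.0f} GB/s of raw sensor data")  # ~113 GB/s
```

Even under this conservative assumption, the camera is moving on the order of a hundred gigabytes of data every second, which explains why it is paired with dedicated storage and cloud-based processing.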

The light field data yields accurate depth information, so stereoscopic images can be created as well. The camera is essentially building a 3D model of the live-action scene. This is profound: it means a CG 3D VFX model can be perfectly merged with the live-action 3D video model to create images that have not been possible before.

The team even showed a short video made to highlight what the camera, post-production tools and cloud-based services can offer potential content creators. This IS the future of professional content creation, so it was exciting to witness.

Also at the Future of Cinema Conference was a screening of an 11-minute clip from Ang Lee’s new film, “Billy Lynn’s Long Halftime Walk.” It was shot and projected in 3D, 4K, WCG and 120 fps. This is not a cinematic look, but it may also be the most impressive 3D you will have seen to date. There is no ghosting; the images are crisp, clear and lifelike. The Iraq battle scenes are so lifelike that watching them was a harrowing experience. The film is due in November, but I suspect only a few theaters will actually be able to screen it in all its 120 fps glory.

At NAB, the UHD Forum released a new guidelines document focused on how to deploy UHD and HDR services in 2016. It evaluates production and infrastructure issues, with particular attention to High Dynamic Range (HDR) options such as Dolby Vision, Technicolor/Philips, HDR10 and Hybrid Log-Gamma (HLG).

The result is a recommendation to focus on non-metadata-based solutions like HLG10 and PQ10. Both are 10-bit implementations, and PQ10 in particular is HDR10 without metadata (just the SMPTE ST 2084 PQ EOTF).
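For reference, the ST 2084 EOTF that PQ10 relies on is a fixed mathematical curve mapping a non-linear signal value to absolute display luminance, with constants published in the standard. Below is a minimal sketch; the 10-bit code value in the example is mine, chosen to show where 100 nits lands on the curve.

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal -> luminance in cd/m^2.
# Constants are the exact values published in the standard.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875
PEAK_NITS = 10000        # PQ is defined up to 10,000 cd/m^2

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal (0.0-1.0) to luminance in cd/m^2."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return PEAK_NITS * (num / den) ** (1 / M1)

# A 10-bit code value of 520 (out of 1023) lands near 100 nits,
# showing how PQ packs the SDR luminance range into the lower codes.
print(f"{pq_eotf(520 / 1023):.1f} nits")  # ~99.9 nits
```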

In the meantime, metadata-based solutions need good answers for handling video switching, encode/decode steps, ad insertion and other issues before they can be recommended. On the other hand, Technicolor was demonstrating a live HDR channel doing all of these things effectively, so the debate will continue.

Also important was the first public side-by-side demonstration of static vs. dynamic metadata. Samsung ran the demo, showing that many scenes, especially darker ones, benefit from the ability to optimize the image using dynamic metadata. SMPTE ST 2094, the standard for dynamic metadata, is now in balloting, I am told. The sketch below illustrates why per-scene metadata helps.
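This is a toy example, not the ST 2094 algorithm: with static metadata the display tone-maps every scene against the whole program’s peak luminance, while dynamic metadata lets it adapt to each scene’s own peak. The display peak, scene values and linear mapping are all illustrative assumptions (real displays use curves, not linear scaling).

```python
# Toy illustration of static vs. dynamic metadata tone mapping.
# All values and the linear mapper are assumptions for illustration.

DISPLAY_PEAK = 500.0  # assumed display capability in nits

def tone_map(pixel_nits: float, content_peak: float) -> float:
    """Trivial linear tone mapper: scale the content peak to the display peak."""
    return pixel_nits * min(1.0, DISPLAY_PEAK / content_peak)

program_peak = 4000.0  # static metadata: one value for the whole film
scene_peak = 600.0     # dynamic metadata: this (dark) scene's actual peak

pixel = 120.0  # a mid-shadow detail, in nits
print(tone_map(pixel, program_peak))  # 15.0  -> shadow detail crushed
print(tone_map(pixel, scene_peak))    # 100.0 -> shadow detail preserved
```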

I also asked many companies in the ecosystem about their plans for HDR. All are working on it, some moving ahead and others waiting for standards to develop or a consensus to emerge. It feels a little bit like the rush to 3D 4-5 years ago, but, I think, with more deliberation on how and when to jump in. That’s a good thing.

In the VR world, there were lots of demos and exhibits. Of note, perhaps, is the ability to live stream a 360-degree VR image in a 4K video frame, complete with stitching, directly to a VR headset. The headsets still don’t have the image quality, and in my opinion the content is still not good enough, but the technology is advancing in leaps and bounds, so it is only a matter of time.
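For context on what fitting a 360-degree image into a 4K frame means in practice, here is a minimal sketch of equirectangular mapping, the common way a full sphere is packed into a rectangular video frame. The 3840x1920 frame size and the function name are illustrative assumptions, not details from any NAB demo.

```python
# Equirectangular mapping: a viewing direction on the sphere maps to a
# pixel in a 2:1 rectangular frame. Frame size is an assumed example.

FRAME_W, FRAME_H = 3840, 1920  # 2:1 equirectangular frame in a 4K raster

def direction_to_pixel(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Map a direction (yaw: -180..180, pitch: -90..90) to frame coordinates."""
    u = (yaw_deg + 180.0) / 360.0   # longitude -> horizontal position
    v = (90.0 - pitch_deg) / 180.0  # latitude  -> vertical position
    x = min(int(u * FRAME_W), FRAME_W - 1)
    y = min(int(v * FRAME_H), FRAME_H - 1)
    return x, y

print(direction_to_pixel(0, 0))   # (1920, 960): straight ahead -> frame center
print(direction_to_pixel(0, 90))  # (1920, 0):   straight up    -> top row
```

The trade-off is visible in the math: the entire sphere shares those 3840x1920 pixels, so the portion actually in front of the viewer at any moment gets only a fraction of the frame’s resolution, which is one reason headset image quality still lags.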

And one must not ignore the big coming-out party for ATSC 3.0. This is a very well-thought-out solution based on IP transport that allows for both over-the-air broadcast and broadband delivery. It should be good for a couple of decades and is about to start formal FCC approval, with trials expected shortly as well.

And there’s tons more in this special report.