Fraunhofer has a Digital Cinema Alliance, a kind of “horizontal” grouping of several Fraunhofer institutes (IIS, IDMT, HHI etc.) that work together on digital cinema and TV topics. The IIS has worked on MP3, JPEG and other media areas, as well as developing test material for Digital Cinema Initiatives (DCI). The IDMT in Thuringia has been working on 3D “object-oriented” audio that it calls “Spatial AV”. This system records the positions of different sound sources in 3D space and then, like Dolby Atmos, renders the audio to suit the particular speaker or headphone set-up.
The HHI in Berlin works on image processing (and runs the 3DIC that we reported on in our IFA report). It has been working on UltraHD HEVC and has developed software for real-time encoding and decoding, as well as multi-screen and adaptive streaming software that it calls “Fraunhofer Focus”. The technology has been in R&D for some time, and we heard that Fraunhofer is “near to product” with this project.
A project that we reported on from the 3DIC event was the Omnicam 360° camera system, but our attention was caught by a multi-camera lightfield system created by the Fraunhofer IIS in Erlangen.
The system uses an array of multiple low-cost cameras to capture a “light field”. The cameras can be relatively cheap because the smaller sensors in lower-cost cameras actually help to create the long depth of field that is needed. Tests have been done on 3 × 5 (demonstrated at IBC), 4 × 4 and 3 × 3 arrays. Dr Frederik Zilly of the IIS told us that wide arrays are typically more useful than square or tall ones, as camera movements are usually from side to side rather than up or down.
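The depth-of-field point can be illustrated with the standard hyperfocal-distance approximation; the lens and sensor figures below are our own illustrative assumptions, not Fraunhofer's:

```python
# Illustrative sketch (assumed figures, not Fraunhofer's): why small
# sensors give long depth of field. The hyperfocal distance is roughly
# H = f^2 / (N * c), with focal length f, f-number N and circle of
# confusion c, which scales down with sensor size.

def hyperfocal_m(focal_mm, f_number, coc_mm):
    """Approximate hyperfocal distance in metres."""
    return (focal_mm ** 2) / (f_number * coc_mm) / 1000.0

# Full-frame sensor: 50 mm lens at f/4, c of about 0.030 mm
print(hyperfocal_m(50, 4.0, 0.030))  # about 20.8 m: only distant scenes are fully sharp

# Small 1/2.3" sensor at a similar field of view: ~9 mm lens, c of about 0.005 mm
print(hyperfocal_m(9, 4.0, 0.005))   # about 4 m: almost everything beyond a few metres is sharp
```

The shorter focal length of the small-sensor camera dominates the squared term, which is why the cheap cameras in the array can keep the whole scene in focus.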
The individual images are processed to derive “disparity maps” between the views, creating a kind of “cloud” of visual data. This cloud can then be rendered through “virtual” cameras that can have arbitrary lens types and can support zooming, focusing and motion, including special effects such as simultaneous zoom and motion.
At the show, the Fraunhofer IIS was showing an Avid plug-in to allow the dynamic manipulation of the views.
The institute is in discussions with studios, and the technology might be used to capture live action in 2015; some “early adopters” are already discussing its use.
We interviewed Dr Zilly; a video of the interview is at https://vimeo.com/106360991
Display Daily Comments
The really fascinating point of the Fraunhofer concept is that, if the virtual camera proves as practical and as high in quality as it looked, the same captured content could be used to create 2D content as well as optimised S3D or autostereoscopic content.