CNN made waves on Tuesday night by incorporating "three-dimensional holograms" into its coverage of the U.S. election. The only problem is, they were not really holograms.
Insight Media Consultant
At about 7 p.m. EST, reporter Jessica Yellin, who was in Chicago, spoke with New York-based anchor Wolf Blitzer live "via hologram," CNN said. Yellin appeared somewhat fuzzy, and her image, apparently projected a few feet in front of Blitzer, glowed around the edges.
Yellin explained that her image was being filmed using three-dimensional imaging technology produced by Vizrt Ltd. (Bergen, Norway; www.vizrt.com) and SportVU Ltd. (Kfar-Sava, Israel; www.sportvu.com). The hologram interview was billed as a first for television. CNN also aired a second hologram interview between anchor Anderson Cooper and rapper Will.I.Am, who was also in Chicago.
A video that includes the hologram interviews from the CNN broadcast is available online at http://edition.cnn.com/video/?/video/tech/2008/11/05/hologram.natpkg.behind.the.scenes.cnn. The video also includes materials describing and illustrating how the holograms were produced. Here is a quick summary.
First of all, the images can more accurately be described as tomograms: images that are captured from all sides, reconstructed by computer, and then displayed on screen. While viewers saw two figures talking to each other, the "holographic image" was actually added to the studio feed like a special effect.
CNN’s Election Center in New York was equipped with six tracking cameras. The tracking hardware combined Shotoku dollies and Thoma handheld Walkfinders. The tracking data allows the company’s graphics engines to know where each camera is located and, very precisely, the direction in which it is pointed. Thousands of miles away, two special studios were erected, each containing a semicircular green-screen rig covering 220 degrees, with more than 40 HD cameras positioned along the wall. Each camera captured a specific angle of the person.
The tracking data from the cameras in the CNN Election Center is processed by Viz Virtual Studio software. More specifically, Viz IO, a studio configuration and calibration tool, collects all the data and converts the camera position and focus into 3D coordinates. This tracking and camera-position data is then instantly transmitted to the remote location, the so-called "transporter room."
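To make the conversion step concrete, here is a minimal sketch of turning a tracked camera's pan and tilt angles into a 3D view ray. This is a hypothetical simplification for illustration; Viz IO's actual calibration model (lens data, nodal offsets, zoom) is far richer than two angles and a position.

```python
import math

def track_to_ray(cam_pos, pan_deg, tilt_deg):
    """Convert tracked pan/tilt angles into a 3D view direction.

    Simplified model (an assumption, not the Viz IO internals):
    pan rotates about the vertical (y) axis, tilt about the
    horizontal (x) axis; the result is a unit direction vector.
    """
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    direction = (
        math.cos(tilt) * math.sin(pan),  # x: pan sweeps the floor plane
        math.sin(tilt),                  # y: tilt raises/lowers the gaze
        math.cos(tilt) * math.cos(pan),  # z: forward at pan = 0
    )
    return cam_pos, direction

# A camera at the origin, untilted and unpanned, looks straight down +z.
pos, d = track_to_ray((0.0, 0.0, 0.0), 0.0, 0.0)
```

With position and direction expressed this way, the remote end has everything it needs to reproduce the studio camera's viewpoint.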
In that room, the tracking information is processed to identify the required view angle, that is, which two of the 40 cameras to use. From the two images captured, a Viz Engine plug-in specially developed by SportVU creates a full 3D representation of the person in the rig. The two images are blended, color-corrected, and used as a texture. The 3D model is rendered and textured with the video signal in Viz Engine and sent back to the studio as a full-HD video signal.
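The camera-selection step can be sketched as follows: given the view angle demanded by the studio camera, pick the two rig cameras that bracket it along the 220-degree arc, plus linear blend weights. The even camera spacing and linear interpolation are assumptions for illustration, not details confirmed by SportVU.

```python
def select_rig_cameras(view_angle_deg, num_cams=40, arc_deg=220.0):
    """Pick the two rig cameras bracketing the required view angle.

    Returns two (camera_index, blend_weight) pairs. Assumes the
    cameras are evenly spaced along the arc, which is a guess at
    how the rig described in the article is laid out.
    """
    spacing = arc_deg / (num_cams - 1)        # angular gap between cameras
    idx = view_angle_deg / spacing            # fractional camera index
    lo = max(0, min(num_cams - 2, int(idx)))  # clamp to a valid adjacent pair
    hi = lo + 1
    w_hi = min(1.0, max(0.0, idx - lo))       # weight toward the higher camera
    return (lo, 1.0 - w_hi), (hi, w_hi)

# Looking straight down the middle of the arc falls between cameras 19 and 20.
pair = select_rig_cameras(110.0)
```

The two weighted images would then be blended into the texture that wraps the 3D model, as the article describes.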
Back in the studio, the local and remote images are assembled and some special effects are added (including the aura) to give the full impression of a holographic interview. All of this is completed in a fraction of a second to allow live playout.
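The final composite can be illustrated with a toy per-pixel model: a standard alpha blend of the remote figure over the studio feed, with a bluish halo added wherever a blurred copy of the matte extends past the hard matte. The aura color and blend math here are invented for illustration; CNN's actual effect chain was not disclosed.

```python
def composite_with_aura(studio_px, remote_px, alpha, glow,
                        aura_rgb=(120, 180, 255)):
    """Composite one pixel of the remote figure over the studio feed.

    studio_px, remote_px: (r, g, b) tuples, 0..255.
    alpha: hard matte coverage, 0..1 (1 = fully on the figure).
    glow:  blurred matte coverage, 0..1 (extends past the figure's edge).
    aura_rgb: hypothetical halo color, an assumption for this sketch.
    """
    out = []
    for s, r, a in zip(studio_px, remote_px, aura_rgb):
        base = r * alpha + s * (1.0 - alpha)   # standard alpha blend
        halo = max(0.0, glow - alpha) * a      # aura only outside the figure
        out.append(min(255, int(base + halo)))
    return tuple(out)

# Just outside the figure's edge (alpha 0, glow 0.5) the aura tints the pixel.
edge_px = composite_with_aura((0, 0, 0), (255, 255, 255), 0.0, 0.5)
```

Running this over every pixel of the frame would yield the glowing-edge look viewers saw around Yellin.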
Can we expect to see more of this type of holographic technology in TV programs? Maybe so. This first broadcast certainly had novelty value. Is that enough? Stay tuned for developments.