
Better Calibration to Eliminate Headaches

So my colleagues asked me to ensure the fisheye video looks good in our client-side SDK. Sound calibration is always important, but it matters even more for stereoscopic content. When the calibration is even a little off, the stereo won’t work, or it will look odd. Your eyes may have trouble adjusting. Even worse, it might give the user a splitting headache. And believe me, having looked at many badly calibrated videos, I have first-hand experience.

I used to think of calibration as an art: painstakingly working with videos on my screen, trying to find straight lines in them, measuring those lines, doing math on the sensor density, and so on. More art than science, really. Because we only worked with a handful of combinations, doing the calibration manually was not a problem. But recently we have been getting more requests, for more combinations, so we needed something less time-consuming and less headache-inducing.


OCamCalib to the Rescue

Last week, while reading about Intel’s XCam open-source stitching software, I stumbled upon a nifty tool called OCamCalib.

Provided by Davide Scaramuzza of ETH Zürich, this “Omnidirectional Camera Calibration Toolbox for Matlab” does something that is nothing short of amazing. Print a chessboard, move it around in front of the camera you want to calibrate, and record a video while you do. Next, drop the video into OCamCalib and out come all the parameters we need for a polynomial fisheye camera model. There can be quite a few parameters, and to keep things manageable we limit the number to 39 (yes, thirty-nine).
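
To give a feel for what those parameters do, here is a minimal sketch in Python of the kind of back-projection the Scaramuzza polynomial model describes: a pixel is corrected for sensor misalignment and then mapped to a 3D viewing ray whose z-component is the calibrated polynomial evaluated at the pixel’s distance from the distortion centre. The function name and the coefficients are made up for illustration; the real values are whatever OCamCalib prints out for your camera.

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, poly, xc, yc, c, d, e):
    """Map a fisheye pixel (u, v) to a unit 3D viewing ray, following the
    Scaramuzza-style polynomial camera model used by OCamCalib.

    poly    -- polynomial coefficients a0..aN (a0 + a1*rho + a2*rho^2 + ...)
    xc, yc  -- distortion centre in pixels
    c, d, e -- affine parameters correcting sensor/lens misalignment
    """
    # Undo the affine misalignment A = [[c, d], [e, 1]]
    inv_det = 1.0 / (c - d * e)
    x = inv_det * ((u - xc) - d * (v - yc))
    y = inv_det * (-e * (u - xc) + c * (v - yc))

    # Radial distance from the distortion centre
    rho = np.hypot(x, y)

    # The z-component is the calibrated polynomial evaluated at rho
    z = np.polyval(poly[::-1], rho)

    ray = np.array([x, y, z])
    return ray / np.linalg.norm(ray)

# Illustrative numbers only -- real coefficients come out of OCamCalib
poly = [-250.0, 0.0, 6.0e-4, -1.0e-7, 2.0e-10]   # a0..a4
print(fisheye_pixel_to_ray(960.0, 540.0, poly,
                           xc=960.0, yc=540.0, c=1.0, d=0.0, e=0.0))
```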

I did learn that this requires clean video feeds, without any “correction” plugins applied in post. One such video kept me busy for a day, trying to understand why I was getting strange-looking results. But then I understood that some 3D processing had been applied, and once I got the raw camera output, it worked like a charm.

Useful in ClearVR Cloud and the ClearVR SDK

Armed with the new toolbox, we are now re-calibrating our fisheye settings. They were already quite good, but now we can make them even better. This is important because ClearVR applies the fisheye mapping in two places in our end-to-end chain.

First, we ingest fisheye video into ClearVR Cloud and convert it to a 180° cubemap before doing a tiled transcode. We do this for the live matches in Sky Worlds, for instance. Second, our client-side SDK is able to take non-tiled 180° fisheye video and apply the same mapping.
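
Our cloud and SDK implementations are of course not a handful of Python lines, but the general idea of such a remapping can be sketched as follows: for every pixel of a cubemap face, compute its viewing direction and project that direction back into the fisheye source, here using the inverse polynomial that OCamCalib exports alongside the forward model. Function names, sign conventions and all numbers below are illustrative only.

```python
import numpy as np
import cv2  # used only for the final sampling step

def ray_to_fisheye_pixel(dirs, invpol, xc, yc, c, d, e):
    """Project direction vectors onto the fisheye image using an inverse
    polynomial rho(theta), where theta is the angle w.r.t. the image plane."""
    x, y, z = dirs[..., 0], dirs[..., 1], dirs[..., 2]
    norm = np.where(np.hypot(x, y) == 0, 1e-9, np.hypot(x, y))
    theta = np.arctan2(z, norm)
    rho = np.polyval(invpol[::-1], theta)          # radial distance in pixels
    u, v = x / norm * rho, y / norm * rho
    # Re-apply the affine misalignment and shift to the distortion centre
    return u * c + v * d + xc, u * e + v + yc

def cube_face_to_fisheye_map(face_size, invpol, xc, yc, c, d, e):
    """Build a per-pixel sampling map for one (front) cubemap face."""
    # Front-face grid spanning [-1, 1]; z = -1 is taken as "straight ahead",
    # but the sign convention depends on the calibration output.
    s = (np.arange(face_size) + 0.5) / face_size * 2.0 - 1.0
    gx, gy = np.meshgrid(s, s)
    dirs = np.stack([gx, gy, -np.ones_like(gx)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    map_u, map_v = ray_to_fisheye_pixel(dirs, invpol, xc, yc, c, d, e)
    return map_u.astype(np.float32), map_v.astype(np.float32)

# Placeholder numbers, not a real calibration
invpol = [650.0, 400.0, 90.0]                       # rho(theta) coefficients
map_u, map_v = cube_face_to_fisheye_map(512, invpol, xc=960.0, yc=540.0,
                                        c=1.0, d=0.0, e=0.0)
frame = np.zeros((1080, 1920, 3), np.uint8)         # stand-in for a decoded frame
face = cv2.remap(frame, map_u, map_v, cv2.INTER_LINEAR)
```

In practice you would build the sampling map once per calibration and reuse it for every frame; on a client device, the equivalent lookup would typically live in a shader rather than in OpenCV.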

We also confirmed something that is well known in the industry: stereoscopic video in VR should avoid very close objects. Such objects make it hard for users’ eyes to adjust and trigger the vergence-accommodation conflict. They also make the lack of true parallax in stereoscopic 3DoF video even more apparent – and more annoying.

As always, the proof of the pudding will be in the eating, but I believe this will save sports fans some serious headaches this summer, and make their VR sports experience all the more enjoyable.

Curious? Drop me a note at xavier at tiledmedia.com!

(And thanks, Justin Fiksel, for the picture!)

Tiledmedia is a creator of technology for low-latency delivery of extremely high-resolution content, especially for 360-degree virtual reality and 180-degree panoramic video.

This article originally appeared on the Tiledmedia blog and is reproduced with kind permission.