Forget 5G. This Is the Technology Apple Is Really Betting On With the iPhone 12.

What They Say

An article on the investment site ‘The Motley Fool’ suggests that the inclusion of Lidar in the iPhone 12 Pro is a bigger deal than many have recognised. The sensor can create an accurate depth field that can be used to manipulate images, and it significantly improves the performance of augmented reality applications. The technology was added to the iPad Pro in the spring and can detect objects and surfaces at distances of up to 5m.

What We Think

The inclusion of some kind of depth sensing is not a real surprise, although so far it has not been easy to implement cheaply and with low power. Intel’s RealSense system was very interesting to display makers as an add-on, but it is expensive and power-hungry. I was not convinced of the value of the technology until I saw a demonstration of a Dell tablet with the sensor built in at CES a few years ago. That convinced me that I really wanted it on my camera. Simulating the shallow depth of field of large lenses and big camera sensors has become surprisingly effective, but if you want to do it properly and cleanly, you need good depth data. And, of course, it’s critical for AR and Mixed Reality.
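
For readers who want a feel for how that depth data reaches applications, the sketch below shows roughly how an iOS app can request the Lidar-derived scene depth through ARKit's frame semantics (available from ARKit 4 on Lidar-equipped devices). It is a minimal illustration under those assumptions, not Apple's reference code; the class name and the logging are placeholders of my own.

```swift
import ARKit

// Minimal sketch: read the Lidar-derived depth map that ARKit exposes
// on devices with the sensor (2020 iPad Pro, iPhone 12 Pro).
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only offered on Lidar-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Called every frame; the depth map is a CVPixelBuffer of per-pixel
    // distances in metres, which can drive synthetic depth-of-field blur
    // or AR occlusion.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap = sceneDepth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map: \(width) x \(height) pixels")
    }
}
```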

Long-term readers will know that I regard the development and capability of sensors as a big driver for the future functions of more sophisticated displays. (BR)

The Lidar sensor on the iPad Pro