Soren Harner is Chief Product Officer of Meta, a company at the seed stage after a successful Kickstarter campaign and based in Redwood Shores in the Bay Area. He spoke about how AR can improve our interaction with technology, and he believes that big behaviour changes are coming in the use of displays: the display is a high bandwidth input to the human nervous system. (I’ve said something almost the same, many times – Man. Ed.)
In 2014, the company was first to market with an Epson Moverio fitted with a 3D scanner on top. The second product, the Meta 2, doesn’t block any part of the user’s face and will start shipping soon. The off-axis optics give a 90-degree field of view with a 2560 x 1440 display, and there is a 720P front-facing camera, which means that the view can be shared with other people, as well as dual monochrome cameras for depth tracking.
The company is trying to get rid of all of its monitors and is using its own AR displays in its own offices. There are also applications in extending vision beyond the normal visible spectrum.
To get the field of view (FOV) that is desired, the company would like to be able to get displays with 2,000 ppi. Brighter displays are also desired, to compete with ambient light. Higher contrast is also on the wish list because, if your blacks are not black, you see a ‘cloud’ view of the world. The company also doesn’t like long persistence in the display, as this causes smearing as the user moves. Finally, it is looking for very low latency display systems.
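To see why such high pixel densities are wanted, it helps to compare the Meta 2's figures with human visual acuity. A minimal back-of-the-envelope sketch follows, using the 90-degree FOV and 2560 x 1440 resolution reported above; the ~60 pixels-per-degree acuity figure is the commonly quoted one-arcminute limit of the eye, and the panel width used at the end is a hypothetical assumption, not a figure from the talk.

```python
# Figures from the article: Meta 2 offers a 90-degree FOV from a 2560 x 1440 display.
fov_deg = 90
h_pixels = 2560

# Angular resolution actually delivered across the horizontal FOV.
ppd = h_pixels / fov_deg  # roughly 28 pixels per degree

# Human visual acuity is commonly taken as ~60 pixels per degree
# (about one arcminute per pixel), so matching the eye over 90 degrees needs:
target_ppd = 60
target_h_pixels = target_ppd * fov_deg  # 5400 pixels across

# Hypothetical panel width (an assumption for illustration): on a panel
# about 2.7 inches wide, 5400 horizontal pixels works out to ~2,000 ppi,
# the density the company says it would like.
assumed_width_in = 2.7
required_ppi = target_h_pixels / assumed_width_in

print(f"Meta 2 angular resolution: {ppd:.1f} ppd")
print(f"Pixels needed for ~{target_ppd} ppd over {fov_deg} deg: {target_h_pixels}")
print(f"Implied density on a {assumed_width_in}-inch-wide panel: {required_ppi:.0f} ppi")
```

The gap between ~28 ppd delivered and ~60 ppd for eye-limited sharpness is one way to read the 2,000 ppi item on the wish list.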
All of this is intended to match the display to the human perception system at high bandwidth. Human depth and space perception also need to be exploited. However, Meta believes that subtlety is needed to avoid overwhelming the user with information and data.