A Study of Distance Estimation in Augmented Reality Display Systems

A recent article reports the results of a study of users' ability to estimate distances between real and virtual objects presented by an augmented reality display system.

To put this study into context, I think it is valid to make an analogy to a topic that may be somewhat more familiar. The colors in a display can be measured photometrically using instruments, and the measurements can be reported as RGB color coordinates. To characterize the human response to those colors, the measurements can be converted to a physiologically based color space such as CIELUV. By analogy, the issue addressed by the researchers in this article is the relationship between the measured, physical distances separating real and virtual objects in a stereoscopic augmented reality image and those distances as perceived by users.
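The color-space side of the analogy can be made concrete. Below is a minimal sketch of the standard RGB-to-CIELUV conversion (linear sRGB assumed, D65 white point); it is an illustration of the general idea, not a description of any procedure used in the paper, and a real photometric workflow would depend on the display's calibration.

```python
def rgb_to_luv(r, g, b):
    """Convert linear sRGB components (0..1) to CIE L*u*v* coordinates."""
    # Linear sRGB -> CIE XYZ (sRGB primaries, D65 white point)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    # D65 reference white
    xn, yn, zn = 0.95047, 1.0, 1.08883

    # Lightness L* from the luminance ratio Y/Yn
    t = y / yn
    if t > (6 / 29) ** 3:
        l_star = 116 * t ** (1 / 3) - 16
    else:
        l_star = (29 / 3) ** 3 * t

    # Chromaticity coordinates u', v' for the stimulus and the white point
    denom = x + 15 * y + 3 * z
    denom_n = xn + 15 * yn + 3 * zn
    u_prime = 4 * x / denom if denom else 0.0
    v_prime = 9 * y / denom if denom else 0.0
    un_prime = 4 * xn / denom_n
    vn_prime = 9 * yn / denom_n

    u_star = 13 * l_star * (u_prime - un_prime)
    v_star = 13 * l_star * (v_prime - vn_prime)
    return l_star, u_star, v_star
```

The point of the conversion, as with the distance study, is that the physically measured quantity (RGB, or centimeters) is not the same as the perceptually meaningful one (L*u*v*, or perceived separation).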

The research team is led by Chiuhsiang Joe Lin of the Department of Industrial Management at National Taiwan University of Science and Technology (Taipei, Taiwan). The article is entitled "Distance estimation with mixed real and virtual targets in stereoscopic displays." It is to be published in Displays 36 (2015) 41-48 and is available for purchase on-line.

In this paper, the researchers investigated the accuracy of center-to-center distance perception for augmented reality visual targets viewed through stereoscopic glasses. In their experiments, one real and one virtual target were presented in various configurations within a controlled near-field space. Distances were determined by "perceptual matching by sketching."

The results revealed, perhaps unsurprisingly, that the three parameters with the dominant effects were layout, parallax, and center-to-center distance. In addition, the effects of the three parameters were found to be interrelated. The summary result was that viewers underestimated the distance between targets overall, with an accuracy of about 84%.
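To give a rough numerical reading of that figure: if accuracy is taken as the ratio of perceived to physical separation (my interpretation for illustration; the paper's exact metric may differ), an 84% accuracy under uniform underestimation would mean, for example, that a 20 cm physical separation is perceived as roughly 16.8 cm.

```python
def perceived_distance(physical_cm, accuracy=0.84):
    """Hypothetical perceived separation under uniform ~84% underestimation.

    This is an illustrative reading of the reported accuracy figure, not a
    model from the paper.
    """
    return physical_cm * accuracy
```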

Further details of the results include the following. “Vertical orientation was relatively accurate compared to horizontal, especially when the virtual target is presented at the lower half of the field. In the negative parallax, the accuracy was better for 10 cm from the screen than the 5 cm from the screen and on the screen conditions. This means as the targets presented closer to the observer, the accuracy improved which is consistent with the depth perception in egocentric peripersonal space. It can also be observed that the accuracy improves, as the separation of targets decreases, with the smallest (i.e., 10-20 cm) distance provides accurate judgment.”

Perhaps the most general result of this work is the observation that, if accuracy of distance perception is important, the separation between real and virtual objects should be kept as small as possible and the objects should be placed as near to the observer as possible.

The importance of the results derives from the fact that they can be used to help decide where to place objects in a scene, based on how important accurate judgment of their relative positions is.

The current study was limited to near-field objects, that is, objects within a distance of 100 cm. Further study will be required to address the accuracy problem in the "extrapersonal environment." More generally, further studies are needed to explore why greater inaccuracy is apparent in virtual and augmented environments than in real-world perception. – Arthur Berman

National Taiwan University of Science and Technology, Chiuhsiang Joe Lin, 886-2-2737-6352, [email protected]