Rockwell Collins showed the Coalescence system that Chris saw at the I/ITSEC show (Rockwell Collins Shows Multiple AR/MR Demos). This is a mixed reality headset made from an Oculus Rift with a camera for each eye (although any headset can be used). The user sits in a blue-screen area with physical controls. In the headset, the physical controls and the user’s hands and arms are shown superimposed on a virtual scene. That means the haptic feel of operating the controls is combined with the visual stimuli, which is better for training. Further, it makes it easy for a trainer to physically demonstrate correct technique.
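To make the idea concrete, here is a minimal sketch of generic blue-screen keying (my own illustration, not Rockwell Collins’ actual pipeline) showing how a per-eye camera frame can be composited over a rendered scene: pixels where blue clearly dominates are treated as background and replaced by the virtual imagery, so the hands, arms and physical controls pass through as video.

```python
import numpy as np

def composite_blue_screen(camera, virtual, blue_margin=40):
    """Composite a per-eye camera frame over a rendered virtual frame.
    Both inputs are H x W x 3 uint8 RGB arrays of the same shape."""
    cam = camera.astype(np.int16)          # avoid uint8 wrap-around in the comparison
    r, g, b = cam[..., 0], cam[..., 1], cam[..., 2]
    # Treat a pixel as blue screen when blue clearly dominates red and green.
    is_background = (b > r + blue_margin) & (b > g + blue_margin)
    out = camera.copy()
    out[is_background] = virtual[is_background]   # show the virtual scene there
    return out                                    # hands and controls stay as video
```

A real system would do this per eye at the display refresh rate, with soft matte edges rather than a hard threshold, but the principle is the same.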
Resolution is a challenge: current headsets deliver the equivalent of just 20/70 vision, whereas Level D simulation requires 20/40, so much higher-resolution headsets are needed.
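As a rough rule of thumb (my arithmetic, not a figure quoted by the company), 20/20 vision corresponds to resolving about one arcminute, or roughly 60 pixels per degree, so the Snellen denominators translate directly into angular-resolution targets:

```python
# Rough conversion from Snellen acuity to the headset resolution needed to
# match it, assuming 20/20 resolves ~1 arcminute, i.e. ~60 pixels per degree (per axis).
def pixels_per_degree(snellen_denominator):
    return 60 * 20 / snellen_denominator

print(pixels_per_degree(70))  # ~17 ppd: roughly where current headsets sit
print(pixels_per_degree(40))  # 30 ppd: the Level D target
```

Going from about 17 to 30 pixels per degree on each axis means roughly three times as many pixels overall for the same field of view, which is why much higher-resolution panels are needed.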
The system needs a blue- (or green-) screen dome to be assembled, but this is fairly simple. The footprint is small, so it can be used where space is limited, for example on board ships.
As well as supporting daytime operation, the use of a VR headset makes night operations easier to replicate.
Analyst Comment
I was surprised at how good the mixed reality effect was. There was a small amount of latency, but not an objectionable amount, and it was fast enough for the early applications, which are maritime. The system can merge the virtual and video data in software, which means a latency of 30+ ms, or a hardware combiner can be used that reduces this to 25 ms, but costs more.
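For context (my own back-of-the-envelope numbers, not the company’s), those latencies can be expressed in display refresh periods, assuming a Rift-class 90 Hz headset:

```python
# Back-of-the-envelope: the quoted video-merge latencies expressed in refresh
# periods, assuming a 90 Hz display (my assumption, not a quoted spec).
REFRESH_HZ = 90
frame_ms = 1000 / REFRESH_HZ   # ~11.1 ms per displayed frame

for label, latency_ms in [("software merge", 30), ("hardware combiner", 25)]:
    print(f"{label}: {latency_ms} ms ~ {latency_ms / frame_ms:.1f} frames of lag")
```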
However, the position of the physical controls shifted in the virtual environment as I moved my head, which was disconcerting and interrupted the ‘suspension of disbelief’ that is needed. The company told us that this is because the cameras used are relatively deep, which places the capture point well in front of where the eyes would be. The next version will probably use mirrors on the surface of the Rift, together with differently mounted cameras, to reduce the effect, although, of course, this will still not put the camera capture point exactly where the eye would be.
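A rough sketch of the geometry (hypothetical numbers for illustration, not measurements of the Coalescence rig) shows why cameras sitting well forward of the eyes make nearby video imagery swim against the virtual scene: the camera views each control from a closer point than the eye would, and that angular error changes as the head moves.

```python
import math

# Hypothetical numbers for illustration only: the angular misregistration of a
# nearby object when it is captured by a camera mounted some distance in front
# of the eye rather than at the eye itself.
def direction_error_deg(lateral_offset_m, distance_m, camera_forward_m):
    from_eye = math.atan2(lateral_offset_m, distance_m)
    from_camera = math.atan2(lateral_offset_m, distance_m - camera_forward_m)
    return math.degrees(from_camera - from_eye)

# A control 15 cm off-axis at 50 cm, with cameras ~6 cm in front of the eyes:
print(direction_error_deg(0.15, 0.50, 0.06))  # ~2 degrees of misregistration
```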