
CES & Eye Tracking

I wrote in Display Monitor recently about this being the time for a changeover from PCs being at the centre of our technology worlds to mobile devices dominating. This will be a continuing challenge for monitor makers, but the business won’t go away. Our unofficial company motto is “Don’t forget the people” and the ergonomic implications of tablet use are important.

Tablets are great for some applications, but the ergonomics are absolutely horrible for long-term use and content creation. For those who use PCs as content creation devices, a keyboard is needed, along with a big, high-resolution screen placed close to the point of minimum visual effort – typically around 0.5m to 0.8m away. That’s too far for the big ‘hot topic’ driven by Windows 8, touch. That has given me a lot to think about over the last couple of years. Our research clients are always asking about touch on monitors. I have been sceptical about the success of touch on the desktop and I couldn’t quite see how gestures would work in the office.

At CES, I tried the Tobii eye tracking solution and have ‘seen the light’, if you’ll pardon the pun. We’ve been keeping an eye (if you’ll excuse another pun) on the technology since I first tried it in March 2007 at CeBIT, when the company was in a ‘technology park’ area. I was impressed then, but I was really, really impressed after the CES demonstration. After just a few seconds of calibration and use, I found it easy and very natural to use the system to select objects, change views and zoom and scroll content. This is, surely, the future of interaction with desktop monitors?

The key to the interaction is to use the eyes to get an approximate gaze position on the screen and then use the keyboard, a trackpad or some other input to generate an action. The Windows 8 tiles seem to work very well indeed in allowing this kind of operation (although look at this devastating critique on the ergonomics of Windows 8). I can imagine, actually, that the ideal combination is eye tracking with foot operation. This would be very similar to driving a car, where the hands steer in the direction set by where our eyes look, while speed is controlled through the feet. I really want to drive my PC!
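To make that division of labour concrete, here is a minimal sketch of the ‘eyes point, hands confirm’ idea. The tile layout, gaze coordinates and confirm key are all hypothetical – a real system would get them from the eye tracker’s SDK and the operating system – but the structure is the point: gaze only ever selects, and a deliberate second input triggers the action.

```python
# Minimal sketch of "eyes point, hands confirm" interaction.
# Tiles, gaze coordinates and the confirm key are hypothetical;
# a real system would read them from an eye tracker SDK and the OS.

from dataclasses import dataclass


@dataclass
class Tile:
    name: str
    x: int   # top-left corner, in pixels
    y: int
    w: int
    h: int

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


def tile_under_gaze(tiles, gaze_x, gaze_y):
    """Return the tile the (approximate) gaze point falls on, if any."""
    for tile in tiles:
        if tile.contains(gaze_x, gaze_y):
            return tile
    return None


def handle_input(tiles, gaze_x, gaze_y, confirm_pressed):
    """Gaze only highlights a target; a separate input activates it."""
    target = tile_under_gaze(tiles, gaze_x, gaze_y)
    if target is None:
        return "no target under gaze"
    if confirm_pressed:
        return f"activate {target.name}"
    return f"highlight {target.name}"


if __name__ == "__main__":
    start_screen = [
        Tile("Mail", 0, 0, 250, 120),
        Tile("Calendar", 260, 0, 250, 120),
        Tile("Photos", 0, 130, 250, 250),
    ]
    # Looking at the Calendar tile without pressing the confirm key...
    print(handle_input(start_screen, 300, 60, confirm_pressed=False))
    # ...and then pressing it to trigger the action.
    print(handle_input(start_screen, 300, 60, confirm_pressed=True))
```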

However, one of the great points about driving a car is that the basic controls of accelerator, brakes and steering are standardised. The standards for these controls evolved during the 1920s with the mass adoption of cars. There was no legislation, nor a standard from a standards body – the standard evolved from the industry. For eye tracking to take over in monitors, as I think it should, standards for the other elements of control need to emerge. I suspect that this will be based on evolution in the industry and the emergence of a de facto standard rather than a formalised process.

Another big item for me this year at CES was the new Power Delivery standard that is being finalised for USB. I started writing about USB in March 1995, when it was announced at the Windows Hardware Engineering Conference (WinHEC). USB has been unbelievably successful and, from its original purpose of connecting peripherals and telephony devices, it is becoming a significant standard for power delivery – in mobile phone chargers, in mains adaptors for the wall and even in supplying power in aircraft (the Virgin flight I took to CES had USB outlets that I used to power my MP3 player). The next development is to allow up to 100W for notebook/tablet powering and charging.
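For a sense of scale, here is a back-of-the-envelope sketch of where that 100W figure comes from. The voltage and current pairs are my reading of the Power Delivery levels (5V, 12V and 20V rails, up to 5A at the top end) and should be treated as illustrative rather than a definitive list.

```python
# Back-of-the-envelope view of USB Power Delivery power levels.
# The voltage/current pairs are illustrative, based on my reading of the
# spec (5V, 12V and 20V rails, up to 5A); they are not a definitive list.

PD_LEVELS = [
    (5.0, 2.0),    # roughly what a phone charger delivers today
    (12.0, 3.0),   # larger peripherals
    (20.0, 3.0),   # ultrabook-class charging
    (20.0, 5.0),   # the new ceiling: notebook/tablet powering
]

for volts, amps in PD_LEVELS:
    watts = volts * amps  # P = V * I
    print(f"{volts:>4.0f} V x {amps:.0f} A = {watts:>5.0f} W")
```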

So, I see the near future of the desktop monitor as the power source and dock for a tablet, connected via a USB cable and perhaps with the graphics going across USB 3.0. That would be good news for DisplayLink!