Multitouch user interfaces have gained considerable attention with the début of the Apple iPhone and Microsoft Surface products. Although multitouch screens may well prove to be an even more versatile type of input device than keypads, they are not without problems. This is in part due to the fact that the user’s fingers are as much a hindrance as they are a tool. The more often the user touches the screen, the more often the content is occluded. The problem gets worse as the touch screen gets smaller since the user’s fingers get larger in proportion to the size of the content.
Insight Media Consultant
Microsoft, Mitsubishi, and the University of Toronto are collaborating on a solution to this problem that they have named LucidTouch. It is a mobile platform with a double-sided touch screen system. Users can interact with the 7", 800×480 pixel screen of the prototype by the conventional means of the front surface through a resistive touch screen. In addition, users can choose to manipulate objects by touching a sensor pad on the back of the device.
Microsoft has not indicated what product applications might use the new technology. Nonetheless, its form factor suggests it could be used in an ultra-mobile PC or PDA. Photos of a prototype appearing on a Microsoft researcher’s Web site included those of the device being used as a GPS unit and a gaming platform.
The photos also suggest that some prototypes are being co-developed by Hewlett-Packard’s Compaq division, which has worked with Microsoft in the past on Tablet PCs.
It is not apparent when LucidTouch will hit the mainstream market. However, if and when it does, it will likely appear in larger tablet PCs and be most useful to users who rely heavily on screen real estate.
The key feature of the new interface is what the researchers call "pseudo-transparency". It creates an on-screen silhouette of the fingers wrapped around the back of the device. In this way the user can see the location of their fingers behind the display, as if the display were transparent. This eases navigation, target acquisition, and content manipulation. With the prototype, users are able to type text, resize images, navigate any graphical user interface, click hyperlinks, and negotiate maps from both the front and back of the portable device.
In addition to the panel itself, the designers have implemented software to assist the user in operating the touch system. This includes a dragging system that allows the user to "pass off" an object from one finger to another, and a customized keyboard layout that permits touch-typing at a new and somewhat awkward angle. Additionally, the software adds pointers to the finger silhouettes so that the user can precisely select targets on the touch pad that might be smaller than a fingertip.
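The pointer idea can be sketched as offsetting the reported touch point to a precise cursor and hit-testing it against on-screen targets. This is an illustrative sketch only; the names, the fixed offset value, and the rectangle hit-test are assumptions, not the researchers' published implementation:

```python
from dataclasses import dataclass

@dataclass
class Target:
    """An axis-aligned rectangular on-screen target (hypothetical model)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def cursor_from_touch(touch_x: float, touch_y: float,
                      offset: tuple = (0.0, -12.0)) -> tuple:
    """Shift the touch centroid by a fixed offset so the cursor
    sits just beyond the fingertip rather than under it."""
    return touch_x + offset[0], touch_y + offset[1]

def hit_test(targets, cx: float, cy: float):
    """Return the first target under the cursor, or None."""
    for t in targets:
        if t.contains(cx, cy):
            return t
    return None
```

With such an offset, a target smaller than the finger pad can still be selected, because the cursor point, not the whole contact area, determines the hit.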
Even with only an early prototype available, the researchers were able to write applications and gather responses from a small group of users. For example, most preferred to type on a QWERTY keypad using the front of the screen. Half of the participants preferred using the back of the device for tasks such as dragging objects and navigating maps. The users were also divided on whether the superimposed images of their fingers were helpful: two-thirds of the participants preferred the superimposed images when using the keyboard and dragging objects, and half preferred them while using the map.
These results suggest that a user’s preference for LucidTouch and pseudo-transparency depends on the application.
LucidTouch is currently limited by technology. In order to display the user's hands and fingers in a semi-translucent fashion, the prototype needs a "boom cam" mounted on a small arm attached to the back of the device. The camera sends its data over USB to a desktop computer, which does all of the image processing: the software subtracts the background from the image of the hands and mirrors it so that the superimposed silhouette lines up with the actual position of the user's hands.
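The subtract-and-mirror step can be sketched in a few lines of NumPy. This is a minimal illustration of the general technique the article describes, assuming a grayscale camera frame and a simple fixed threshold; the function name and threshold value are hypothetical:

```python
import numpy as np

def hand_silhouette(frame: np.ndarray, background: np.ndarray,
                    threshold: int = 30) -> np.ndarray:
    """Isolate the hands by subtracting a stored background frame,
    then mirror horizontally so the overlay aligns with the user's
    fingers as seen from the front of the screen."""
    # Widen to int16 so the subtraction cannot wrap around in uint8.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = (diff > threshold).astype(np.uint8)  # 1 where the hands are
    return mask[:, ::-1]  # flip left-right: rear-camera view -> front view
```

The horizontal flip is the crucial detail: a rear-facing camera sees the hands mirrored relative to the user's own viewpoint, so the silhouette must be reversed before it is composited onto the display.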
The prototype is thus tethered to a PC, limiting the portability of the device. Future iterations could use surface-based sensors to detect and show the finger. The sensing could, in principle, be done with technologies embedded in the back of the LCD, such as with a capacitive array or with an array of LEDs to both illuminate and sense, thus allowing interaction with on-screen objects.
The mouse and its variations have been the primary input devices for many years. The results with LucidTouch suggest that it, and perhaps other innovative input devices, may be a better solution for some devices and applications. If that is indeed the case, then double-sided multitouch may become prevalent in commercial products.
To learn more, a research paper entitled "LucidTouch: A See-Through Mobile Device" will be presented at the User Interface Software and Technology symposium in Newport, RI, from October 7-10. A video illustrating and discussing the use of LucidTouch has been posted on YouTube at the following URL: http://www.youtube.com/watch?v=RsNFZAEssPQ