Gesture User Interface Design Advances

The use of gesture user interfaces was discussed by several speakers at TGM 2014. Presenters from AMS and XYZ Interactive addressed the hardware and software necessary to implement gesture user input systems.

For its part, AMS develops and manufactures proximity sensors as well as 2D and 3D gesture-sensing modules based on light emitters and both infrared and visible-light sensors. An AMS 2D gesture sensor module is used in the Samsung Galaxy S5 smartphone to implement its gesture input mode. The AMS presenter touched briefly on the design of a gesture user interface, including the need to select the proper sensor and to develop an appropriate gesture vocabulary for each application. AMS views its 3D gesture sensor offering as an enabler of new user input sensing applications in smartphones, tablets, laptops, and monitors that would make use of multi-touch, multi-hand, and z-depth sensing.
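
As a rough illustration of what a small gesture vocabulary might look like in software, the sketch below classifies left/right and up/down swipes from a hypothetical four-channel 2D gesture sensor by comparing when each channel's reflectance peaks. The channel names, data format, and sample values are assumptions made for illustration and do not reflect AMS's actual parts or API.

```python
# Minimal sketch of a gesture "vocabulary" for a hypothetical four-channel
# 2D gesture sensor (north/south/east/west photodiodes). The channel that
# peaks first versus last indicates the swipe direction. The sensor model
# and names here are illustrative assumptions, not an actual AMS interface.

from typing import Dict, List

# Opposing channel pairs and the gesture they imply when crossed in order.
GESTURE_VOCABULARY: Dict[tuple, str] = {
    ("west", "east"): "swipe_right",
    ("east", "west"): "swipe_left",
    ("north", "south"): "swipe_down",
    ("south", "north"): "swipe_up",
}

def peak_time(samples: List[int]) -> int:
    """Index of the maximum reading on one channel."""
    return max(range(len(samples)), key=lambda i: samples[i])

def classify_swipe(channels: Dict[str, List[int]]) -> str:
    """Classify a swipe from per-channel reflectance samples.

    As the hand passes over the sensor, the channel it covers first peaks
    earlier than the opposing channel it covers last.
    """
    peaks = {name: peak_time(samples) for name, samples in channels.items()}
    first = min(peaks, key=peaks.get)
    last = max(peaks, key=peaks.get)
    return GESTURE_VOCABULARY.get((first, last), "unknown")

# Example: the hand enters from the left, so "west" peaks before "east".
readings = {
    "west":  [3, 14, 9, 4, 1],
    "east":  [1, 3, 7, 13, 8],
    "north": [1, 4, 6, 5, 2],
    "south": [1, 4, 6, 5, 2],
}
print(classify_swipe(readings))  # -> "swipe_right"
```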

XYZ Interactive has developed software that the firm combines with about $1 of common, mass-produced electronic components to implement a 3D sensing subsystem, which the company describes as “add(ing) touchless and gesture control to your product for the cost of a regular cup of coffee”. XYZ is pursuing a licensing business model for its branded gesturesense and beaconsense technologies, and it foresees the technology being used in wearables, drones, toys, lighting and appliance controls, digital signage, and robotics applications.
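
XYZ did not disclose how its subsystem works, but the general idea of touchless sensing with a dollar's worth of parts can be sketched as follows: one modulated IR LED and a few photodiodes, with lateral position taken from the balance between opposing diodes and approximate depth from total reflected intensity. Everything in this sketch, including the diode arrangement, units, and constants, is an assumption for illustration rather than XYZ's actual method.

```python
# Illustrative sketch (not XYZ Interactive's actual method) of how a handful
# of low-cost parts could yield touchless 3D input: one modulated IR LED plus
# four photodiodes. Lateral position comes from the balance between the
# diodes; approximate depth comes from total reflected intensity.

import math

def estimate_position(left: float, right: float, top: float,
                      bottom: float, noise_floor: float = 5.0):
    """Estimate (x, y, z) of a hand above the sensor board.

    left/right/top/bottom are background-subtracted photodiode amplitudes.
    Returns None when no hand is present. Units and constants are arbitrary
    assumptions for this sketch.
    """
    total = left + right + top + bottom
    if total < noise_floor:
        return None  # nothing reflective in range

    # Lateral position: normalized balance between opposing diodes (-1..1).
    x = (right - left) / total
    y = (top - bottom) / total

    # Depth: reflected intensity falls off roughly with distance squared,
    # so a crude range estimate is proportional to 1/sqrt(total).
    z = 1.0 / math.sqrt(total)
    return (x, y, z)

# Hand slightly left of center, mid-range height (made-up readings).
print(estimate_position(left=40.0, right=10.0, top=25.0, bottom=25.0))
```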

A third firm presenting at TGM 2014, Gestigon, approached gesture user interfaces from the human-machine interaction (HMI) and interface design standpoints, starting from “The Art of Designing Gesture Sets”. After discussing the elements of gesture set design, the presenter identified issues with today’s context menus that make them unsuitable, if not impossible, to use with gestures. He then followed with a discussion and illustrations of Gestigon’s CORAL Context Menu Concept and CORONA Gesture Tutorial Concept.

Additional features of the CORAL Context Menu Concept gesture interface include large semi-transparent overlays, “bouncing edges”, and animations that reflect interaction status.
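
To make the interaction-status idea concrete, here is one possible reading of a gesture-driven context menu, sketched in Python: items are selected by dwelling on them, and moving past the menu edge triggers a “bounce” cue rather than a selection. This is an interpretation offered only for illustration, not Gestigon’s CORAL implementation; all class and method names are hypothetical.

```python
# A minimal sketch of one way a gesture-driven context menu could signal
# interaction status, loosely inspired by the "bouncing edges" idea in the
# talk. This is an interpretation for illustration, not Gestigon's CORAL
# implementation: items are selected by dwelling on them, and moving past
# the last item triggers a bounce animation instead of a selection.

import time

class GestureMenu:
    def __init__(self, items, dwell_seconds=0.8):
        self.items = items
        self.dwell_seconds = dwell_seconds
        self._hover_index = None
        self._hover_start = 0.0

    def update(self, pointer_index: int):
        """Feed the menu slot currently under the tracked hand.

        Returns "bounce" if the hand moved past the menu edge, the selected
        item after an uninterrupted dwell, or None while still hovering.
        """
        if pointer_index < 0 or pointer_index >= len(self.items):
            self._hover_index = None
            return "bounce"  # edge overshoot: animate a bounce, select nothing
        if pointer_index != self._hover_index:
            self._hover_index = pointer_index  # restart the dwell timer
            self._hover_start = time.monotonic()
            return None
        if time.monotonic() - self._hover_start >= self.dwell_seconds:
            return self.items[pointer_index]
        return None

menu = GestureMenu(["copy", "paste", "delete"])
print(menu.update(1))   # start hovering "paste" -> None
time.sleep(0.9)
print(menu.update(1))   # dwell elapsed -> "paste"
print(menu.update(5))   # past the edge -> "bounce"
```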

The Gestigon speaker described and illustrated additional gesture user interface modalities, making a good case that careful design is needed to achieve a usable and satisfying gesture-driven interface. As these three speakers from AMS, XYZ Interactive and Gestigon reported, gesture user interfaces have a way to go before they are widely adopted, but the technologies and design efforts required for their implementation are falling into place. – Phil Wright