A research project at the University of Washington, called SideSwipe, demonstrates a way to control smartphones with hand gestures, without adding power-hungry components.
Some of today's smartphones use cameras for 3D gesture sensing, but cameras require a clear view of the user's hands and consume significant battery power. SideSwipe instead detects the interference that hand movements cause in the radio signals transmitted to and from the phone.
The researchers developed a sensor that uses reflections of the phone's own wireless transmissions to sense gestures, allowing users to control a phone even when it is not being held or is out of sight. Multiple small antennas capture the changes in the reflected signal, and a classifier maps those changes to the gesture performed.
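The classification step can be pictured with a minimal sketch: per-antenna amplitude windows are reduced to simple features and matched against trained examples. The feature choice (mean and variance per antenna) and the nearest-neighbour classifier here are illustrative assumptions, not the actual SideSwipe pipeline.

```python
import math

def extract_features(windows):
    """windows: one list of amplitude samples per antenna.
    Returns a flat feature vector: mean and variance for each antenna."""
    feats = []
    for samples in windows:
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        feats.extend([mean, var])
    return feats

def classify(training, windows):
    """training: list of (feature_vector, gesture_label) pairs.
    Returns the label of the nearest training example (1-NN, Euclidean)."""
    query = extract_features(windows)
    def dist(feats):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(feats, query)))
    return min(training, key=lambda t: dist(t[0]))[1]

# Toy training data: two antennas, two hypothetical gestures. A steady
# reflection reads as "hover"; a fluctuating one reads as "swipe".
training = [
    (extract_features([[0.1, 0.1, 0.1], [0.9, 0.9, 0.9]]), "hover"),
    (extract_features([[0.1, 0.9, 0.1], [0.9, 0.1, 0.9]]), "swipe"),
]
print(classify(training, [[0.12, 0.11, 0.10], [0.88, 0.90, 0.91]]))  # → hover
```

In practice a system like this would use far richer features and more training examples per user, but the shape of the problem is the same: signal windows in, gesture label out.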
In a lab test using a Samsung Nexus S, the system was trialled by 10 people, each performing 14 gestures. The phone learned each user's hand movements and trained itself to respond, recognising gestures correctly 87% of the time. The tests were performed while the phone was making a call, as that was the only way to guarantee a constant GSM signal; however, paper co-author Chen Zhao said that an app running in the background could perform much the same function.