Using Skinput, a person could tap their thumb and middle finger together to answer a call; touch their forearm to go to the next track on a music player; or flick the center of their palm to select a menu item. All of these sign-language-like movements, which are customizable, would control a gadget in a person's pocket through a Bluetooth connection. When fitted with a pico-projector, the Skinput system could display an image of a digital keyboard on a person's forearm. So, using Skinput, someone could send text messages by tapping his or her arm in certain places -- without pulling the phone out of a pocket or purse.
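In software terms, the control scheme described above amounts to a lookup from a recognized gesture to a device command sent over Bluetooth. Here is a minimal sketch of that idea; the gesture names, action strings, and `dispatch` function are illustrative assumptions, not Skinput's actual API.

```python
# Hypothetical sketch: map classified tap/flick gestures to phone actions.
# All names here are illustrative assumptions, not part of Skinput itself.
# In a real system, the action string would be sent to the phone via Bluetooth.

GESTURE_ACTIONS = {
    "thumb_middle_tap": "answer_call",      # tap thumb and middle finger together
    "forearm_tap": "next_track",            # touch the forearm
    "palm_center_flick": "select_menu_item" # flick the center of the palm
}

def dispatch(gesture: str) -> str:
    """Return the phone action for a recognized gesture, or 'ignore' if unknown."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(dispatch("forearm_tap"))  # next_track
```

Because the mapping is just a table, it is easy to see how the customization mentioned in the article could work: the user edits the table rather than the recognizer.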
The system, which has been under development for eight months, won't be commercially available for two to seven years. Before that can happen, Skinput's sensors need to get more accurate. In a 13-person trial in Seattle, Washington, Skinput was found to be 96 percent accurate. But that test used only five buttons; the system's accuracy would have to improve considerably before it could support a full keyboard. Skinput is one of a number of prototypes, ideas and near-products aiming to make computing more natural. These devices seek to move beyond the mouse and physical keyboard, letting people communicate with their gadgets by gesturing, using sign language or, in the case of Skinput, tapping on their hands, fingers and forearms. Who knows what's next?