This came today as part of the ACM news. I wish I had the skills to do something like this. Sadly, the logic of programming and time are against me.
A combination of simple bio-acoustic sensors and some sophisticated machine learning makes it possible for people to use their fingers or forearms — and potentially, any part of their bodies — as touchpads to control smart phones or other mobile devices.
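The core idea above — inferring where a tap landed from the acoustic signature it produces in the body — can be sketched with a toy classifier. This is purely illustrative, not the Skinput implementation: the signal model, feature names, and nearest-centroid approach are all assumptions for demonstration.

```python
# Illustrative sketch (NOT the actual Skinput pipeline): classify
# simulated bio-acoustic tap signals by location using simple
# amplitude/frequency features and a nearest-centroid classifier.
import math
import random

def features(signal):
    """Simple features: mean energy, peak amplitude, zero-crossing rate."""
    energy = sum(s * s for s in signal) / len(signal)
    peak = max(abs(s) for s in signal)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (energy, peak, zcr)

def synth_tap(freq, amp, n=256, noise=0.05, rng=random):
    """Hypothetical signal model: a decaying sinusoid plus noise,
    standing in for the vibration a tap sends through the skin."""
    return [amp * math.exp(-3 * i / n) * math.sin(2 * math.pi * freq * i / n)
            + rng.uniform(-noise, noise) for i in range(n)]

def train(examples):
    """examples: list of (label, signal); returns per-label feature centroids."""
    sums, counts = {}, {}
    for label, sig in examples:
        f = features(sig)
        s = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in s) for lab, s in sums.items()}

def classify(centroids, signal):
    """Assign the label whose feature centroid is nearest to the signal."""
    f = features(signal)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(f, centroids[lab])))

rng = random.Random(0)
# Two hypothetical tap locations with distinct acoustic signatures.
train_data = ([("finger", synth_tap(8, 1.0, rng=rng)) for _ in range(10)]
              + [("forearm", synth_tap(3, 0.5, rng=rng)) for _ in range(10)])
centroids = train(train_data)
print(classify(centroids, synth_tap(8, 1.0, rng=rng)))  # expected: finger
```

The real system uses arrays of bio-acoustic sensors and more sophisticated machine learning than this, but the shape of the problem is the same: turn a vibration into features, then map features to a body location.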
The technology, called Skinput, was developed by Chris Harrison, a third-year Ph.D. student in Carnegie Mellon University’s Human-Computer Interaction Institute (HCII), along with Desney Tan and Dan Morris of Microsoft Research. Harrison will describe the technology in a paper to be presented on Monday, April 12, at CHI 2010, the Association for Computing Machinery’s annual Conference on Human Factors in Computing Systems in Atlanta, Ga.
The full article is available at CMU.edu.
Harrison also has a website about the project. He says:
Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, proprioception (our sense of how our body is configured in three-dimensional space) allows us to accurately interact with our bodies in an eyes-free manner. For example, we can readily flick each of our fingers, touch the tip of our nose, and clap our hands together without visual assistance. Few external input devices can claim this accurate, eyes-free input characteristic and provide such a large interaction area.
I look forward to reading the full text.