Touchscreens were a revelation. But now that countless devices can tell when someone is poking them, companies are starting to experiment with smarter input methods that go beyond the binary ability to detect physical contact.
Qeexo is one of those companies, and Recode reports that it’s working on a system that differentiates between fingertips, knuckles, and other inputs. The result: Smartphones that take better advantage of the human hand’s capabilities.
The tech is called FingerSense. Right now people can use it to take screenshots by double-tapping a phone’s display, to send an email by drawing the letter “e,” and to perform other functions using their knuckles instead of their fingertips. And those are just the most basic capabilities a system like FingerSense could offer.
Those tools could make it easier to perform basic tasks — drawing a letter is faster than returning to the home screen, finding the right app, and tapping it. They could also help people who struggle with actions like taking screenshots, which normally require simultaneous button presses.
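The idea is essentially a second input layer: the touch stack classifies each contact as fingertip or knuckle, and knuckle gestures map to shortcuts. FingerSense's actual API isn't public, so the names below (`TouchEvent`, `dispatch_touch`, the gesture strings) are illustrative assumptions — a minimal sketch of how such a dispatcher might look:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    input_type: str  # "fingertip" or "knuckle", as classified by the touch stack
    gesture: str     # e.g. "tap", "double_tap", "draw_e"

# Hypothetical mapping of knuckle gestures to the shortcuts described above.
KNUCKLE_ACTIONS = {
    "double_tap": "take_screenshot",
    "draw_e": "compose_email",
}

def dispatch_touch(event: TouchEvent) -> str:
    """Route a classified touch to an action.

    Fingertip input keeps its ordinary meaning, while knuckle
    input unlocks an extra layer of shortcuts.
    """
    if event.input_type == "knuckle":
        return KNUCKLE_ACTIONS.get(event.gesture, "ignore")
    return "normal_touch"
```

The key design point is that the classifier, not the app, decides what kind of contact occurred — so existing fingertip interactions keep working unchanged while knuckles gain new meaning.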
Qeexo isn’t the only company trying to make phones more aware. Consider Microsoft’s hover gestures, which track how someone is holding a device and how their finger approaches the display to figure out what they’re trying to do.
Apple is also working on input methods that don’t require people to touch a device at all. The results could change how people interact with technology.
The human hand is an amazing thing. Learning how to embrace its capabilities, whether it’s in virtual reality or on a smartphone using FingerSense, is the first step in designing the user interfaces of the future.