Apple is adding a new dimension to the iPhone. According to a recent patent awarded to the Cupertino natives, Apple has developed a non-contact user interface that lets you interact with your phone's touchscreen without actually touching it.
Currently, phones use proximity sensors to determine when you don't mean to touch the screen, such as when your cheek presses against it during a phone call. That feature requires only a single proximity sensor near the top of the phone. A complete non-contact interface deploys similar sensors throughout the display, on a "per-pixel" basis, to get a close reading of any nearby movement.
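To make the idea concrete, here is a minimal sketch, not from the patent itself, of what a "per-pixel" proximity grid might look like: each cell holds the measured distance to whatever is hovering above it, and finding the hover point is just a nearest-reading scan. All names and the data layout are hypothetical.

```python
from typing import List, Optional, Tuple

# Each cell holds the distance (in cm) to the nearest object above it,
# or None if nothing is in range of that pixel's sensor.
Grid = List[List[Optional[float]]]

def hover_point(grid: Grid) -> Optional[Tuple[int, int]]:
    """Return the (row, col) of the closest detected object, if any."""
    best = None
    best_dist = float("inf")
    for r, row in enumerate(grid):
        for c, dist in enumerate(row):
            if dist is not None and dist < best_dist:
                best_dist = dist
                best = (r, c)
    return best

# A finger hovering closest over row 1, column 1:
readings = [
    [None, 4.0, None],
    [None, 1.5, 2.0],
    [None, None, None],
]
```

A real interface would of course track the point over time and smooth out noise, but the core input is this dense grid of distance readings rather than a single sensor at the top of the phone.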
Whether it's flipping through cable channels from your couch without needing a remote, or browsing the internet while you're cooking and don't want to dirty the screen, proximity interfaces could be the wave of the future. Apple was able to file this patent thanks to technology it picked up when it purchased an Israeli company called PrimeSense in 2013.
The sensors work by emitting tiny laser pulses that bounce off anything nearby. The screen then picks up the reflected light and determines the location and velocity of the nearby object. The technology could work over longer distances, too.
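The underlying arithmetic is the standard time-of-flight idea: distance follows from how long the reflected pulse takes to come back, and velocity from the change in distance between successive readings. A minimal sketch, with hypothetical function names, assuming this is the ranging method the patent relies on:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """The pulse travels out and back, so halve the round-trip distance."""
    return SPEED_OF_LIGHT * seconds / 2.0

def velocity_from_readings(d_prev: float, d_curr: float, dt: float) -> float:
    """Positive velocity means the object is moving away from the screen."""
    return (d_curr - d_prev) / dt

# A pulse that returns after roughly 0.67 nanoseconds reflects off
# something about 10 cm above the display.
d = distance_from_round_trip(0.67e-9)
```

The sub-nanosecond timescales involved are why this takes dedicated sensor hardware rather than ordinary touchscreen circuitry.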
These new user interfaces could change more than just iPhones.
The keyboard-and-touchpad arrangement has long held back computer gaming. A non-touch interface could multiply the number of commands a user can input, possibly allowing desktop gamers to finally catch up to joystick-wielding console players in multiplayer. No more WASD just to walk around; simply hover over your non-touchpad to dodge a headshot.
Sure, keyboards have become thinner and sleeker, but they're still just about the bulkiest and most old-fashioned part of any computer. They get dirty and jam, and you're stuck with a single set of keys for the life of the machine. A touchscreen would go a long way, but a non-touchscreen would extend the keyboard's lifespan and keep it clean at the same time.
The firm PrimeSense was best known for its 3-D sensing technology, which allows a user to transform a 2-D picture into a 3-D digital world. We could imagine this technology complementing the current model of virtual reality: a proximity sensor reads your every body motion and adjusts the images in your VR headset accordingly.
The non-touch technology does seem a little far in the future for the iPhone, since the 3D Touch feature introduced on the iPhone 6s has yet to find developers willing to take advantage of it. So maybe Apple is just doing its best to patent the future before it becomes a (virtual) reality.