Google’s latest hardware experiment combines smart textiles — in this case, braided fibers — with machine learning to create a user interface capable of distinguishing between different kinds of touch. That tech could be used for gesture-sensitive headphone cords, garment drawstrings, and other wired connections. This isn't Google's first foray into smart materials: Its longstanding Project Jacquard has continued to refine what interactive or smart clothing might one day look like.
Google published the research, which builds on the company’s previous e-textile architecture, to the Google AI Blog last week, with detailed information about how the tech works and where it could be developed next.
The device is currently being called an “I/O Braid” and it’s capable of distinguishing between six main types of discrete motion: twists, flicks, slides, pinches, grabs, and pats. All in what appears from afar to be no more than a traditional headphone cord.
The I/O Braid is still very much an experiment, but it’s already showing promising results: Google reports a gesture recognition accuracy rate of about 94 percent — quite impressive, given how new the technology is. The implications for the future of consumer tech are murky, though. It's going to take a lot more testing before this makes its way into our headphones or our homes.
The power’s in the braid — Touch-sensitive wires aren’t a new concept, but Google’s latest experiment furthers the possibilities for gesture recognition by using braided conductive strands. The result of this, which Google refers to as the Helical Sensing Matrix (HSM), allows for a much larger gesture space than a classic capacitive touch sensor.
HSM takes simple touch surfaces and forces them to interact in a 360-degree space. The intersections of the conductive yarns allow for the braid to detect motions like twisting and sliding. Google has also taken advantage of that braided design to create light-based feedback with the illusion of motion.
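Google hasn't published the details of the I/O Braid's recognition pipeline, but the general idea — reducing a frame of readings from the braid's conductive intersections to features, then mapping those features to one of a handful of discrete gestures — can be illustrated with a toy sketch. Everything below is hypothetical: the feature choices, the synthetic data, and the nearest-centroid classifier are stand-ins, not Google's method.

```python
import math

def features(frame):
    """Reduce one frame of electrode readings (one float per
    conductive-yarn intersection) to simple summary features."""
    mean = sum(frame) / len(frame)
    spread = max(frame) - min(frame)
    energy = math.sqrt(sum(x * x for x in frame) / len(frame))
    return (mean, spread, energy)

def train(examples):
    """examples: dict mapping gesture name -> list of frames.
    Returns one centroid feature vector per gesture."""
    centroids = {}
    for name, frames in examples.items():
        feats = [features(f) for f in frames]
        centroids[name] = tuple(sum(c) / len(c) for c in zip(*feats))
    return centroids

def classify(centroids, frame):
    """Label a frame with the gesture whose centroid is nearest."""
    feat = features(frame)
    return min(centroids,
               key=lambda name: math.dist(feat, centroids[name]))

# Synthetic training data: two made-up gesture signatures standing in
# for the braid's six (twist, flick, slide, pinch, grab, pat).
examples = {
    "pinch": [[0.9, 0.8, 0.1, 0.1], [0.8, 0.9, 0.2, 0.1]],
    "slide": [[0.2, 0.4, 0.6, 0.8], [0.1, 0.3, 0.7, 0.9]],
}
centroids = train(examples)
print(classify(centroids, [0.85, 0.85, 0.15, 0.1]))  # prints "pinch"
```

In practice, a system like this would classify a short time series of frames rather than a single snapshot, and would be trained on real sensor data from many participants — which is exactly the kind of data collection Google's study describes.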
Experimental for now — Don’t expect to go out and buy headphones with this fancy smart cord any time soon; this is still very much a work in progress. But the results so far are promising. The first round of testing used data from just 12 participants, and developers have already been able to program six easy-to-learn gestures.
There are practical challenges — Attaching the I/O Braid to an actual pair of consumer headphones could pose problems: we’re rarely sitting stock-still, which raises the risk of accidental touches and twists. What if you wanted to wear the headphones while exercising? Google’s initial research only really explores how to best train the braid with machine learning. Future research will need to focus more on how usable the technology would be for the average consumer.
So there’s still much testing to be done. We won’t be seeing this on store shelves for a while, assuming we ever get the chance to buy it at all. But it's a tantalizing glimpse at the touch-based UI innovations that are still to come.