In a long-anticipated announcement, Google has invited the public to “Come see a few new things Made by Google” on October 15. The “new things” in question are rumored to include the Pixel 4 and Pixel 4 XL phones, as well as a Pixelbook 2 laptop. The event, which will be live-streamed from New York, is also expected to host another long-awaited debut.
Expected to be in attendance at Google’s annual October event is Project Soli, a radar chip for touch-free motion control that Google originally teased at I/O in 2015. As Google described it then: “The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale and built into small devices and everyday objects.”
You might even say it’s _finally_ going to be here. Let’s dig into what this cool piece of innovative tech can do, shall we?
The 8mm x 10mm, low-energy chip is the brainchild of Google’s personal technology incubator, Advanced Technology and Projects (ATAP). It uses a broad beam of electromagnetic waves to sense and capture different parameters of a person’s gesture, such as location, size and velocity, based on how those waves are reflected back. Yeah.
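To make the reflected-wave idea a little more concrete, here’s a minimal sketch (not Google’s implementation; the 60 GHz carrier is simply a frequency in the band Soli operates in) of the standard radar relationship between the Doppler shift of a returned signal and the radial velocity of whatever reflected it:

```python
# Illustrative sketch only: how a radar sensor can infer a hand's
# velocity from the Doppler shift of reflected electromagnetic waves.
# This is textbook radar physics, not Google's actual signal pipeline.

C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # assumed 60 GHz carrier (Soli's band)

def radial_velocity(doppler_shift_hz: float, carrier_hz: float = F_CARRIER) -> float:
    """Radial velocity of a reflector, given the measured Doppler shift.

    For a radar, the round trip doubles the shift:
        delta_f = 2 * v * f0 / c   =>   v = delta_f * c / (2 * f0)
    """
    return doppler_shift_hz * C / (2 * carrier_hz)

# A hand moving toward the sensor at 0.5 m/s shifts a 60 GHz carrier
# by 2 * 0.5 * 60e9 / 3e8 = 200 Hz; inverting recovers the velocity.
print(radial_velocity(200.0))  # 0.5
```

Location and size come from other properties of the return (timing and energy of the echo), but velocity is the easiest to see: the faster the motion, the bigger the frequency shift.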
If you think about the classic pin art game (see the example above), Soli’s beam of waves comprises the pins, and the imprint of your face is the data it collects about your gesture.
We learned in January that Google had received approval from the U.S. Federal Communications Commission to operate its Project Soli sensors at a power high enough to enable long-range, real-time motion detection. Since then, Google itself has confirmed in a blog post that the Pixel 4 will be the first device to integrate Soli (despite Wear OS smartwatches already having a similar gesture control feature).
While the Pixel 4 version of Soli is unlikely to provide the same creative freedom as Tony Stark’s computer set-up, in its blog post Google says users will be able “to skip songs, snooze alarms, and silence phone calls, just by waving your hand” using button, dial and slider motions.
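In software terms, that feature set amounts to mapping a small vocabulary of recognized gestures to phone actions. Here’s a hypothetical sketch of such a dispatch table; the gesture labels and handler names are invented for illustration and are not Google’s API:

```python
# Hypothetical sketch: dispatching recognized gestures to phone actions.
# Gesture labels and handlers are invented; Soli's real vocabulary and
# interfaces are not public.

from typing import Callable, Dict

def skip_song() -> str:
    return "skipped to next track"

def snooze_alarm() -> str:
    return "alarm snoozed"

def silence_call() -> str:
    return "incoming call silenced"

# Map each recognized gesture label to its handler.
GESTURE_ACTIONS: Dict[str, Callable[[], str]] = {
    "swipe": skip_song,
    "wave": snooze_alarm,
    "press": silence_call,
}

def handle_gesture(label: str) -> str:
    """Run the action for a recognized gesture; ignore anything else."""
    action = GESTURE_ACTIONS.get(label)
    return action() if action else "no-op"

print(handle_gesture("swipe"))  # skipped to next track
print(handle_gesture("shrug"))  # no-op
```

The “button, dial and slider” framing suggests the recognizer outputs a handful of discrete motion primitives like these, rather than free-form tracking.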
It’s unlikely that Google’s first introduction of Soli will revolutionize the way we use smartphones, but it, along with the teased introduction of facial recognition on the Pixel 4, makes a clear statement about the company’s priorities when it comes to creating a touch-free experience for its users. It’s a perspective the company itself refers to as “ambient computing.”
Ambient computing is essentially another way to say the Internet of Things (IoT), not to mention a way to sell more Google products. Google’s chief of hardware, Rick Osterloh, told The Verge earlier this year that he views it as a way for users to be connected to, and get help from, any of a variety of devices around them every day.
While this description makes the tech sound airy and benevolent, the IoT has been riddled with security nightmares throughout its development, which have shown the dangers of sacrificing personal information for convenience. That includes features like facial recognition, which several cities around the U.S. have recently banned over fears of data abuse.
There are fewer privacy concerns around something like motion control, although scientists do have ways to identify individuals based on their gait. But by effectively integrating our bodies themselves into our devices, we move another step closer to total inter-connectivity, for better or for worse.