If Google made one thing clear at its Pixel product launch on Wednesday, it’s that the company is banking on artificial intelligence to make its devices radically helpful. Perhaps the most important feature for experiencing these A.I.-enabled capabilities is the Google Assistant, which is basically the Google version of Siri — only better.
Nearly every product introduced on Wednesday relies on the Assistant in some way, a big departure from its initial debut on the Google Home alone. Now it will also be accessible on the Pixel 2 phone, the Google Pixelbook, and even the futuristic new Google Pixel Buds wireless headphones. This is all part of Google's new mission: "A.I. first" rather than "mobile first."
One obvious way that the Google Assistant is superior to Siri is that it's programmed to work in tandem across more tools and in more ways. Though Siri in iOS 11 helps streamline your Mac and iPhone experience, Google just upped the ante by letting you type your questions to its assistant on the company's new laptop, handy anywhere that shouting at your computer doesn't make sense, which is basically anywhere in public. The Google family is also simply bigger, consisting of a laptop, a phone, several smart home devices, and now even headphones capable of real-time translation. This naturally means more integration.
“Having the same assistant at home and on your phone means the experience is connected across devices,” explained Sabrina Ellis, senior director of product management at Google. For example, you can tell the Google Assistant on your phone to broadcast the news that you’re on your way home with pizza. It’ll then automatically share that message through the Google Home in your kitchen with your family.
Google also announced that Google Assistant routines are coming to the Pixel 2 phone. As demonstrated onstage at the event, this could be a pretty big timesaver. The customizable shortcuts might mean telling your phone that you've gotten in your car to commute home after work and automatically receiving a traffic-dependent ETA, sending a standard text to your family that you're en route, and picking up the podcast you were listening to at the exact spot you left off that morning, all with one initial command.
Another way that the Google Assistant on the Pixel 2 is designed to beat Siri on the go is its "squeeze" activation. According to Ellis, machine learning was used to train the Pixel 2 to trigger the Google Assistant only when the phone is intentionally squeezed. If this proves effective, it's a much more convenient and controlled way to summon your A.I. helper.
Finally, natural language processing aims to make the Google Assistant better at understanding commands spoken colloquially or by children. Unlike Siri, the Google Assistant also uses "Voice Match" to distinguish between different speakers for a personalized experience. To better suit the needs of users of different ages, accents, and manners of speech, the Google Assistant was trained on over 50 million voice samples.