Google I/O Cemented the Google Assistant’s Spot as the Best Voice Assistant

"Tapping to operate your phone would almost seem slow."

The Google Assistant’s next big update will make it so fast and capable that Pixel users might not even have to tap their phones anymore. The tech giant demonstrated the next version of its voice assistant during its I/O developers conference Tuesday, and it could fundamentally change how smartphones are used.

Google’s VP of Engineering Scott Huffman announced it was internally calling the technology the “next-generation Assistant” and that it would be introduced on “Pixel Phones later this year.” His comments were not referring to the Pixel 3a models that also launched Tuesday, but the leaked Pixel 4, which will likely debut in the fall.

“This next-generation Assistant will let you instantly operate your phone with your voice, multitask across apps, and complete complex actions all with nearly zero latency,” Huffman said. “And actions like turning on your flashlight, opening Gmail, or checking your calendar will all work offline.”

Google CEO Sundar Pichai explains how the company compressed a 100GB machine learning model into just half a gigabyte to fit it onto smartphones.


Google CEO Sundar Pichai explained that Google has made these conversational controls possible by compressing its 100-gigabyte artificial intelligence models down to just half a gigabyte, which is small enough for the company to run them entirely on Pixel phones.
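Pichai didn't detail the compression techniques involved, but one common approach for shrinking on-device models is weight quantization, which stores each parameter at lower numeric precision. The sketch below is a hypothetical illustration, not Google's method: it quantizes 32-bit float weights to 8-bit integers with a single scale factor, cutting memory use by a factor of four.

```python
import numpy as np

# Hypothetical layer weights stored as 32-bit floats
weights = np.random.randn(1000, 1000).astype(np.float32)

# Linear quantization: map the float range onto [-127, 127]
# using one scale factor for the whole tensor
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)

# Dequantize at inference time (approximate reconstruction)
restored = quantized.astype(np.float32) * scale

# int8 storage is 4x smaller than float32
print(weights.nbytes // quantized.nbytes)
```

Quantization alone yields roughly a 4x reduction; reaching the roughly 200x reduction Pichai described would require combining it with other techniques, such as pruning or distillation, whose specifics Google did not disclose.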

This means future Google phones will essentially have an A.I. engine built into them and won’t require voice data to be sent to Google servers. During a brief demo, seen in the video above, the company demonstrated that this kind of real-time performance can be used to pull off multi-app commands, reply to emails, browse through Google Photos, and even call a ride on Lyft. All of this while only having to say “Hey Google” once, and sometimes not at all.

In one of the demos, the tester — identified only as Maggie in the video — is able to reply to a text message that appeared as a notification by just saying, “Reply,” followed by her message. She was also able to browse her vacation photos by asking for the ones “with animals in them.” Google Photos already uses A.I. to categorize users’ images based on what they depict; now those images will be searchable with voice commands.

To take this a step further, Huffman teased the “Personal References” feature, which will let users chat naturally with the Google Assistant. By entering information about loved ones and important dates in the “You” tab in Assistant settings, users can teach the A.I. who they’re talking about when they say “mom” or “my son.”

Add information to the "You" tab so the Google Assistant understands who you're talking about when you say "my boyfriend."


“If you’ve shared important people, places, and events with the Assistant, you’ll be able to ask for things more naturally,” he said. “Like, ‘Show me photos of my son,’ ‘Directions to the restaurant reservation,’ or ‘Remind me to pick up chocolates on my anniversary.’”

It’s a massive leap forward over the competition. Alexa and Siri aren’t as linguistically flexible, and users are forced to speak the wake command to interact with them. They also oftentimes require users to tap to clarify commands they don’t understand.

Google might soon do away with tapping entirely, which goes to show just how far ahead of the game it is.
