Essentially, Google is creating automatic Cliffs Notes for all the things you, your friends, and colleagues write across its Workspace apps.
Language analysis — For chats in Google Spaces (the Slack-competitor Google crammed into Gmail), Google says it can now analyze what’s been written and generate an easy-to-read summary at the top of each thread, so you can get the 411 on the important things discussed while you were away.
The summary feature also works in Google Docs, integrated into the outline feature Docs has along the side of documents. It seems like an easy way to fake that you’ve read something until you actually have the time to read it yourself (or just never read it).
Google first announced summaries for Google Docs in February 2022, but expanding the feature to other apps in the coming months could be even more useful. I could totally see myself catching up on conversations I missed if I actually used Google Spaces.
If the feature is anything like automatic YouTube captions, there are likely to be some rough edges, but it’s still an impressive demonstration of Google’s machine learning and language processing capabilities.
Meet transcriptions — Besides expanding the same automatic summaries to Google Meet meetings next year, Google is also adding automatic meeting transcriptions to its video call service later this year. Additionally, Meet is getting machine learning and AI algorithms that can help enhance video call quality for stuff like skin tones and lighting.
I/O announcements — The automatic summaries and transcriptions coming to Meet are part of Google’s broader push to use AI and machine learning to blend the digital world with the real one. Google also used its I/O keynote to announce Android 13, a new “Look and Talk” feature for the Nest Hub Max, the Pixel 6a, Pixel 7 and 7 Pro, Pixel Buds Pro, Pixel Watch, and Pixel Tablet.