Wireless earbuds have become an increasingly essential companion to smartphones. Apple’s AirPods, Google’s Pixel Buds, Samsung’s Galaxy Buds, and countless similar products have flooded the market with devices that let you wirelessly play music or make phone calls. But in the longer term, these wireless earbuds have also become the likeliest battleground where future generations of voice assistants will face off.
A majority of earbuds already come with built-in voice assistant capabilities, which allow users to talk seamlessly with Google Assistant, Siri, Bixby, and, soon, Alexa. As the artificial intelligence inside smartphones becomes more capable over time, tech companies will have ever greater options for making these devices useful.
Ben Arnold, the senior director of innovation and trends at the Consumer Technology Association, tells Inverse that in the next few years, A.I.-enabled earbuds will bring about a flurry of new use cases stemming from the ability to interact with your phone without having to take it out of your pocket.
“These types of applications will become more refined in the next five years and digital assistants will become a fixture in our daily use of headphones and smartphones,” he said.
There’s already evidence that tech companies want to make that happen. Google and Qualcomm announced a partnership to release the “Qualcomm Smart Headset Development Kit,” which would make it easier for developers to add custom voice-assistant interactions that tech companies have yet to create.
The foundation for talking A.I.-infused headsets is already laid, with the global wireless headphones market expected to be valued at $34 billion by 2024. That’s a lot of consumers who have gotten used to interacting with (and not losing) wearable buds.
Here are three ways wireless buds will continue to push boundaries and differentiate from one another.
3. Navigate Your Smartphone Using Only Your Voice
The evolution of buds from peripheral to tech-powerhouse in their own right begins with the voice assistant. A few basic hands-free capabilities already exist, like voice composition for text messages or the ability to set reminders or alarms. But these just scratch the surface of what the future holds.
“I like to think of connected, intelligent headphones and earbuds as extensions of the smartphone,” Arnold said. “You can accomplish some of what you can using traditional phone apps, but you have the convenience of doing it basically without having to touch a device. Things like opening apps, reading and responding to messages, maybe even snapping photos.”
Instead of checking social media apps and feeds multiple times a day, users will likely someday get digests and better integrations from their voice assistants. Imagine getting run-downs of your friends’ and family members’ social feeds, or having article summaries read to you without having to open them. Better integrations would allow, say, the Google Assistant to inform you that an Amazon order has shipped.
These are small steps that’ll open the door to more advanced capabilities over the next few years.
2. A Mobile Shopping Assistant
One of Alexa’s biggest selling points is that it lets users shop and place orders with simple voice commands. Google has rolled out a similar feature, but Arnold says it’s only a matter of time before these abilities find their way into mobile tech as well.
When earbud owners are walking around town running errands, their voice assistant could point out stores offering sales on items of interest.
“[We could see] a shopping assistant that doesn’t just live inside my smart speaker, but is mobile, location aware and could point me to a store or restaurant with a good deal,” he said. “Maybe when I’m walking in the mall and based on my preferences it will make relevant suggestions like, ‘Gucci is having a sale and I know you’ve been looking at Gucci belts.’”
This could be very helpful for tourists looking for a spot to grab a bite or buy a souvenir. But aside from making shopping easier, A.I.-enabled earbuds could solve one of traveling’s biggest obstacles: language barriers.
1. Shattering Language Barriers
Touring different parts of the world can be difficult without a guide. Google Lens already gives users the ability to translate signs by taking a picture of them. Arnold believes wireless earbuds could soon live translate conversations into users’ native tongue.
“Having just spent a week in China, something I have been thinking about is realtime translation,” he said. “I know that is possible through certain apps, but what if my earbuds could automatically translate any foreign language it hears into English on the go.”
The team behind Google Duplex — the restaurant-reservation-making A.I. — has already said that this capability is one of its long-term goals. The team wants anyone who calls a business in a country that doesn’t speak their language to be able to use the Google Assistant as a translator. That doesn’t involve earbuds, but the tech company already offers an accessibility feature that could make a Pixel Bud translator possible.
Android Q’s Live Caption feature can meticulously type out every word spoken in a call or video. The next steps would be to have it translate the text and then dictate the translated version to the listener.
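The pipeline described above — capture speech as text, translate it, then speak the result — can be sketched in a few lines. This is a toy illustration, not Google’s implementation: the `transcribe` and `dictate` stages are stand-ins for real speech-recognition and text-to-speech engines, and the phrasebook dictionary is a hypothetical placeholder for an actual machine-translation model.

```python
# Toy sketch of the transcribe -> translate -> dictate pipeline.
# All three stages are simplified stand-ins for real systems.

PHRASEBOOK = {"hola": "hello", "gracias": "thank you"}  # placeholder translation table

def transcribe(audio_frames):
    """Stand-in for live captioning: treat each 'frame' as an already-recognized word."""
    return " ".join(audio_frames)

def translate(text, table=PHRASEBOOK):
    """Word-by-word lookup; a real system would use a neural translation model."""
    return " ".join(table.get(word, word) for word in text.split())

def dictate(text):
    """Stand-in for text-to-speech: return the string that would be spoken aloud."""
    return "[spoken] " + text

def live_translate(audio_frames):
    """Chain the three stages, as the article describes."""
    return dictate(translate(transcribe(audio_frames)))

print(live_translate(["hola", "gracias"]))  # → [spoken] hello thank you
```

The interesting engineering problem is not the chaining itself but doing each stage with low enough latency that the translated audio keeps pace with a live conversation.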
Google also released a beta of its Live Transcribe capability in February, which is almost identical to Live Caption but transcribes spoken words into a text file instead of overlaying them on videos and apps.
The technology underpinning these capabilities is already here. Plus, the Google and Qualcomm partnership will allow third-party developers to speed up the development of ever more versatile voice-assistant features. Earbuds could soon become more ubiquitous than they already are.