
Alexa Recording Conversations: Amazon Explains How It Actually Happened

It was a snafu, but something is fishy.

Amazon’s Alexa may be a top-rated home assistant with countless useful functions, but the idea that a smart speaker is listening in on you can send a chill down your spine. On Thursday, a Portland, Oregon, woman who gave her name only as Danielle told KIRO 7 News that the virtual assistant had recorded a private conversation she had with her husband and sent it to one of their contacts, unbeknownst to them. (Yep. Far creepier than Alexa’s random, unprompted laughing.)

An Amazon representative later told Recode that this was a total fluke and that the company will work to ensure it never happens again. The spokesperson explained that the snafu was a chain of misunderstandings, but the incident still calls into question how secure the Amazon Echo truly is.

So how did this happen, exactly? If you own an Echo, what should you know to make sure you don’t accidentally send a friend some very personal audio?

Smart speakers are made to listen to everything you say but only kick into gear when they hear their trigger phrase, which in this case would be “Alexa.”

According to an Amazon statement to Recode, Danielle’s Echo apparently mistook a word in the background conversation for its wake word. It then misinterpreted another phrase as “send message,” which prompted it to record the private discussion and save it to be sent as a voice note.


Alexa then, allegedly, asked out loud, “To whom?” but Danielle didn’t hear it. From there, it interpreted another snippet of the conversation as the name of one of Danielle’s husband’s employees. Finally, Alexa asked out loud, “[contact name], right?” and, for the fourth time, misheard the background conversation, interpreting a phrase as “right.”
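For readers curious how such a chain of misfires could even unfold, here is a minimal, purely illustrative Python sketch of a voice-assistant messaging flow. The function names, contact list, and matching logic are hypothetical and are not Amazon’s actual implementation; the point is simply that four consecutive misrecognitions would each have to slip through before any audio is sent.

```python
# Hypothetical sketch of a voice-assistant messaging flow (not Amazon's code).
# It illustrates how four consecutive misrecognitions -- the wake word,
# "send message", a contact name, and "right" -- would each have to pass
# before a recorded conversation is sent to a contact.

WAKE_WORD = "alexa"           # assumed trigger phrase
CONTACTS = ["chris", "dana"]  # hypothetical contact list


def heard(transcript: str, target: str) -> bool:
    """Stand-in for speech recognition: did the audio match the target phrase?"""
    return target in transcript.lower()


def messaging_flow(transcripts: list[str]) -> str:
    """Walk the four steps; any step that fails to match aborts the flow."""
    # Step 1: background speech must be mistaken for the wake word.
    if not heard(transcripts[0], WAKE_WORD):
        return "idle: wake word not detected"

    # Step 2: another phrase must be mistaken for a messaging command.
    if not heard(transcripts[1], "send message"):
        return "awake, but no command recognized"

    # Step 3: the device records audio, asks "To whom?", and must then
    # mistake a snippet of conversation for a contact's name.
    recipient = next((c for c in CONTACTS if heard(transcripts[2], c)), None)
    if recipient is None:
        return "message recorded, but no recipient matched"

    # Step 4: the spoken confirmation "<contact>, right?" must be answered
    # by yet another misheard phrase that sounds like "right".
    if not heard(transcripts[3], "right"):
        return f"asked '{recipient}, right?' but got no confirmation"

    return f"voice message sent to {recipient}"


if __name__ == "__main__":
    # A background conversation that, improbably, clears all four steps.
    overheard = [
        "...alexa...",         # misheard wake word
        "...send message...",  # misheard command
        "...chris...",         # misheard contact name
        "...right...",         # misheard confirmation
    ]
    print(messaging_flow(overheard))  # -> "voice message sent to chris"
```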

“As unlikely as this string of events is, we are evaluating options to make this case even less likely,” said the Amazon spokesperson.

But there is something fishy about this chain of unfortunate events. Not only did the Echo somehow manage to misunderstand four separate words or phrases, but it also allegedly spoke aloud twice, which would normally draw a response from people who weren’t expecting Alexa to intervene in their conversation. That would suggest the device wasn’t within earshot, yet the KIRO 7 report stated, “every room in her family home was wired with the Amazon devices to control her home’s heat, lights and security system.” Danielle said she never heard Alexa announce it was preparing to send a message.

As smart-speaker technology becomes ubiquitous and capable of more nuanced interactions, it is paramount that companies ensure their devices aren’t careless with personal information. Incidents like this may be rare, but they can give a black eye to products that depend on user trust for widespread adoption.
