We tend to talk a lot about the ways technology has invaded our privacy, but for one couple near Albuquerque, New Mexico, this might have been a good thing. A Google Home device accidentally alerted authorities to an alleged domestic dispute on July 2.
Eduardo Barros was house-sitting with his girlfriend and her daughter when the couple got into an argument that turned physical, according to local police. Barros allegedly held up a firearm and threatened to kill his girlfriend, asking her, “Did you call the sheriffs?”
The Google smart speaker mistook his question for a voice command and called 911. A crisis negotiation team and a SWAT team were sent to the home, and Barros was taken into custody after an hours-long standoff. The woman sustained minor injuries; her daughter was unharmed.
In a statement to ABC News, Bernalillo County Sheriff Manuel Gonzales III said:
“The unexpected use of this new technology to contact emergency services has possibly helped save a life. This amazing technology definitely helped save a mother and her child from a very violent situation.”
Despite the clearly positive outcome of this situation, the fact remains that devices like Google Home are constantly listening in — for better or worse. In 2015, the Electronic Privacy Information Center sent a letter to the FTC warning of the implications of “always on” devices that constantly record their owners “and may constitute unlawful surveillance under federal wiretap law.”
And while their letter named technology like Alexa that’s used in smart home devices, gadgets like Amazon Echo and Google Home are only sort of “always on.” They don’t record anything before wake words (like “OK, Google”) or commands are uttered, and they only record small snippets of speech, not entire conversations.
There are also settings that can limit a smart home device’s chances of being accidentally activated. You can set a device to play an audible tone when it has been triggered, so you remain aware that it could be recording. You can also program the device with passcodes for certain actions. Or you can simply turn the microphone off.
When the microphone is on, however, law enforcement is likely to become increasingly interested in what these devices can pick up. Earlier this year, the defendant in a murder trial in Bentonville, Arkansas, agreed to let Amazon share data stored on his Amazon Echo, which was found near the scene of the crime. Police had subpoenaed Amazon in the hope that data stored on the device would shed more light on the crime.
Much like security cameras and cell phones, smart speakers have the potential to play a serious role as evidence in future criminal cases. Forget The Wire — what about The Echo?