
Researchers Reveal How Secret Commands Could Hijack Siri

The rise of virtual assistants opens consumers up to attack from secret messages that humans can't understand.


Researchers have revealed how secret commands could use voice-control tools like Siri and Google Now to take over your smartphone without your knowledge.

“While ubiquitous voice-recognition brings many benefits,” the researchers write in a paper to be presented in August, “its security implications are not well studied.” So the team from Georgetown University and the University of California, Berkeley, ran a series of tests to see just how easily these assistants could be tricked and, once they showed that a phone could be forced to activate airplane mode or call 911, tried to figure out how to defend against these attacks.

The researchers were able to make these virtual assistants respond to commands that humans can’t even understand. (It sounds a bit like a cross between a child’s attempt to speak a full sentence and the noise made by a broken coffee grinder.) This is because virtual assistants are built to understand as many people as possible, which means they will guess at words that even we can’t parse.
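To make “obfuscation” a little more concrete, here is a minimal Python sketch of the general kind of signal mangling involved. This is not the researchers’ actual pipeline: the file command.wav is a hypothetical mono, 16-bit recording of a spoken command, and the specific distortions (speedup, band-pass filtering, added noise) are illustrative assumptions, not the paper’s method.

```python
# Illustrative only: degrade a spoken command so it may stay
# machine-recognizable while becoming hard for a human to follow.
# Assumes a mono, 16-bit WAV; "command.wav" is a made-up filename.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

rate, audio = wavfile.read("command.wav")   # hypothetical recording
audio = audio.astype(np.float64)

# Resample so the clip plays back 1.5x faster: statistical recognizers
# tend to tolerate compressed speech better than human listeners do.
idx = np.arange(0, len(audio), 1.5)
audio = np.interp(idx, np.arange(len(audio)), audio)

# Keep only a narrow speech band (~300-3400 Hz), stripping spectral
# detail that humans lean on to recognize a voice.
nyq = rate / 2.0
b, a = butter(4, [300 / nyq, 3400 / nyq], btype="bandpass")
audio = lfilter(b, a, audio)

# Mix in noise to further mask the words from human ears.
audio += np.random.normal(0, 0.02 * np.abs(audio).max(), audio.shape)

wavfile.write("obfuscated.wav", rate,
              np.clip(audio, -32768, 32767).astype(np.int16))
```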

The researchers found that, with one exception, the machines understood the commands better than humans did, and the gap only widened once the commands had been obfuscated. Chances are good that your virtual assistant does a better job of making sense of distorted speech than you do, which means you might never know it has been given a command.

A graph from the researchers’ paper, “Hidden Voice Commands,” showing the rates of success for understanding a voice command.

This is a minor nuisance if someone commands your phone to turn on airplane mode. It could be a bigger problem if they’re able to open a web page used to distribute malware, or to make the phone share information that it isn’t supposed to share. And right now there’s no way to defend against these types of attacks without disabling Google Now, Siri, or whatever virtual assistant is on your phone.

“We are unaware of any device or system that currently defends against obfuscated voice commands,” the researchers write in their paper. They suggest some possible defenses, including audio CAPTCHAs, teaching virtual assistants to detect the difference between natural and synthesized speech, and notifying users when a command is issued, but none of them are currently available to consumers.
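As a rough sketch of how the notification and audio-CAPTCHA ideas might combine in practice, the Python below gates sensitive commands behind a spoken confirmation code. Everything here is hypothetical: the speak, listen, and execute callables stand in for an assistant’s real speech output, speech input, and command dispatcher, none of which the paper specifies.

```python
# Hypothetical sketch of a "notify and confirm" defense: before running a
# sensitive command, announce it and require the user to echo a random code.
import random
import string

def challenge_code(n: int = 4) -> str:
    """A crude audio-CAPTCHA stand-in: a short code the user must repeat."""
    return "".join(random.choices(string.digits, k=n))

def confirmed(command: str, speak, listen) -> bool:
    """Announce the recognized command and check the user's spoken reply."""
    code = challenge_code()
    speak(f"I heard: {command}. Say {code} to confirm.")
    return listen().strip() == code

def dispatch(command: str, speak, listen, execute) -> None:
    # Benign queries pass straight through; anything sensitive is gated.
    sensitive = ("call", "dial", "open", "airplane mode", "send")
    if any(word in command.lower() for word in sensitive):
        if not confirmed(command, speak, listen):
            speak("Command rejected: confirmation failed.")
            return
    execute(command)

if __name__ == "__main__":
    # Demo with console I/O standing in for speech.
    dispatch("call 911", speak=print, listen=input,
             execute=lambda cmd: print(f"executing: {cmd}"))
```

The point of the random code is that a pre-recorded or broadcast attack cannot predict it, so replayed audio alone cannot complete the confirmation.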

So watch out the next time you decide to watch a random YouTube video. You never know when someone might take advantage of one of your phone’s flagship features to irritate you at best and compromise your phone’s security at worst.

Read the full paper, “Hidden Voice Commands,” for the complete results.
