
Cops really do use warrants to track down people Googling suspect terms

"Reverse warrants" let police demand information on everyone searching specific keywords in Google.

NORTH PORT, FL - SEPTEMBER 20: An FBI agent talks with a North Port officer while they collect evide...
Octavio Jones/Getty Images News/Getty Images

Law enforcement investigators across the United States have been using “reverse warrants” to demand bulk information from Google that might help in criminal cases. Forbes earlier this week reported on a particular human trafficking case in which investigators using a warrant requested Google provide information on anyone who searched the victim’s name, hoping the results would lead to a suspect.

Shady activity — The warrant was meant to remain confidential but was accidentally unsealed, suggesting that law enforcement didn’t want the public to know about the practice. And it’s not hard to understand why. You typically expect investigators will, well, investigate, using methods like talking to people at or connected to a crime scene. Instead, in the information age, we see law enforcement getting lazy and saying, “let’s just Google it.” They want to collect data in bulk and figure out how to sift through it later.

Facial recognition is another example of this, where a photo from a crime scene is run through an algorithm that tries to cross-match it with pictures collected elsewhere, like from Facebook.

Catch-all — The problem with these systems is that they can and do make mistakes. As Techdirt writes, it’s certainly likely that Google has information relevant to an investigation — everyone uses Google these days, after all — but that’s not probable cause to scoop up data in bulk. In another incident Techdirt cited, following a string of bombings in Austin, Texas, keyword warrants were used to collect the IP addresses of anyone who searched for terms like “pipe bomb.”

That’s a wildly broad search request that is sure to sweep up completely innocent people. It’s exactly the type of situation you don’t want to create: a surveillance state where everyone is always a suspect and has to be careful what they type. Similarly, an algorithm using a probability score to “match” a surveillance camera shot to a picture of a person gathered from Facebook shouldn’t be enough to make an arrest. Without more concrete information, you run a greater risk of arresting the wrong person.

Hopefully more attention will be paid to this issue. Law enforcement should not be able to make bulk keyword requests like this.