Predictive Policing Is Far More Prevalent Than We Once Thought: Report

In cities across the country, just calling the police contributes to a more insidious data web than we previously realized.

Flickr / davidsonscott15

Knowing how to protect ourselves from online data breaches has become increasingly crucial for participating in online life: We (hopefully) change our passwords and diligently check the two-factor verification box. But a new report has revealed that in cities across the country, just calling the police contributes to a larger, more insidious data web than we previously realized.

Dozens of cities across the U.S. have been found to hold previously unconfirmed contracts with PredPol, a “crime-predicting” tech company that has compared its software to a “broken windows” policing strategy. The affected communities, including Atlanta, Palo Alto, Calif., and Tacoma, Wash., are collectively home to more than 1 million residents, according to documents obtained by the outlet Motherboard through a series of Freedom of Information Act requests.

Predictive policing uses data gathered by police departments in an attempt to predict when and where crimes will occur. If that tactic sounds like wishful thinking, well, yeah, it essentially is. Much of predictive policing evolved from the “broken windows” policing strategy, which first gained prominence in the 1980s. The now-widely-criticized criminological theory argues that signs of crime, disorder and anti-social behavior - like, literally, broken windows - create an urban environment that fosters further, more serious disorder and crime. The solution was to crack down on petty crime in an attempt to thwart more serious crime down the line. What happened instead was over-policing of minority communities.

Predictive policing attempts to use data to predict where crime will happen, but typically results in simple over-policing of minority neighborhoods.

Pip Wilson/Flickr

How the Prevalence of Predictive Policing Went Underreported

Predictive policing supporters argue, though, that data is data. It’s numbers. It’s infallible. But of course, A.I. is still built by humans, who routinely imbue their seemingly objective systems with racial bias. Predictive policing is no different. As early as 2016, civil rights advocates, including the ACLU, voiced concerns about the growing use of predictive policing software, arguing that it leads to over-policing of poor and minority communities.

“Systems that are engineered to support the status quo have no place in American policing,” reads a statement signed by 17 civil rights organizations, including the ACLU and the NAACP. “The data driving predictive enforcement activities — such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls — is profoundly limited and biased.”

One of the major complaints about predictive policing is that it relies exclusively on reported crime. It doesn’t — it can’t — take into account the actual social environment of a community. Also, predictive policing only “predicts” street crime. The software doesn’t track white-collar crime. It doesn’t station FBI agents on the corners of Wall Street to hunt down individuals who look mighty hungry to commit some insider trading.

PredPol, based in Santa Cruz, Calif., claims to have an algorithm that can predict crime in 500-by-500-foot areas. In some cases, the company says, it can even pinpoint specific block corners or houses. Using “big data and AI,” the company also promises to “predict where and when adverse events are likely to occur: crimes, collisions, overdoses - we do it all,” according to its Twitter bio.

According to Motherboard’s documents, PredPol has held — and in a number of cases, continues to hold — dozens of previously unreported contracts with cities across the country. The lack of transparency stems from PredPol’s position as a vendor. Many of its files are, technically, public knowledge; but communities, particularly those falling prey to over-policing, don’t know where to look.
