check your bias

Banned: why facial surveillance is just the tip of the iceberg

Facial recognition is on the outs, but similarly biased tech remains.

Woman scanning her face with a facial recognition system on a smartphone.
Shutterstock

After weeks of protests against racial injustice following the murder of George Floyd, and calls to defund and divest from the police, Boston became the latest city (and the largest on the East Coast) to ban facial recognition software for surveillance.

With its city council delivering a veto-proof majority, the Massachusetts state capital joins its neighbor, Somerville, and cities across the country like Oakland and San Francisco in creating distance from this technology, which Councilor Ricardo Arroyo described as having "an obvious racial bias."

While this move may look progressive on its face, Kade Crockford, director of the ACLU of Massachusetts' Technology for Liberty Program, tells Inverse that activists shouldn't rest on their laurels just yet.

"Police departments use many different surveillance technologies in ways that unfairly target Black and brown people," Crockford says. "License plate readers, surveillance cameras, social media surveillance, and even drones are just some examples.”

"The Boston Police Department has taken a 'deploy first, answer questions later' approach to surveillance technology, [but] face surveillance is too dangerous."

But let's take a step back: how are these seemingly objective algorithms resulting in racial profiling and violence against BIPOC and POC?

It's no secret that algorithms are far from shining beacons of objectivity. Just as a parent will pass down their unchecked, biased opinions to a child, programmers are guilty of coding their biases directly into the fabric of these algorithms through the data they use and the assumptions they make. At an earlier hearing about the technology, Boston Police Commissioner William Gross noted that he could be misidentified by facial recognition, too: "I'm African American and I can be misidentified as well."

While biases may not be intentionally coded into algorithms, the results can be detrimental nonetheless. When it comes to racial profiling through facial recognition software, an MIT study found that commercial facial-analysis algorithms misclassified darker-skinned women at error rates of up to 35 percent. This error, which springs from the fact that these algorithms simply aren't trained on enough images of diverse faces, has already led to the wrongful arrest of a Black man, Robert Williams, in Detroit.
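For the technically inclined, the disparity the MIT researchers measured is, at its core, a per-group error rate. Here is a minimal, hypothetical sketch of how such rates are computed; the groups, records, and numbers below are invented for illustration and are not the study's data or code:

```python
from collections import defaultdict

# Hypothetical evaluation records: (predicted_match, actual_match, skin_tone_group).
# These are illustrative stand-ins, not data from the MIT study.
results = [
    (True, True, "lighter"),
    (True, True, "lighter"),
    (False, True, "darker"),   # miss: the system failed to recognize a true match
    (True, False, "darker"),   # false match: the system matched the wrong person
    (True, True, "darker"),
]

errors = defaultdict(int)
totals = defaultdict(int)

# Count how often the prediction disagrees with the ground truth, per group.
for predicted, actual, group in results:
    totals[group] += 1
    if predicted != actual:
        errors[group] += 1

# A gap between these per-group numbers is exactly the kind of
# disparity the researchers flagged.
for group, total in totals.items():
    print(f"{group}: {errors[group] / total:.0%} error rate ({total} samples)")
```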

We like to believe that our technology is just and objective, but that dream is far from a reality because our tech is encoded with our own biases and prejudices.

Shutterstock

In Boston's case, facial recognition software was not in use by the city's police department, but Crockford says the risk of abuse with this technology was too high to simply wait for an incident to happen.

"For too long, police departments including the Boston Police Department have taken a "deploy first, answer questions later" approach to surveillance technology acquisition and use," says Crockford. "But face surveillance is too dangerous."

However, while this proactive divestment from racially biased technology is in itself a good thing, the question remains: what other racially biased technologies used in Boston, and across the nation, will go unchecked following this public change?

In addition to the other racially biased technologies mentioned by Crockford, like license plate readers and social media surveillance, BIPOC and POC also face discrimination from technologies as mundane as automatic soap dispensers and as crucial as speech recognition and automated housing and immigration systems.

Compared to facial recognition, which has already drawn public discomfort separate from its racial biases, the racial bias baked into these other systems is far less visible and, in turn, far harder to address.

Crockford says the Boston City Council has proposed legislation that would make all government surveillance technologies subject to "democratic debate, city council oversight, and public transparency," including technologies such as cameras and license plate readers. Crockford also tells Inverse that the ACLU of Massachusetts plans to work with the state legislature to introduce a bill by late July that would extend the facial recognition ban beyond the state's urban centers to smaller towns as well.

The gears of bureaucracy are slow-moving, says Crockford, but the ACLU of Massachusetts, and activists everywhere, are ready to shine a spotlight on these lesser-known technologies.

Inverse Analysis

Recognizing the potential for harm from facial recognition technologies, in particular to BIPOC and POC, is an important step toward addressing the racial biases and disparities upheld by our technology. But facial recognition is also technology we're all familiar with from simply using our phones. Going beyond (literally) visible biased technology to focus on the more complex, less tangible systems that biased technology perpetuates will be the larger battle, and one that needs continued and unrelenting attention.
