Staten Island's DA caught paying for Clearview AI technology

Facial recognition software has been tied to wrongful arrests.


Another day, another law enforcement agency found using Clearview AI's facial recognition software. This time it's the Staten Island District Attorney's Office, which, Gothamist has discovered, agreed to a one-year, $10,000 contract with the startup in May 2019.

It's been well reported that Clearview AI is controversial. The big "innovation" the company introduced was trawling social media sites like LinkedIn to collect billions of pictures of people's faces. Law enforcement can upload a picture of a suspect to the program and boom, Clearview's AI spits out a predicted match.

False arrests — Police argue that facial recognition software can help bring justice faster. But critics say that a lack of any real oversight means facial recognition software can reinforce the overpolicing of communities of color. Making matters worse is that the technology disproportionately misclassifies people of color, which has led multiple people to be falsely arrested.

In New Jersey, the use of facial recognition software was temporarily suspended last year after a man, Nijeer Parks, spent ten days in jail when the technology erroneously placed him at a crime scene. After a judge asked prosecutors to produce more evidence beyond the facial recognition identification, the case was quietly dropped.

Staten Island District Attorney Michael McMahon declined to comment for Gothamist's story, and, somewhat laughably, his agency told the non-profit Legal Aid Society that providing copies of the searches it ran through Clearview's program "would constitute an unreasonable invasion of personal privacy." Searching and seizing people without any probable cause is also an unreasonable invasion of privacy, no?

Serious consequences — Wrongful arrests can have serious consequences for the accused. Beyond stress, they can lead to financial harm in the form of legal fees — Parks spent thousands of dollars on a lawyer — and lost wages. The innocent can also end up with criminal records if the error is never spotted.

Civil rights groups have increasingly called for federal legislation governing the use of facial recognition to make arrests. They argue that police should be barred from making arrests on the basis of the technology alone, so people don't live in constant fear whenever they step outside, and that the technology should be required to undergo independent evaluations for accuracy. These aren't algorithms predicting what recipes you might like, and they shouldn't be treated as something to refine through real-world trial and error.

New York Senator Brad Hoylman has introduced a bill that would halt law enforcement’s use of facial recognition for several years until a task force could come up with guidelines for its use.

Clearview has said it plans to have its software tested by the National Institute of Standards and Technology (NIST), a federal body that has tested the accuracy of other facial recognition programs. NIST has also tested facial recognition's ability to identify people through masks, and concluded it doesn't stand a chance. Something to keep in mind once the pandemic is over.