The American Civil Liberties Union published a bombshell report Thursday outlining evidence that Amazon’s Rekognition, a face surveillance technology that debuted in 2016, disproportionately misidentifies non-white faces, incorrectly matching them to mugshots.
In a test conducted by the ACLU, Rekognition incorrectly matched 28 members of Congress to mugshots in a database the ACLU compiled from 25,000 publicly available arrest photos. In a statement to Inverse, Amazon said the ACLU didn’t use its technology correctly given the nature of the test. (The full statement from Amazon is below.)
Meanwhile, the ACLU claims Amazon is marketing a dangerous, unreliable piece of technology:
“Amazon is aggressively marketing its face surveillance technology to police, boasting that its service can identify up to 100 faces in a single image, track people in real time through surveillance cameras, and scan footage from body cameras,” the ACLU comments in its announcement.
Nearly 40 percent of the incorrect matches were for non-white Congress members, even though they make up 20 percent of Congress as a whole. The ACLU is calling for a moratorium on the use of the software by law enforcement officials, and encourages Congress to take the same position.
“These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance,” the organization declared in a statement released with the results.
Why Police Use of Amazon Rekognition Software Raises Concerns
Amazon describes its Rekognition software as “intelligent video and photo analysis” that can recognize objects, activity, and text in photos, perform facial recognition and analysis, and detect unsafe content. The ACLU notes that the technology is publicly available at low prices — its test cost $12.33, and it used the default match settings Amazon provides to the public. Amazon is “aggressively” marketing Rekognition to police forces, according to the ACLU.
At least two police agencies were piloting the technology this year: the Orlando Police Department and the Washington County Sheriff’s Department in Oregon. On June 25, Florida Politics reported that Orlando had ended its pilot. “Staff continues to discuss and evaluate whether to recommend continuation of the pilot at a further date,” announced the city.
Inverse contacted the Washington County Sheriff’s Department on Thursday to see if the ACLU report had caused it to reassess its use of Amazon Rekognition. This story will be updated with comment when it arrives.
Amazon’s Full Response to the ACLU Report
Amazon sent Inverse this full statement in response to our questions about the ACLU report. Essentially, Amazon claims the ACLU was too loose with its “confidence threshold.”
“We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft). We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement.
With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test. While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95% or higher.
Finally, it is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.”
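To illustrate the dispute over thresholds, here is a minimal sketch — not Rekognition’s actual API, and with invented candidate data — showing how raising a confidence threshold shrinks the set of face matches a system returns. The 80 percent figure is the default the ACLU used; 95 percent is the floor Amazon says it recommends for law enforcement.

```python
# Hypothetical candidate matches with invented confidence scores,
# for illustration only (not output from Rekognition).
candidate_matches = [
    {"name": "Match A", "confidence": 99.1},
    {"name": "Match B", "confidence": 86.4},
    {"name": "Match C", "confidence": 81.0},
    {"name": "Match D", "confidence": 72.5},
]

def filter_matches(matches, threshold):
    """Keep only candidates at or above the given confidence threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

# At the 80% default threshold, three of the four candidates survive.
print(len(filter_matches(candidate_matches, 80)))  # 3
# At the 95% threshold Amazon recommends for policing, only one does.
print(len(filter_matches(candidate_matches, 95)))  # 1
```

The point of contention is exactly this filter: a lower threshold surfaces more candidate matches, and with them more false positives of the kind the ACLU’s test produced.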
Rekognition can be purchased by anyone with an Amazon account; law enforcement pays a monthly fee ranging from $6 to $12. Introduced as part of the Amazon Web Services cloud, Rekognition has been used by consumer services like Pinterest, Sky News, and C-SPAN, but it also serves as a mugshot database for police officers.
Rekognition can also tap into body cameras and municipal security footage. The ACLU previously confirmed that Orlando and Oregon’s Washington County police forces use Rekognition software.
Rekognition Software Has Already Faced Public Backlash
The members of Congress falsely matched to mugshots by Rekognition include both Democrats and Republicans, men and women. The results are not skewed by party or gender, nor by age range or location. The only discriminating factor in the ACLU’s experiment is skin color.
Six members of the Congressional Black Caucus were incorrectly identified, and the caucus has previously addressed a letter to Amazon CEO Jeff Bezos urging him to consider the consequences Rekognition could have on black people, undocumented immigrants, and protestors. The ACLU argues that Rekognition could easily inform bias in law enforcement before an arrest process begins, based on false identifications.
Almost 70 other civil rights groups have echoed the ACLU’s concerns, along with Amazon employees, shareholders, hundreds of academic community members, and over 150,000 members of the general public. The members of Congress falsely matched to mugshots have yet to respond publicly to the ACLU, but the report has already drawn attention on social media, including from journalists and activists.