Raleigh police cut its Clearview AI subscription short amid criticism

The service went beyond the local facial recognition policy.


Raleigh’s WRAL reports that the local police department has stopped using Clearview AI. The North Carolina police force spent $2,500 on Clearview accounts for a year of service, according to a statement released on Tuesday. Six months in, following mounting pressure from WRAL, the department looked into Clearview’s scope and determined it was not in line with a facial recognition policy established in 2015. The Raleigh police no longer have access to Clearview and have no plans to resume using it.

What happened? — The Raleigh Police Department is no stranger to facial recognition: it has a $50,000 annual contract with South Carolina-based DataWorks. Its 2015 policy restricts who can access the system and how it can be used, including a prohibition on using social media photos that are not in the public domain.

"That is a much more limited universe of photographs than Clearview," Ann Webb, policy counsel for the North Carolina chapter of the ACLU, told WRAL. "This suggests that using Clearview may go well beyond the existing policy, meaning it's essentially unregulated in Raleigh."

Clearview’s software was accessed by only three department employees, in human trafficking cases or as a last resort in other “serious crimes.” Earlier this month, the department reached out to Clearview “in an attempt to gather information about past use of the system for internal auditing purposes.” The response was deemed “unsatisfactory,” so the AI will no longer be used and the department is updating its facial recognition policy.

What does this mean for Clearview? — Clearview has maintained both its legality and veracity as its social-media-scraping AI has garnered more attention. The spotlight could dissuade new police departments from using the software, but this latest update indicates the startup could start losing clients as its reputation suffers.