Amazon has placed a one-year moratorium on police use of Rekognition

The company won't let police use its facial-recognition tech for a year. Hopefully that year will convince it to extend the ban.


Amazon today announced it's implementing a one-year moratorium on police use of Rekognition, the retail and cloud service giant's facial-recognition technology.

"We’re implementing a one-year moratorium on police use of Amazon’s facial recognition technology. We will continue to allow organizations like Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics to use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families," the company said in a blog post on Wednesday.

"We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested," the statement continues.

Good timing — The controversy surrounding facial-recognition technology is nothing new, but it's felt especially important in the last two weeks as protests in response to the murder of George Floyd by Minneapolis, Minnesota, police have been met with further violence from police, and citizens have been seeking ways to document protests without endangering one another.

Earlier this week, IBM announced it would cease work on its facial-recognition tools and called for sweeping reform of the U.S. police system. And last month it emerged that multiple companies working in the facial-recognition sector have been using publicly available images, like those posted to Instagram, to train artificial intelligence systems to account for face masks.

Only one of the players — Amazon is only one of the players in the burgeoning space for identifying strangers from images of them. Earlier this year a company called Clearview AI grabbed headlines when it came to light it was scraping the open web and social media for images of people, building an enormous database, and selling access not just to law enforcement, but to well-heeled individuals who could use it for everything from vetting prospective employees to tracing would-be suitors of their offspring or harassing strangers.

Hardware needs regulation, too — Amazon's move is the correct one — which is surprising, given the company's serial propensity for ignoring ethical pressure when choosing its course of action — but it should also encourage legislators to consider the hardware side of facial recognition. Sony, for instance, has been working on imaging sensors capable of on-device facial recognition. Amazon, meanwhile, has a highly questionable track record when it comes to its Ring security camera and video doorbell subsidiary. If privacy is to be meaningfully protected, it will be necessary to regulate not just software makers, but also those building the tools they use.