
Huawei worked on facial recognition tech that could pick out Uyghurs

Facial recognition and discrimination go hand-in-hand.


Working with research organization IPVM, The Washington Post found a damning document regarding a Huawei facial recognition project. The document indicates that in 2018, the company worked with the startup Megvii to test an AI camera system that could parse people’s age, sex, and ethnicity. If the system determined a face was of Uyghur (also spelled Uighur) origin — a persecuted, largely Muslim minority — it would trigger an alert to potentially notify Chinese police.

A different kind of bad — Last year, after mounting international pressure, China announced that Uyghurs imprisoned in “reeducation” internment camps had “graduated.” Another Washington Post report from earlier this year, however, revealed that many had simply been transferred to factories where they worked on products for the world’s biggest brands.

Companies generally feign ignorance of human rights violations in their supply chains, but Apple recently put some skin in the game. The company is lobbying against the Uyghur Forced Labor Prevention Act, which aims to bar the import of goods produced with forced labor in China and impose sanctions on those responsible. WaPo’s reporting from March also detailed how those forced into working at factories were under constant surveillance.

Private eyes are watching minorities — While Huawei maintains that its system was never deployed, its existence, even as a test, is a troubling development for global facial recognition applications.

"China’s surveillance ambition goes way, way, way beyond minority persecution,” Maya Wang, a China senior researcher at Human Rights Watch told The Washington Post, citing protests and other perceived threats to the state, before adding, “the persecution of minorities is obviously not exclusive to China ... And these systems would lend themselves quite well to countries that want to criminalize minorities.”

The U.S. is no stranger to facial recognition algorithms being used to police Black and brown communities, despite the technologies’ inherent biases and inaccuracies. And while we slide toward fascism at home, there are hardcore fascists and dictators across the world who would love software that can identify minority groups for the purposes of imprisonment or outright genocide.

The Trump administration’s fixation on China has clear roots in America’s homebrew of xenophobia and racism, but there’s merit to setting consequences for China’s behavior — and that of the Fortune 500 companies benefiting from worker exploitation.