Chinese prisons and businesses are now using AI 'emotion-monitoring' systems
300+ — The number of prison facilities Taigusys counts as clients for its "dynamic emotion recognition" systems.
Government and corporate invasions of privacy are serious problems here in the U.S., but (to our knowledge, fingers crossed) they pale in comparison to the latest innovations in China. As recently reported by multiple outlets, a Chinese company called Taigusys is providing “Dynamic Emotion Recognition” artificial intelligence systems that it claims can accurately monitor individuals’ facial expressions and demeanor for use in a wide array of corporate morale, law enforcement, classroom, and anti-terrorism applications. The product, TGfaceQX, uses AI deep learning and algorithmic analysis to interpret a person’s facial movements, body language, and biometrics, which it then rates across various categories and scales.
As Insider succinctly and terrifyingly put it yesterday, “The emotion-recognition software says it can also generate reports on individuals to recommend them for ‘emotional support’ if they exceed the recommended markers for ‘negative emotions,’” which we are sure is only being used for positive mental health purposes, and not, say, cracking down on political dissent, worker abuse, and general Panopticon-style monitoring...
...Just kidding. Taigusys’ website practically brags about how much it can aid Big Brother.
Already heavily employed in Chinese prison camps — Last month, a Taigusys general manager happily explained to The Guardian that the company’s AI systems are currently under contract with over 300 Chinese prisons and detention centers. Overall, the estimated 60,000 cameras used at these locations have apparently been extremely helpful in keeping their populations “more docile,” to use the general manager’s exact, horrifying words. Add that to the nightmarish stories coming from these facilities, and well... look, the picture it paints is bleak.
Private businesses are loving the product, too — It’s not just Chinese detention centers and prisons getting in on the most cutting-edge privacy abuses; plenty of major companies are ponying up presumably massive amounts of cash for a piece of the Panopticon pie. According to Taigusys’ website, at least 36 businesses can be found on its list of clientele, including heavy hitters like PetroChina, China Mobile, China Unicom, and Huawei — the latter of which has been investing in its own awful, racist AI-based profiling technologies. Unfortunately, such privacy invasions are already so entrenched in the nation’s bureaucratic and corporate fabric that Taigusys’ products won’t be going away anytime soon, despite human rights advocates’ repeated reminders that these systems are inherently unreliable and amount to dangerous pseudoscience.