Facial recognition companies are selling the ability to ID you through your mask

The companies have adapted their systems to contend with the now ubiquitous face coverings.

If there's one good thing to come out of the coronavirus pandemic, it's that creepy facial recognition software is struggling to identify people through their masks. But alas, the makers of such products are racing to fix that.

CNET spoke to several companies that sell facial recognition software to retail stores and other clients and found that these companies view masks not as a dastardly obstacle to overcome but rather as a potential marketing opportunity, if they can figure out how to identify you through one.

According to CNET's report, Facebook was already working on identifying people wearing hats and glasses before the pandemic. One of the groups keenest on this technology is retail store owners looking to identify repeat shoplifters.

Meanwhile, UK-based Facewatch is now actively promoting its mask-proof capabilities with a new algorithm that it claims can handle detection and identification based on just a person's eyes and eyebrow region. Another company, SAFR, sells its technology to schools and has been similarly quick to adapt to the changing landscape by adding masked faces to its training data.

Seeing through masks — What's concerning here is that facial recognition technology is already known to be unreliable without masks, and that's when it has all kinds of data points about your face. SAFR told CNET that the accuracy rate of its facial recognition drops to 93.5 percent when people are wearing masks — but only under ideal conditions, "such as when the subjects are depicted in a high-quality photo." And that accuracy rate hasn't been validated by outsiders; it's just the one SAFR touts to clients.

Critics of facial recognition are reasonably skeptical that service providers can identify people from just their eyes and eyebrows. It's theoretically possible, but experts say that without data on your nose and mouth, a camera would have to get a perfect shot of your eye region and compare that against another perfect shot stored in the provider's database.

That's unrealistic in real-world scenarios, where harsh sunlight and other factors might obscure your eyes. Facial recognition software can typically work with grainy photos if they include your whole face, because the algorithm can compare all the distinct shapes of your facial features to get a match. But that could change. Camera tech is constantly improving, and sensor makers like Sony are now building facial recognition right into their sensors.
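The intuition above can be made concrete with a toy sketch. Many recognition systems reduce a face to a fixed-length embedding vector and declare a match when the similarity between two embeddings crosses a threshold; with a mask, only the dimensions derived from the eye region remain usable. The vectors and threshold below are invented purely for illustration, not drawn from any real system:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: the first two dimensions stand in for
# eye-region features, the last two for nose/mouth features.
person_a = [0.9, 0.1, 0.8, 0.2]
person_b = [0.9, 0.1, 0.2, 0.8]  # a different person with a similar eye region

MATCH_THRESHOLD = 0.95

# With the full face, the nose/mouth dimensions separate the two people.
full_face_sim = cosine_similarity(person_a, person_b)

# With a mask, only the eye-region dimensions are available, and the
# two different people become indistinguishable: a false match.
eyes_only_sim = cosine_similarity(person_a[:2], person_b[:2])
```

Here `full_face_sim` falls well below the threshold while `eyes_only_sim` sails past it, which is the failure mode critics worry about: throwing away half the feature space makes distinct faces collide.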

Exacerbating racial bias — Facial recognition software is frequently lambasted for putting people of color at the greatest risk by misidentifying them the most. A test of Amazon's Rekognition tool by the ACLU ended up misclassifying 28 members of Congress, even though many of the subjects were Caucasian. Such mistakes could lead to innocent people being targeted and treated differently when they visit Urban Outfitters, say, because the software wrongly identifies them as previous shoplifters.

Police agencies and the FBI have already used facial recognition software and driver's license databases to successfully identify suspects. But lawmakers have warned that there's little transparency around its accuracy and few rules around its use. Right now, police can effectively match your face in a database and label you a suspect without a warrant. And adding body cams to the mix only makes the opportunities for false positives and abuse more acute.

Bias infiltrates algorithms — Even if an algorithm itself works as intended, the way it's implemented can still be biased. In a widely publicized investigation, ProPublica found that judges around the U.S. were determining sentences based on an algorithm's prediction of recidivism, or the likelihood that the defendant would commit another crime. The algorithm was biased against minorities because it was programmed to use factors like crime rates in a defendant's neighborhood to make its determination. Those from "bad" neighborhoods had the cards stacked against them from the start.

African Americans are incarcerated in the U.S. at a rate five times that of white Americans. We should be worried about poor facial recognition software, made even less accurate by masks, being deployed in minority communities that are already disproportionately suffering from the COVID-19 pandemic.