
This AI privacy tool could be hell for facial recognition systems

Researchers say that the slightest, almost invisible changes to your face could throw facial recognition systems off.


Have you ever thought of cloaking your photos to throw facial recognition systems off? That's what the creators of Fawkes are hoping to enable with their pro-privacy artificial intelligence tool.

Given the ubiquity of facial recognition systems — including shadier ones like Clearview AI, which reportedly has deep ties to both law enforcement and the alt-right — the need for a product like Fawkes has never been greater. If it can thwart the creepy photo crawlers and collectors, count us in.

Fawkes doesn't give you digital facial surgery; it simply makes tiny changes to your photos. Its creators call it a preventative measure against widespread facial recognition and the myriad privacy problems it poses.

The Achilles heel of DNNs — Facial recognition systems rely on deep neural networks (DNNs) to accurately classify the data they're fed. According to Fawkes' creators, the tool targets a weak spot in DNNs by making slight, nearly invisible changes to the input. Those tiny tweaks are enough to degrade the network's ability to classify the image correctly.

Of course, DNNs can learn over time to eliminate this issue, but the tool's initial success comes "precisely because of fundamental weaknesses in how DNNs are designed today." Emphasis theirs.
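Fawkes' actual cloaking algorithm is more sophisticated than this, but the underlying weakness is easy to demonstrate with a classic adversarial-example trick. The sketch below (PyTorch, purely illustrative: the pretrained model and the portrait.jpg filename are stand-ins, and none of this is Fawkes' own code) shows how a pixel nudge far too small to see can change what a network thinks it is looking at.

    # Illustrative sketch only, not Fawkes' algorithm. It demonstrates the
    # weakness the researchers exploit: a perturbation too small to notice
    # can change what a deep neural network "sees".
    import torch
    import torch.nn.functional as F
    from PIL import Image
    from torchvision import models, transforms

    model = models.resnet18(pretrained=True).eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])
    image = preprocess(Image.open("portrait.jpg")).unsqueeze(0)  # hypothetical file
    image.requires_grad_(True)

    # Take whatever the model currently predicts as the label to push away from.
    logits = model(image)
    label = logits.argmax(dim=1)
    loss = F.cross_entropy(logits, label)
    loss.backward()

    # Fast gradient sign method: move every pixel by at most epsilon in the
    # direction that increases the model's error.
    epsilon = 2 / 255  # roughly imperceptible on an 8-bit image
    cloaked = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

    print("label before:", label.item())
    print("label after: ", model(cloaked).argmax(dim=1).item())

To a person, the "cloaked" image looks identical to the original; to the network, it can be something else entirely.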

What would it work against? — If the researchers behind Fawkes and its supporters want the tool to truly thwart facial recognition systems, they'll have to solve a scaling problem. Right now it's a numbers game: cloaked photos of any given person are naturally far outnumbered by the original, untouched images already sitting in various databases. So if Fawkes is going to become a common way to boost privacy, cloaked photos will have to outnumber the originals these facial recognition systems have collected.

"Please do remember that this is a research effort first and foremost, and while we are trying hard to produce something useful for privacy-aware Internet users at large," the company wrote, "there are likely issues in configuration, usability in the tool itself, and it may not work against all models for all images."

Something like Fawkes could be integrated into social media networks, however ironic that sounds given those platforms' many data-privacy issues. At the moment, Fawkes works against Microsoft Azure's face recognition service (which Microsoft reportedly tried to sell to the Drug Enforcement Administration), Amazon's ethically compromised Rekognition, and Face++. With the researchers' reminder above in mind, you can try experimenting with Fawkes yourself.
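For the curious, the Fawkes team distributes both desktop apps and a Python package. At the time of writing, the project's documentation shows an invocation along the lines below; treat the exact flags as an assumption that may change between versions.

    pip install fawkes
    fawkes -d ./my_photos --mode low   # cloak every image in the folder

Higher modes trade more visible distortion for stronger protection, so starting low and comparing the output against your originals is a sensible first experiment.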