Halloween Is Also Anti-Facial Recognition Software Day

The ghost in the system is a sheet covering your face.

Flickr.com/Daniel Go

Facial recognition software has improved tremendously over the past three decades. Yet, as technically accomplished as facial recognition has become, a rubberized Mark Ruffalo-as-Hulk mask is still enough to trick any electric eye. Halloween, it turns out, is a vacation from facial recognition.

Algorithms are good at identifying faces looking straight at a camera, which is why Facebook, for instance, tends to have a fairly good sense of who’s next to you in your selfie. But once you lose the eyes, the jig is up for many systems. Discussing the limitations of the FBI’s facial recognition software in the aftermath of the 2013 Boston Marathon bombing, for instance, Michigan State University biometric scientist Anil Jain pointed out that the software failed to identify the bombers largely because they never faced the security camera head (and face) on.
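That dependence on the eye region can be sketched with a toy similarity check. This is purely illustrative: the tiny 4×4 “face,” the pixel values, and the cosine-similarity matcher are all assumptions for the sake of the demo, not any real system’s pipeline.

```python
import math

# Toy "face template": a flat list of pixel intensities for a 4x4 grayscale
# face. The first two rows stand in for the eye region, the last two for the
# lower face (values are made up for illustration).
enrolled = [
    90, 30, 30, 90,   # eye row
    80, 20, 20, 80,   # eye row
    70, 70, 70, 70,   # lower face
    60, 50, 50, 60,   # lower face
]

def cosine_similarity(a, b):
    """Cosine of the angle between two pixel vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The same face captured again, with a little uniform sensor noise.
probe = [v + 2 for v in enrolled]

# The same face with the eye region (first 8 pixels) covered by a mask,
# replaced with a flat "fabric" intensity.
masked = [128] * 8 + enrolled[8:]

print(cosine_similarity(enrolled, probe))   # very close to 1.0
print(cosine_similarity(enrolled, masked))  # drops once the eyes are gone
```

Even in this cartoon version, hiding the eye rows drags the match score down more than ordinary capture noise does, which is the intuition behind “lose the eyes and the jig is up.”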

URME/Indiegogo.com

Or, if you really want to get weird with it, forget that knockoff V for Vendetta mask. Chicago-based artist Leonardo Selvaggio created the URME project in 2014: a line of surveillance-fooling masks built around a single theme, Selvaggio’s own face, offered as either a rubber prosthetic or, for the budget-minded, a paper cutout.

“Spoofing is a big issue for FR systems and so many people are trying to solve the ‘liveness detection’ problem, whereby computer vision discriminates between real and artificial faces,” David White, a facial recognition expert and psychologist at University of New South Wales, Australia, tells Inverse. “Some of the latex masks and even more rough and ready versions such as faces printed on cardboard are pretty good at fooling the systems.”

What might solve this issue is a hybrid computer-person system. In White’s recent PLOS study of face identification, professional facial examiners working alongside automatic face recognition software matched adult faces to their passport photos about 70 percent of the time. (These individuals weren’t in disguise; each face had to be picked out from a lineup of eight passport photos. An algorithm churning through a database of Australian passport images placed the correct photo in that set of eight 90 percent of the time. Presumably, a mask would set off even more alarm bells for a human monitor.) By comparison, 42 untrained students got it right only about 50 percent of the time.

White et al. PLOS

A computer alone, however? Much easier to fool. Covered-up faces are part of why the FBI is looking into identification via tattoos. So, yeah, Halloween doesn’t mean anonymity if you’re rocking some tattoo sleeves, but otherwise you’re free to get on with the anonymous haunting.

This post has been updated to include comments from David White.