
The FBI's Facial Recognition Technologies Disrespect Your Privacy and Rights

A government report exposes flaws in the FBI's facial recognition systems.

Kai Oberhäuser

Who’s the watchdog’s watchdog? The Government Accountability Office, as its title suggests. The GAO’s recent report on the government’s use of facial recognition tech found its way to the public on Wednesday. Essentially, we got ourselves a situation. A bad situation. In short: With its facial recognition programs, the FBI is violating your privacy and not telling you about it. Plus, the FBI does not test to ensure that these programs are accurate and reliable.

The FBI and biometrics go way back; the loving relationship began with fingerprint databases. But it goes much further than that now. “Biometrics,” to the FBI and according to the GAO, means “the automated recognition of individuals based on their biological and behavioral characteristics.” In 2011, the FBI rolled out the pilot of its “Next Generation Identification” (NGI) system; four years later, in April 2015, it was fully operational.

The NGI in part uses the Interstate Photo System (IPS) for these so-called biometrics. This database, together with external databases held by state and federal partners, gives government agencies access to over 411 million face photos (the current U.S. population is only 319 million). The Department of Justice claims that most of these photos are mug shots (“Over 80 percent of the photos in NGI-IPS are criminal”), but the FBI also teams up with numerous states to gain access to driver’s license and passport photos. The GAO says that the moment a suspect is “booked for a crime,” he or she is in the books, i.e., in the NGI-IPS.


What’s not exactly clear from the report is how much the FBI diddles with your social media photos. The FBI claims it doesn’t include them in the NGI-IPS, but it does use them:

“…photos taken from security cameras or social media photos are not enrolled into NGI-IPS… According to the FBI, the external photo databases do not contain privately obtained photos or photos from social media, and the FBI does not maintain these photos; it only searches against them.”

But it’s worth noting just how much of a goldmine social media could be. Your computer’s photo app has the ability to recognize you and your friends. Facebook does, too, and it’s disturbingly good at cataloguing each person in a photo. Each time you use a Snapchat lens, Snapchat maps your face’s unique contours. And it doesn’t stop there — these technologies are practically everywhere.

Regardless, it’s not only the FBI having fun with your face: the access extends down to local police departments. (There’s also this: “The Department of Defense’s face recognition system is used to support warfighters in the field to identify enemy combatants.”) The FBI is the only agency with direct access, but everyone else can simply request it. Once access is granted, the agency in question sends a photo to the FBI and requests a specific number of potential matches. (The photo may be high resolution, but, knowing security cameras, it’s likely not.) The NGI-IPS then runs the photo against its database, after which, the GAO says, “human analysis must be performed.” A team of “29 trained biometric images specialists in FACE [Facial Analysis Comparison Evaluation] Services” gives the final word. At last, the requested number of potential matches finds its way back to the agency. It’s then up to that agency to determine whether or not these apparent suspects are actual suspects.
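To make that pipeline concrete, here’s a rough sketch of the flow the GAO describes. Every function name, file name, and score below is hypothetical; this is a mental model of the process, not the FBI’s actual software:

```python
# Hypothetical sketch of the search flow the GAO describes: an outside
# agency submits a probe photo and a requested list size, the system
# ranks enrolled photos by similarity, and trained examiners review
# the list before anything goes back to the agency.

def similarity(probe, enrolled):
    """Stand-in for a face recognition comparison; returns a score."""
    return 0.0  # a real system computes this from the two faceprints

def human_review(candidates):
    """Stand-in for the FACE Services specialists' manual check."""
    return candidates

def search_ngi_ips(probe_photo, requested_matches, gallery):
    ranked = sorted(gallery,
                    key=lambda photo: similarity(probe_photo, photo),
                    reverse=True)
    candidates = ranked[:requested_matches]
    # Per the GAO, the algorithm's ranking is never the final word:
    # "human analysis must be performed" before results go out.
    return human_review(candidates)

# Illustrative call: the requesting agency gets back a candidate list,
# not a single confirmed identity.
leads = search_ngi_ips(probe_photo="suspect.jpg", requested_matches=50,
                       gallery=["photo_a.jpg", "photo_b.jpg"])
```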

Here’s how it works:

“Specifically, the technology extracts features from the faces and puts them into a format—often referred to as a faceprint—that can be used for verification, among other things. Once the faceprint has been created, the technology can use a face recognition algorithm to compare the faceprints against each other to produce a single score value that represents the degree of similarity between the two faces.”
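In practice, a faceprint is usually just a vector of numbers, and the “single score value” is a similarity measure between two such vectors. Here’s a minimal sketch; the 128-dimensional vectors and the cosine-similarity metric are illustrative assumptions, since the GAO doesn’t disclose which algorithm the FBI uses:

```python
import numpy as np

def compare_faceprints(a, b):
    """Cosine similarity between two faceprints: 1.0 is a perfect match."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two hypothetical 128-dimensional faceprints extracted from two photos.
rng = np.random.default_rng(seed=42)
faceprint_1 = rng.normal(size=128)
faceprint_2 = rng.normal(size=128)

print(compare_faceprints(faceprint_1, faceprint_2))  # near 0: likely different people
print(compare_faceprints(faceprint_1, faceprint_1))  # 1.0: identical faceprints
```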

Depending on your familiarity with science fiction, the whole setup may or may not sound like a reliable, trustworthy system. It’s not. Here’s a section header from the GAO’s report:

“FBI Has Limited Information on the Accuracy of its Face Recognition Technology Capabilities.”

All the FBI knows is that, when it tested the program’s accuracy at returning 50 potential matches (which it did, unnervingly, with a test database, not the real database), the system was 86 percent accurate: the right person showed up somewhere in the 50-candidate list 86 percent of the time. The FBI has “not assessed the accuracy of face recognition searches of NGI-IPS in its operational setting — the setting in which enrolled photos, rather than a test database of photos, are used to conduct a search for investigative leads.” Nor does it check how accurate its external partners’ facial recognition systems are: there are no audits.

Eighty-six percent accurate. In other words: about four out of five stars. If your local sushi restaurant had four out of five stars, you might consider ordering takeout. But just as a low number of reviews can skew a restaurant’s rating, a generous candidate list can flatter the NGI-IPS’s accuracy. Agencies can request anywhere from two to 50 potential matches, and 50, the most forgiving setting, is the only list size the FBI has tested. The GAO says that the smaller the list, the less accurate the system.
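A toy simulation shows why. Below, the true match’s score is noisy, so it doesn’t always outrank every lookalike in the gallery; the shorter the candidate list, the more often the right person gets cut. The score distributions and gallery size are invented purely for illustration:

```python
import random

def detection_rate(list_size, trials=2_000, gallery_size=1_000):
    """Fraction of searches whose true match survives into the top `list_size`."""
    hits = 0
    for _ in range(trials):
        true_score = random.gauss(3.0, 1.0)   # true match tends to score higher
        impostors = (random.gauss(0.0, 1.0) for _ in range(gallery_size))
        rank = sum(score > true_score for score in impostors)  # 0 = top of list
        hits += rank < list_size
    return hits / trials

for k in (2, 10, 50):
    print(f"candidate list of {k:>2}: true match returned "
          f"{detection_rate(k):.0%} of the time")
```

In this toy model the hit rate climbs steadily with list size, which is exactly why an accuracy figure measured only at 50 candidates says little about the two-candidate searches agencies are allowed to run.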

There’s also the whole pesky false positive thing. “FBI officials stated that they have not assessed how often NGI-IPS face recognition searches erroneously match a person to the database (the false positive rate).” False positives are — pardon my language — serious fuck-ups, because they “can alter the traditional presumption of innocence in criminal cases by placing more of a burden on the defendant to show he is not who the system identifies him to be.” The system can break down on either end of the equation: if the photo is poor quality, success rates plummet; if the software is poor quality, success rates plummet. And the worse the system performs, the more of the “matches” it returns are innocent people.
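False positives also compound with scale. The arithmetic below uses the report’s 411 million photo figure and a made-up per-comparison false match rate, precisely because, as the GAO notes, the FBI has never measured the real one:

```python
# Back-of-the-envelope: a tiny error rate times an enormous gallery
# still yields a pile of innocent "matches." The rate below is a
# hypothetical placeholder, not a measured FBI figure.

gallery_size = 411_000_000   # face photos searchable, per the GAO report
false_match_rate = 0.00001   # hypothetical 0.001% chance per comparison

expected_false_matches = gallery_size * false_match_rate
print(f"Expected false matches per search: {expected_false_matches:,.0f}")
# Expected false matches per search: 4,110
```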

The GAO, after completing its report, gave U.S. Attorney General Loretta Lynch and FBI Director James Comey six recommendations. The gist: the Department of Justice has an opacity problem. The public may know that its privacy is being violated, but it doesn’t know specifically how. The Department of Justice must also remedy its lack of oversight so that privacy oversteps are kept to a minimum. Finally, it must make sure, albeit retroactively, that these programs are reliable.

The Department of Justice agreed in full with only one recommendation. It “partially agreed” with two more but rejected the remaining three, two of which would have established accuracy requirements.
