We know very little about how much private data tech companies like Google and Facebook hand over to law enforcement, and that’s partly by design. After all, some criminal cases would fall apart if suspects knew too much about how that data sharing works and could game the system.
But just how little do we really know? Google received 27,850 requests for data involving 57,392 of its users’ accounts in 2016. That same year, Microsoft was subjected to 9,907 requests directed at 24,288 accounts. These numbers are alarming, but they also amount to nearly all that the public knows about how the U.S. government currently deploys its power to request data under the Electronic Communications Privacy Act (ECPA). In fact, these aren’t even government figures; they come directly from Google’s and Microsoft’s own voluntary transparency reports.
“It’s completely reasonable for government officials to want some level of secrecy, so that they can perform their duties without fear of interference from those who are under investigation,” Jonathan Frankle, a researcher at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) said in a statement. “But that secrecy can’t be permanent … People have a right to know if their personal data has been accessed, and at a higher level, we as a public have the right to know how much surveillance is going on.”
Frankle and others at CSAIL are hoping to find a working solution to this problem, one built around the same cryptographic keys and ledgers used to authenticate encrypted emails and bitcoin transactions. The system they’ve developed, called AUDIT (for the “Accountability of Unreleased Data for Improved Transparency”), will be presented at the USENIX Security conference in Baltimore next week.
Here’s how it works: When a judge issues a secret court order, or police investigators request data from a tech company, that action is paired with a publicly posted cryptographic notification, similar in spirit to the public PGP keys that allow individuals to send encrypted emails to one another. This “cryptographic commitment” is mathematically tied to the court action taken, and later to the data handed over by the tech company to the government agency. The result is that, when these secret court records are eventually made public, they can be checked against the cryptographic ledger to confirm that the secret Justice Department activities were aboveboard and the officials involved were doing what they said they were doing.
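The core idea is a classic commit-then-reveal scheme. AUDIT’s actual protocol is more sophisticated than this, but a minimal hash-based sketch (all function names here are illustrative, not from the AUDIT paper) shows how a commitment can be published immediately while hiding its contents, then checked once the underlying record is unsealed:

```python
import hashlib
import secrets

def commit(message: bytes) -> tuple[bytes, bytes]:
    """Commit to `message` without revealing it.

    Returns (commitment, nonce). The commitment can be posted to a
    public ledger right away; the nonce stays secret until unsealing.
    The random nonce keeps the commitment from being brute-forced.
    """
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + message).digest()
    return digest, nonce

def verify(commitment: bytes, nonce: bytes, message: bytes) -> bool:
    """Check a revealed record against its earlier public commitment."""
    return hashlib.sha256(nonce + message).digest() == commitment

# At issuance time, the order is committed and the digest published.
order = b"Sealed order: produce account metadata"
c, n = commit(order)

# Once the record is unsealed, anyone can confirm the published
# commitment matches it, and that it wasn't quietly altered.
assert verify(c, n, order)
assert not verify(c, n, b"Sealed order: produce full message contents")
```

Because the hash is binding, officials cannot later swap in a different order than the one they committed to; because it is hiding, publishing the commitment reveals nothing about an ongoing investigation.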
AUDIT has another major benefit too. The constant uploading of actions into the public ledger of cryptographic commitments will allow watchdog groups to pull politically important statistics out of the court system about how the judiciary and law enforcement are using private user data. For example, which judges are issuing the most orders under ECPA? What types of criminal investigations prompt the most court orders, and which companies receive them?
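Such statistics fall out directly if each ledger entry carries a little non-sensitive metadata alongside its commitment. This is a hedged sketch of that idea, with an invented entry format; the real AUDIT design may expose aggregate counts differently:

```python
from collections import Counter

# Hypothetical ledger entries: the commitment hides the order itself,
# while a few public metadata fields make aggregate statistics possible.
ledger = [
    {"judge": "Judge A", "statute": "ECPA", "company": "Google"},
    {"judge": "Judge A", "statute": "ECPA", "company": "Microsoft"},
    {"judge": "Judge B", "statute": "ECPA", "company": "Google"},
]

# Which judges issue the most ECPA orders, and which companies
# are served most often?
orders_per_judge = Counter(e["judge"] for e in ledger)
orders_per_company = Counter(e["company"] for e in ledger)

print(orders_per_judge.most_common())    # e.g. [('Judge A', 2), ('Judge B', 1)]
print(orders_per_company.most_common())  # e.g. [('Google', 2), ('Microsoft', 1)]
```

A watchdog group could run exactly this kind of tally over the full public ledger without ever seeing the sealed orders themselves.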
The hope, as Frankle explained it to MIT News, is to generate credible transparency reports from the U.S. court system, comparable to the tech industry’s own, without compromising important criminal cases in progress.
Stephen William Smith, a federal magistrate judge for the Southern District of Texas who has written about the ECPA docket for the Harvard Law & Policy Review, shares Frankle’s expectations for what AUDIT might be able to accomplish.
“My hope is that, once this proof of concept becomes reality, court administrators will embrace the possibility of enhancing public oversight while preserving necessary secrecy,” Smith said in a statement. “Lessons learned here will undoubtedly smooth the way towards greater accountability for a broader class of secret information processes, which are a hallmark of our digital age.”