Earlier this month, Facebook drew ire from the academic community for banning a group of researchers from NYU who’d been looking into how political ads are served on the social network. Now another, similar project — this one from researchers at Princeton University — has been dismantled before it even really got its start.
Princeton’s Center for Information Technology Policy had been waiting months for Facebook’s go-ahead to begin its research. This week, the researchers helming the effort abandoned it entirely, citing Facebook’s regulations and contractual requirements as far too rigid and intrusive.
Facebook says it very much welcomes academic research — but very little third-party research is actually being done, thanks in no small part to Facebook’s complex policies around what data can or cannot be used by third parties. At this rate, it’s pretty clear that Facebook’s allowance of third-party research will only ever be on its own terms.
Facebook gets first dibs — Princeton’s reasons for discontinuing its study will sound very familiar to anyone who’s followed the NYU case. Orestis Papakyriakopoulos, one of the Princeton researchers, told Digiday the main sticking point was Facebook’s insistence that the company get access to any collected data before it’s handed over to the team.
Facebook requires all researchers to sign up for access to data through the Facebook Open Research and Transparency (FORT) platform. As part of this contract, researchers must give Facebook the right to review any draft publications before they’re released into the world. Nothing strange or authoritarian about that, right?
Privacy or excuses? — Facebook claims this review is a proactive measure to protect user privacy.
“The questions these researchers ask and conclusions they draw are not restricted by Facebook,” a company spokesperson said. “We simply ask academics to sign a research data agreement to ensure no personal data or confidential information is shared.”
Facebook’s working definition of “personal data” has, in practice, been stretched a bit thin. In the case of the NYU research team, Facebook said the group had been collecting personal data — when in fact the researchers had only been collecting data on advertisers, not on Facebook users.
Facebook has also contended in the past that its FTC rules, imposed after the Cambridge Analytica scandal, force it to take such an intense approach to research privacy. But the FTC actually wrote a letter to CEO Mark Zuckerberg after the NYU case stating that its agreement “does not bar Facebook from creating exceptions for good-faith research in the public interest.”
Whether or not Facebook is acting in good faith here and actually trying to protect its users is somewhat moot — what matters is that, for researchers, the company’s strict policies appear very much like deliberate obfuscation and evasion.