Facebook is in the midst of its most tangled and depressing PR crisis in years, following the leak of internal research showing the company is more than aware of its toxic effects on teenagers. The person who leaked all that research revealed herself this weekend in an interview with 60 Minutes. And, as you might expect, she has a lot to say about Facebook’s business trajectory.
Frances Haugen joined Facebook in 2019 with the express desire of fixing the company from the inside. She’d watched friends have their lives taken over by conspiracy theories and misinformation, she said, and told recruiting managers the only way she’d be interested in working at Facebook would be to fight directly against that tide. Haugen left Facebook in April of this year because, as she tells it, she realized the problems she’d joined the company to solve were unfixable. The 200-person Civic Integrity team she worked on was fighting a losing battle, an untamable forest fire. So she collected as many internal documents as possible — thousands, it turns out — and sent in her resignation letter.
Hate for profit — As Haugen sees it, Facebook’s most pressing problem is that it focuses so intensely on driving traffic and, therefore, profits, that it sacrifices safety in the process. She tells 60 Minutes host Scott Pelley that the company has been fighting a war within itself: between what’s good for the company and what’s good for the public. And she says that “Facebook chose over and over again to optimize for its own interests.”
This sentiment hits particularly hard because watchdogs and experts have been consistently sounding the alarm on Facebook’s trade of hate for profit for years now. Hearing it directly from the source — and from someone tasked with fixing that very problem — is damning.
About the algorithms — Haugen links Facebook’s hate problem back to new algorithms the company introduced in 2018 to amplify content users might like to see on their feeds. The algorithms are meant to drive engagement, keeping users happy and occupied. But hate ended up being what users wanted — or at least enough did that the algorithms turned into a megaphone.
Documents disclosed by Haugen prove not only that this is a problem at Facebook but one the company knows about intimately. The company’s internal research found that “misinformation, toxicity, and violent content are inordinately prevalent” in content shared by users and amplified by Facebook algorithms.
Damage control — Facebook, meanwhile, is on full damage control duty. Facebook’s director of policy communications, Lena Pietsch, sent in an extensive series of statements to 60 Minutes in response to Haugen’s interview. Each statement is (as you might expect) a redirection or straight-up denial of Haugen’s sentiments. Here’s Pietsch on Facebook’s hate profits, for example: "Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business. Our incentive is to provide a safe, positive experience for the billions of people who use Facebook. That's why we've invested so heavily in safety and security."
Nick Clegg, Facebook’s VP of global affairs, even went so far as to appear on CNN’s “Reliable Sources” to defend Facebook ahead of Haugen’s interview. He called her assertions “ludicrous” and claimed blaming such large issues on Facebook gives people “false comfort.”
Haugen left the company with her treasure trove of research files in the hopes that leaking them would spur legislation and oversight from those in power. She will testify in front of the Senate tomorrow to make her case clear.