
Facebook knew it was divisive and resisted reform anyway

Executives, all the way up to Zuckerberg himself, have gradually given up on changes for the social good.


In a Wall Street Journal report, details from several internal Facebook presentations and documents have come to light. Executives were made aware as early as 2016 that the platform’s recommendation algorithm encouraged divisiveness, and they essentially shrugged off solutions. Initial reform efforts dwindled away, largely at the whim of policy chief Joel Kaplan, over fears of perceived anti-conservative bias or of hampering the online presence of groups like the Girl Scouts.

They knew just how responsible they were — A 2016 presentation linked Russian bots and Facebook’s recommendation algorithm to users joining Groups with extremist ideologies in Germany. The presentation found that 64 percent of users in the data set joined due to prominent results in “Groups You Should Join” and “Discover.” WSJ sources confirmed the issue wasn’t limited to Germany or to Groups, though the Journal couldn’t get an official comment from Facebook. The 2016 election and the Cambridge Analytica scandal amplified these internal findings. Contrary to its shady public image lined with vague platitudes, Facebook was shaken behind closed doors.

Chris Cox, Facebook’s former chief product officer, spearheaded a 2017 task force called “Common Ground” as well as the creation of “Integrity Teams” throughout the company. These researchers and engineers spent about two years identifying pain points and potential solutions. Neutrality was a core value: a 2018 document stipulated that the teams wouldn’t focus on changing users’ minds, but rather would “increase empathy, understanding, and humanization of the ‘other side.’”

For heated arguments, the teams suggested letting moderators move those involved into a subgroup bubble to limit other members’ exposure. They also toyed with limiting the number of permitted replies, years ahead of Twitter’s new conversation settings. The Integrity Teams focused on the newsfeed found that most bad behavior came from a small number of people at the far ends of the political spectrum. Additionally, suppressing clickbait articles disproportionately affected conservative outlets, which would go on to fuel anti-Facebook rhetoric from the right.

Is this dinner or policy? — Kaplan, a former deputy chief of staff to President George W. Bush, took a special interest in these teams once he felt conservatism on Facebook was threatened. His “Eat Your Veggies” process, established to keep engineers’ political leanings from shaping products, devoured any hopes of reform. Carlos Gomez Uribe, who formerly ran the recommendation system behind your Netflix rabbit holes, was brought over by Cox to lead the newsfeed Integrity Team. The team’s proposals were either tossed or watered down past any semblance of efficacy.

Efforts to suppress politics-specific clickbait and to classify hyperpolarizing posts were crushed by Kaplan or by negative reviews from team members. To combat “super-sharers” who drown out average users, Uribe pushed “Sparing Sharing.” The policy would limit the influence of far-left and far-right users who engage with content in a bot-like manner, amplifying the voices in between. Kaplan reportedly used a hypothetical Girl Scout troop as an example of a temporary super-sharer.
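The Journal doesn’t describe how “Sparing Sharing” was actually built, but the gist — discounting a post’s ranking score when its sharer behaves like a bot — can be sketched. Everything in the snippet below (function names, thresholds, the linear falloff) is a hypothetical illustration, not Facebook’s implementation:

```python
# Hypothetical sketch of a "Sparing Sharing"-style ranking discount.
# Names, thresholds, and the linear falloff are illustrative only;
# the WSJ report does not describe Facebook's actual implementation.

def sharer_weight(shares_per_day: float,
                  normal_rate: float = 20.0,
                  bot_like_rate: float = 200.0,
                  floor: float = 0.2) -> float:
    """Return a ranking multiplier for a user's shared content.

    Typical users keep full weight (1.0); accounts sharing at
    bot-like volume are clamped down toward `floor`.
    """
    if shares_per_day <= normal_rate:
        return 1.0
    if shares_per_day >= bot_like_rate:
        return floor
    # Linear falloff between normal and bot-like sharing rates.
    t = (shares_per_day - normal_rate) / (bot_like_rate - normal_rate)
    return 1.0 - t * (1.0 - floor)


def rank_score(base_score: float, sharer_shares_per_day: float) -> float:
    """Discount a post's feed score by its sharer's weight."""
    return base_score * sharer_weight(sharer_shares_per_day)


# A post shared by an average user keeps its score; the same post
# shared by a hyperactive account is heavily discounted.
print(rank_score(100.0, 10.0))   # 100.0
print(rank_score(100.0, 500.0))  # 20.0
```

In a sketch like this, Zuckerberg’s eventual compromise — cutting the weighting by 80 percent — would amount to shrinking the penalty by the same factor, leaving super-sharers with most of their influence intact.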

By the time the policy debate reached CEO Mark Zuckerberg, he let it pass, but only after cutting the weighting by 80 percent. He also asked that similar issues not be brought to him again, citing his declining interest in products for the social good. At this point, it seems Zuckerberg and his top brass are fully tapped out, willfully believing Facebook doesn’t have the power to radicalize its users.

But now we know, without a doubt, that they’ve known better for years.