Facebook’s post-election mitigation plan is reactionary and dangerous

Facebook has plans in place to slow the spread of misinformation and calls for violence after the election — but they're reserved for if things get really bad.


Facebook is planning for what’s sure to be a chaotic post-election period with special internal tools designed for “at-risk” countries, people familiar with the matter have said. The Intelligencer reports that Facebook hopes these tools will curb the spread of any viral content related to the election and stop inflammatory posts from taking hold of the network.

While Facebook has yet to confirm specifics about the internal tools, the company has used similar ones in the past in countries like Sri Lanka and Myanmar. Executives said they hope to only deploy the tools in the United States in “dire circumstances.”

By all accounts thus far, Facebook’s mitigation strategy around this year’s presidential election has been the same as ever: reactionary, rather than preventative, and with huge room for error. As usual, it would be nothing short of a miracle for that strategy to actually work the way Facebook envisions it working.

Years of building — According to Facebook spokesman Andy Stone, the company has spent years working to make elections safer. “We’ve applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios.”

The outcome of most of those scenarios would be Facebook taking more control than usual over what shows up on users’ feeds. We’re not talking about just slapping an easy-to-miss label on someone’s post calling for shooting protestors. Depending on how bad things get, Facebook might even change its algorithms to be less lax.

Toeing the line isn’t enough — Facebook’s typical laissez-faire policies have faced harsh criticism this year, as hate speech and dangerous misinformation spread across the company’s platforms with efficiency. That’s forced CEO Mark Zuckerberg to take measures that seem dramatic in comparison to his usual free speech talking points — like banning political ads from election day onward.

This is Facebook’s new favorite tune: we’ll take action, but only once a problem has already escalated into something really bad, like life-destroying bad. This type of strategy all but ensures damage will happen before Facebook steps in. It’s a way for the company to continue being hands-off — pleasing users critical of its removal actions — while swooping in as a hero once the damage has already been done.

We've already seen this strategy produce disastrous results. While attempting to curb the spread of QAnon across its platforms after the fact, Facebook actually helped the conspiracy theory spread further. Or how about Facebook's decision to ban anti-vaccine ads after being the reason they proliferated in the first place?

Zuckerberg told Axios last month: “We need to be doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election.” Apparently “everything” only includes helping out after things have degraded into chaos.