The conspiracy movement QAnon, which claims a "Deep State" is doing everything it can to take down President Trump, continues to thrive on Facebook despite a crackdown last month. According to The New York Times, growth in follower counts for related pages has slowed but not stopped, and engagement is higher than before the enforcement action.
None of this should be surprising. This is how it often goes with Facebook: the company fails to enforce its rules quickly and consistently at scale, whether out of incompetence or an unwillingness to do so.
Cat and mouse — These types of groups are dangerous because the conspiracy theories have been increasingly linked to real-world violence. Facebook deleted more than 800 QAnon-affiliated groups in a sweeping crackdown on groups that "suggest violence," but since the company relies heavily on automation, individuals have been able to evade detection by modifying the names of their groups in various ways. Some have replaced 'Q' in group names with the number 17, for example, because Q is the 17th letter in the alphabet.
Algorithms are rigid and unable to pick up on nuance, so motivated QAnon believers are playing a cat-and-mouse game, identifying how their groups get caught and making the changes needed to hide. Some groups ostensibly devoted to innocent topics like yoga have been repurposed as QAnon groups, and members rephrase their posts so that discussions of child sex trafficking read as tame conversation. One common claim among QAnon acolytes is that elites are trying to take down Trump because they run sex trafficking rings and fear he'll unmask them.
Failure to act — It's not just that the algorithms are failing. Critics say leadership at Facebook consistently fails to enforce its rules in a timely manner, and has been unwilling to remove groups associated with extremist movements even when alerted, because doing so might trigger cries that it's censoring conservatives. The #StopHateForProfit campaign saw large corporations temporarily cut their advertising spend on Facebook to push for more action against violent or false content. Since then, CEO Mark Zuckerberg has defended posts from President Trump despite an employee protest, and followed that up by telling staff he doesn't want them debating social and political issues at work.
Twitter has been much more proactive in removing false and inflammatory content from its platform. It also stopped selling political ads entirely, whereas Facebook says it will only stop selling them in the week before Election Day.
Hedging bets — Zuckerberg says he wants Facebook to be a bastion of free speech, but it's possible he uses that as cover to let violent or suggestive content slip through. In past instances when right-wing pages received multiple strikes for false content, executives reportedly allowed the pages to remain live because they spent significant money on advertisements.
The company could act more aggressively against dangerous content in line with its stated values; ironically, such content actually suppresses speech. But there's little incentive to do so considering Facebook's business is doing incredibly well despite the pandemic. The company might argue that further scaling up its enforcement teams would be costly, but it created this problem in the first place by building a platform that depends on being the town square. And with the slight possibility that Trump will be reelected, Zuckerberg may be making a cynical bet: wait out the criticism now in exchange for safety on the other side.