In findings that likely will not shock many readers, an analysis from The Washington Post strongly suggests that Facebook — alongside Twitter — inadvertently enabled the conspiratorial Goliath both seek to subdue today: QAnon.
The QAnon movement has proven to be a major headache for both networks, especially Facebook. The group positions itself as "truth seekers" who receive information from an anonymous leader named "Q," and its followers have embraced conspiracy theories ranging from the claim that the world is run by a secret satanic society that cannibalizes infants, to West Coast wildfires supposedly ignited by Antifa, to a purported 16-year plan to upend the United States as we know it. Followers name and call for physical violence against politicians like Barack Obama and Hillary Clinton while remaining passionately supportive of Donald Trump. In fact, the president himself has refused to condemn QAnon theories.
It sounds unhinged; the United States Military Academy's Combating Terrorism Center has said that QAnon "represents a public security threat with the potential in the future to become a more impactful domestic terror threat." Yet despite such clear warnings since at least 2018, Facebook chose a counterintuitive strategy: avoid cracking down on the posts, pages, and groups calling for violence in the name of "truth," and instead signal support for free speech to pacify conservatives who believe social media is "biased" against them. That ill-advised approach haunts the biggest social network today.
What Facebook says — By shying away from taking a clear and definitive stance against QAnon, Facebook allowed the movement to flourish. In contrast, Reddit and YouTube did considerably more by removing QAnon posts and videos from their platforms. Facebook spokesman Andy Stone insisted to The Washington Post that the company had exhausted all options to tackle the movement. "Removing hundreds of QAnon pages and groups, restricting the reach of many more, and soon prohibiting anyone from running ads that praise or support QAnon are not the actions of a company afraid of upsetting QAnon supporters," Stone explained. "It’s the important work we’ve done in consultation with outside experts."
Too little, too late — Despite removing 800 groups and banning 300 QAnon-related hashtags this summer, Facebook's crackdown on the movement proved stunningly insufficient. And too late. Instead of intervening early, Facebook allowed QAnon to exploit its infrastructure, including the recommendation engine that promoted QAnon content to previously unexposed users. The rest is history: the conspiracy group became so popular that QAnon merchandise took off, and dozens of Republican candidates now openly support the movement.
None of this will shock critics who have implored Facebook to reevaluate its position of treating violent conspiracy posts as protected free speech. Many predicted this outcome years ago, warning that Facebook would inevitably have to play a dangerous game of catch-up with a movement that openly calls for killing its opponents. There's a saying for that: you reap what you sow.