Tech

Facebook has finally banned QAnon across all its platforms

The company will remove all Pages, Groups, and Instagram accounts linked to followers of the fringe conspiracy movement.

Protestors wave flags representing the QAnon conspiracy movement.
Stephanie Keith/Getty Images News/Getty Images

Facebook says it will ban all Pages, Groups, and Instagram accounts linked to the QAnon conspiracy movement, NBC News reports, a big step up from its previous policy of removing them only when they discussed violence. That policy, enacted in August, led to the removal of 1,500 Pages, Groups, and profiles.

QAnon is a fringe conspiracy movement that claims, among other things, that Democrats are working to take down President Trump through a series of elaborate cover-ups. For years, followers have claimed that Trump will eventually prevail and Democrats will be rounded up for execution.

The "threat of violence" standard — After Facebook's previous enforcement actions, The New York Times reported that new groups arose and found higher engagement than ever before. The groups evaded enforcement by masking their posts in innuendo and toning down discussions, never making direct threats of violence that would trigger action.

But adherents of the movement have taken it upon themselves to act on its claims anyway. In 2018, for instance, a Nevada man used an ammunition-filled truck to block traffic on a major bridge. During the ensuing standoff, he held up a sign that read, "Release the OIG report," referencing a QAnon theory that the Office of Inspector General was hiding a report on former FBI Director James Comey.

By outright banning the movement, Facebook is leaving no room for interpretation of whether a particular piece of content could lead to violence. If it's associated with QAnon in any way, it will come down.

“Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon. We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks,” Facebook wrote in a press release. “Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”

Enforcing the ban — Facebook may have banned QAnon, but it still has to actually find affiliated accounts. Followers of the movement have obscured themselves by modifying group names, such as replacing the letter 'Q' with the number 17, and by posting theories in groups unrelated to QAnon. At Facebook's scale, catching misinformation with automated and human moderation has always been a cat-and-mouse game.
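Facebook hasn't disclosed how its detection systems work, but a toy sketch illustrates the cat-and-mouse dynamic. Assume a hypothetical filter built on a keyword blocklist: an exact match is defeated the moment 'Q' becomes '17' (Q is the 17th letter of the alphabet), so moderators have to keep adding normalization rules for each new trick. Everything below, including the blocklist and substitution table, is illustrative, not Facebook's actual system.

```python
import re

# Illustrative only: Facebook's real moderation pipeline is not public.
BLOCKLIST = {"qanon"}

# Hypothetical substitution rules; 'Q' is the 17th letter of the
# alphabet, so followers have written it as '17' to slip past filters.
SUBSTITUTIONS = {"17": "q"}

def naive_match(name: str) -> bool:
    """Exact keyword check: trivially evaded by renaming a group."""
    return any(term in name.lower() for term in BLOCKLIST)

def normalized_match(name: str) -> bool:
    """Undo known substitutions before matching. Every new evasion
    trick requires another rule, hence the cat-and-mouse game."""
    text = name.lower()
    for alias, letter in SUBSTITUTIONS.items():
        text = text.replace(alias, letter)
    text = re.sub(r"[^a-z]", "", text)  # strip spacing/punctuation tricks
    return any(term in text for term in BLOCKLIST)

print(naive_match("17anon patriots"))       # False: evades the naive filter
print(normalized_match("17anon patriots"))  # True: caught after normalization
```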

Experts have criticized Facebook for allowing dangerous theories to spread at unprecedented scale, and say it locks users into echo chambers by recommending similar Pages to them. Its algorithms favor outrageous articles that are gaining traction, and by the time Facebook identifies and removes them from its network, they've often already reached millions of people.

Facebook hopes someday its machine learning technology will be good enough to catch the bulk of harmful content before it's shared widely. But the company still has to make decisions regarding what content is acceptable in the first place, a position it doesn't like to be in. QAnon is but one of many examples of Facebook's unwillingness to act quickly for fear of being labeled as biased.

The QAnon movement can still find other places to live online, but with Facebook being the largest social media platform in the world, losing it will stunt the movement's reach. Facebook's user base has skewed increasingly older over the years, and people over the age of 65 are disproportionately more likely to share false information than younger adults. QAnon theories often begin on fringe websites before making their way to Facebook, where they reach a far broader audience.