
YouTube is the latest platform to crack down on QAnon

The video site isn't banning QAnon outright, but it will remove any video that promotes the movement's conspiracy theories.


YouTube is the latest platform to crack down on the QAnon conspiracy movement. The company is taking a different approach than Facebook did with its recent ban, however. Whereas that company is banning any group or page associating itself with QAnon, YouTube will more broadly remove any content "that targets an individual or group with conspiracy theories that have been used to justify real-world violence."

QAnon, if you don't already know, is a blanket conspiracy theory suggesting that a "Deep State" of government employees is working in concert to take down President Trump. The movement's acolytes have been linked to a slew of violent crimes.

Broad moves — After Facebook banned the QAnon movement outright, critics worried that members would evade the policy simply by rebranding under more innocuous names. In essence, YouTube is saying it will take down QAnon content whether or not it's explicitly branded as such: promoting any conspiracy theory linked back to the movement will trigger a removal. QAnon's predecessor was called Pizzagate, so YouTube is acknowledging that the movement's branding may shift over time and that the company needs to be able to adapt more quickly than it has in the past.

While Facebook gets much of the attention for allowing the QAnon movement to spread for so long without action, experts say YouTube arguably did as much damage. That's because the site's recommendations would steer visitors from innocuous political videos toward progressively more conspiratorial and QAnon-related content.

YouTube, like Facebook, will have to figure out how to actually enforce the new policy. Algorithms can have a hard time detecting the nuances of human language, and much of the content on YouTube isn't reviewed until someone reports it, meaning a video can rack up millions of views before being deleted. YouTube has recently struggled to keep up with conspiracy theories about the coronavirus.

Mainstream reach — Banning QAnon from YouTube won't lead to its total demise; the movement festers on the 8Chan message board, an unmoderated outlet for extremism. But QAnon's theories get the most distribution when they jump from there to mainstream, more user-friendly platforms like YouTube, Facebook, and Reddit. Facebook's userbase skews older, and that demographic is disproportionately likely to share false stories. 8Chan, by contrast, is a clunky and unintuitive site, so banishing the movement to that niche corner of the web could prove effective at limiting it to the most extreme tinfoil obsessives rather than the casual believers.

You have to wonder why YouTube and all the other platforms are only cracking down now, and not, you know, after a believer opened fire inside a pizzeria he was convinced housed a child sex ring.

All of the major social media platforms have stepped up their enforcement measures in recent months, suggesting they may be preparing for a Biden administration interested in cracking down on misinformation. Biden has said he "hates Facebook" for allowing Trump to discredit mail-in voting, and he will consider reining in the Section 230 protections that shield platforms from liability over what their users post.