LinkedIn's war against QAnon is far more effective than Facebook's

#WWG1WGA and other QAnon hashtags yield little to nothing on LinkedIn. Facebook could learn a lesson or two from it.

A QAnon supporter holds a Q sign for the movement.
Stephen Maturen/Getty Images News/Getty Images

LinkedIn isn't in the mood to play around with QAnon content. The conspiracy movement has increased its presence on the internet's professional networking service, according to The Wall Street Journal, but LinkedIn, unlike Facebook, has decided to tackle the issue head-on. It's banning accounts, removing hashtags, issuing warnings, and taking proactive measures to stop the spread of the group's unfounded ramblings.

LinkedIn's style of intervention — Input ran its own search of QAnon accounts on LinkedIn and found a few members with account summaries showing support for #QAnon, #MAGA, #RedPilled, and the like. But a search for the hashtag based on the group's motto, "Where we go one, we go all" (#WWG1WGA), yields zero results.

Still, there is work to be done. The results for pro-QAnon members lead to various individuals in different parts of the country, including Florida, New York, and the greater Los Angeles area, and those are just the ones we checked.

A LinkedIn spokesman told The Wall Street Journal, "QAnon misinformation is not tolerated on LinkedIn," but added that pro-QAnon users won't get banned unless they share false information on the network.

The rise of Q — For years now, QAnon conspiracy theories claiming that a secret, demonic cabal runs the world and sacrifices children and infants to gain influence and power in governments have been growing in prominence. The movement initially gained attention for its Pizzagate conspiracy, a theory that major Democratic operatives were running a child trafficking ring out of a pizza parlor in Washington, D.C., which was thoroughly debunked by fact-checkers.

Another theory popular among QAnon supporters is that there is a supposed 16-year plan to devastate the United States and make it spiral into social and political chaos. To that end, QAnon supporters have openly called for executing Barack Obama and Hillary Clinton on networks like Facebook and Twitter.

Facebook, learn from LinkedIn — LinkedIn's director of trust and safety, Paul Rockwell, told The Wall Street Journal that QAnon content has increased over the past few months on LinkedIn. But there's a straightforward explanation for that: with lockdowns in effect and more people staying home, users are spending more time online, and the current climate makes consuming conspiracy-laden content even more likely than usual (even Pinterest has these theories pinned on some boards). LinkedIn's strategy is to spot these posts and ban users who repeatedly violate the company's content policies.

It's not a perfect solution — QAnon members are still using the summary fields on their LinkedIn profiles to show support for the movement, and you can find results for #WWG1WGA by deleting the # in front of it, but at least LinkedIn is trying to keep up. Early intervention helps limit the spread of misinformation, and Facebook proved as much by counterexample: it let the problem fester until it couldn't control it anymore. LinkedIn, at least, seems intent on learning from Facebook's mistakes.