Tech

How extremist YouTube channels are covering their tracks to avoid bans

By leaving videos up only temporarily, conspiracy theorists rack up views and ad dollars before YouTube catches on.

A hand holding up a metal capital Q. Adrienne Bresnahan/Moment/Getty Images

Conspiracy theorists are using a new tactic to grow audiences on YouTube without facing punishment for spreading false and inflammatory information.

According to CNET, a network of more than 40 channels linked to QAnon has been uploading videos only to delete them days later — long enough to rack up tens of thousands of views and earn money from advertisements, but not long enough for YouTube to catch the offending videos and punish the channels.

QAnon is a conspiracy movement claiming that elites, including both Democrats and Republicans, are trafficking children and have organized to take down Trump before he can reveal their secret operation.

Platform problem — Followers of QAnon and other conspiracy theories have been pushed to the fringes as major platforms have aggressively targeted them over links to violent confrontations around the recent presidential election. But the new tactic provides a further reminder that major social media platforms are locked in an imperfect game of whack-a-mole to catch offenders.

YouTube relies on a combination of algorithmic detection and human moderators to review videos for policy violations. With more than 500 hours of video uploaded every minute, it can take the company days to review and take action against a video.

By deleting videos before they're flagged, conspiracy channels can build large audiences that watch each upload during its brief window online. Even though deleting the videos forfeits some ad money, removing them after a short time makes more sense than risking the loss of advertising privileges, or an outright ban that would force the owners to build a new channel from scratch.

Some of the videos associated with the channels netted anywhere from 8,000 to 150,000 views before they were removed.

Human > machine — YouTube says it has removed the channels identified by CNET, all of which appear to have been operated by a coordinated network: they uploaded the same videos and had identical biographies. The channels are possibly based in Vietnam, suggesting the owners are exploiting the U.S. political environment to make some quick cash.

It’s become a recurring theme for major platforms to tout their success at removing violative content, only for journalists and others to bring more of it to their attention. YouTube says that toward the end of 2020, just 0.16 to 0.18 percent of views on its platform went to videos that broke its rules. But the company conveniently doesn’t provide hard numbers, and its figures don’t include videos YouTube missed, or ones removed before they were flagged. It’s hard to quantify just how well YouTube is actually doing against its goals.

YouTube accounts for 11 percent of the revenue of Alphabet, Google's parent company. The whole business model is based on letting anyone upload videos and automatically placing ads against them. The reality is that these problems will continue as long as computers lack human-level judgment — or until YouTube entirely abandons its model as an open platform.