YouTube has allowed advertisements to run on copies of a previously deleted video peddling dangerous misinformation about the coronavirus. Media Matters identified dozens of reuploads of the video that are still live and have collectively racked up millions of views. More than a dozen carry advertisements, including one with more than 1.7 million views.
The video in question was created by osteopath Dr. Rashid Buttar, and in it he makes a series of wild claims, including that the current pandemic was engineered by Bill Gates as a "false flag event to then mandate mandatory vaccines" that would be purchased from the billionaire himself, thereby benefiting him financially.
YouTube removed the original video a week after it was uploaded for discouraging people from seeking medical treatment. The company said earlier this month that it is working hard to delete dangerous videos surrounding the coronavirus "when they are flagged."
Pulling videos matters more than suppressing them — That last detail is important to note. YouTube struggles to identify this type of misinformation using automated systems because it's so nuanced, so the company relies on users to manually report the videos for review.
YouTube has also been promoting authoritative sources like the World Health Organization by ranking them higher in search results, among other measures, but as this situation highlights, bad videos can still reach many people before YouTube has a chance to pull them down. Even when they're not getting distribution on YouTube itself, the links can spread through other websites, so removing them altogether is important.
Proper moderation doesn't come cheap — As Media Matters notes, since many of the videos are monetized, YouTube is making some money off them because the company takes a cut of all ad revenue. That's not to say YouTube is acting in an insidious manner here. We've seen costs rise for all the tech giants as they invest heavily in tools to combat misinformation and other issues related to their massive, hard-to-police platforms. Facebook CEO Mark Zuckerberg has previously warned that addressing content moderation issues would be costly, and that was reflected in the company's earnings report late last year, when profit growth shrank dramatically.
Of course, these problems arise in the first place because of the nature of open platforms, and the companies should be spending more money on moderation. It's unclear how much YouTube spends on moderation because Google doesn't break out its costs, but the company has, like Facebook, promised to invest heavily in human review teams.
Zuckerberg and other tech leaders are hoping scalable AI will be ready to handle these problems soon, but it's not there yet. Facebook has also struggled with coronavirus misinformation, approving advertisements that promote false cures and other dangerous advice.