YouTube is being investigated by two U.S. advocacy groups for allowing videos touting fake COVID-19 cures and vaccines, TechCrunch reports. Between them, the two groups found “dozens” of these videos in the course of an 18-day investigation in March.
This is not the first time YouTube has been the subject of scrutiny around misinformation being spread on the popular video platform. It surely won’t be the last, either.
But YouTube says it’s cracking down on coronavirus misinformation. If that’s really the case, the discovery of these videos points to a much larger issue at YouTube: an inability to moderate its massive video library. That could have dire consequences for its community and, potentially, for those outside it, too.
Dangerous, full-fledged operations — The full report, which was compiled by researchers at the Digital Citizens Alliance and the Coalition for a Safer Web, is staggering in its breadth. While some of the purported “cures” are easy to spot as fake — one uploader uses the moniker “Real ID Card Fake Passport Producer” — others are more convincing.
For example, one video found by the researchers claimed to have N95 masks for sale, a product that’s in particularly high demand right now. The researchers contacted PharmaChem, the supposed company that uploaded the video. They held a full conversation with the company on WhatsApp and were led to believe they could purchase masks in bulk.
An internet-savvy user might recognize the warning signs and vet this seller before making a purchase. But many people browsing YouTube wouldn’t. Any scam is dangerous, but these are especially so, given their claims of medical use.
Misleading linking — Like many social media platforms, YouTube tries to counter misinformation by linking users to factual information — in this case, a link to the CDC’s website is automatically added to any video related to COVID-19. Researchers found that the link is added even to videos containing misinformation. They point out that the presence of this official link can actually make a fake video appear more legitimate to viewers.
YouTube says it’s on it — After being slammed by multiple U.S. senators last month for its mishandling of false COVID-19 information, Google took new measures to combat misinformation across its platforms. In YouTube’s case, the measures took the form of an automated review process, because many of the platform’s human moderators had been forced to self-isolate.
This isn’t enough — The auto-moderation process doesn’t seem to be enough. When YouTube announced it would be switching to auto-moderation, the company said it expected more videos to be removed than if humans were at the helm. Yet researchers found upwards of 60 videos that had been missed. Surely there are even more than that — and even one video is enough to cause serious harm. By publicizing their findings, the researchers hope Google and YouTube will seriously consider revamping their moderation practices.
YouTube has come under fire for its moderation failures many times. Earlier this year, Congress pleaded with Google to remove climate change misinformation circulating on the platform. And while some reports show YouTube has been somewhat successful in removing misinformation, it’s increasingly obvious that the company is still struggling with its scrubbing efforts. In the case of coronavirus misinformation, that could be deadly.