YouTube has announced its policy to protect the upcoming 2020 election. The company says in particular that it will delete any videos that encourage people to interfere in the election, or any videos containing election-related information obtained through a hack, harkening back to the infamous Wikileaks dump from 2016.
YouTube already prohibits deceptive content that may mislead people about voting, but today's update on interference is new. As one example, YouTube says it will remove any video that encourages viewers to "create long voting lines with the purpose of making it harder for others to vote."
In regard to information obtained from hacks, YouTube is surely thinking back to Wikileaks' involvement in the last election. When Hillary Clinton's internal campaign emails were leaked online in 2016, Democrats cried foul because emails from the Trump campaign weren't similarly exposed, giving him an unfair advantage. U.S. intelligence indicates that Russia is again interfering in the election to favor Trump.
Censorship cries — Some circles are sure to cry foul about YouTube's announcement today, particularly quibbling over what constitutes encouragement to interfere. It's also a gray area whether YouTube will be able to determine if certain information was definitively "hacked" rather than simply leaked by an insider, and whether the latter would violate its policies. But a regular reminder is in order: YouTube is a private company and has the right to choose what is and isn't acceptable within its walls.
Google was one of the major tech companies pilloried by conservatives during a recent Congressional antitrust hearing, with Republicans claiming the platforms unfairly censor them and favor Democrats.
Fighting a losing battle — YouTube says that as part of its efforts it will also elevate reputable channels in search results. Searching for a candidate will return an information panel with details about them, and a link to their official channel if they have one. The company has long surfaced credible sources for certain types of content, and it places fact checks in search results when certain terms are queried, such as "Does hydroxychloroquine cure coronavirus?"
YouTube has worked diligently to clean up misinformation on its platform that could pose real harm, such as fake coronavirus cures and claims that global warming is a hoax. At its scale, however, it still struggles and relies on users to report violating content – but by the time it removes such videos, they may have racked up millions of views. It's likely that videos violating these new policies will slip through the cracks.
Critics say it's long overdue for companies to begin policing their platforms for dangerous or misleading content, considering their reach and the speed at which manipulated content can spread. They point to broadcast television, which has always had higher standards for what's acceptable to air.