
Wikipedia plans to shut down election misinformation in real time

The site's dedicated task force is ready for any possible situation on election day and beyond. Or it hopes it is, at least.


As the U.S. Presidential Election draws near, every corner of the internet is preparing for worst-case scenarios. Wikipedia’s preparations are more intense than most: the site has created a dedicated task force to fight election misinformation in real time.

The site’s new Internal (Anti) Disinformation Task Force is made up of representatives from Wikimedia’s security, product, legal, trust and safety, and communications teams. The task force has developed a “playbook” with tons of possible scenarios for how election day could go down — a bare-minimum effort we’ve also seen at other companies like Facebook and Twitter.

Wikipedia is taking its precautions one step further by putting its dedicated team on the front lines during and after the election. The site also plans to lock down important pages so they can only be edited by trusted editors.

Transparency rules — Wikimedia, the nonprofit that hosts Wikipedia, wrote an extensive report about its plans for the election, which it released on its website late last week. It’s clear from the report that Wikipedia is taking the threat of misinformation very seriously indeed.

The foundation’s task force spent many long hours figuring out the best ways to prepare for coordinated attacks. The official 2020 U.S. Election article, for example, will only be open to editing by users with more than 500 completed edits and accounts older than 30 days.
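
To get a sense of what that rule amounts to, here is a minimal sketch of the eligibility check in Python. It is purely illustrative: Wikipedia’s actual page protection is enforced inside its MediaWiki software, and the function name and structure here are assumptions built only from the two thresholds described above.

from datetime import datetime, timedelta, timezone

# Hypothetical sketch, not Wikipedia's real implementation.
# Thresholds taken from the article: more than 500 edits,
# and an account older than 30 days.
MIN_EDITS = 500
MIN_ACCOUNT_AGE = timedelta(days=30)

def can_edit_protected_page(edit_count: int, account_created: datetime) -> bool:
    """Return True only if the account clears both thresholds."""
    account_age = datetime.now(timezone.utc) - account_created
    return edit_count > MIN_EDITS and account_age > MIN_ACCOUNT_AGE

# A two-year-old account with 1,200 edits qualifies; a brand-new one doesn't.
print(can_edit_protected_page(1200, datetime(2018, 10, 1, tzinfo=timezone.utc)))  # True
print(can_edit_protected_page(1200, datetime.now(timezone.utc)))                  # False

The point of requiring both conditions is that neither is hard to game on its own; together they make it expensive to spin up throwaway accounts just to vandalize a high-traffic page.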

Even more refreshing than the plan itself is the transparency with which it was disclosed to the public. That openness stands in stark contrast to the messaging of social media companies like Facebook, which has mentioned designated kill-switches and special internal tools, but only in the most abstract terms.

Will it be enough? — You know what: that's a really great question. There’s honestly no way to tell how election day — and the months following it — will play out, on the internet or otherwise. That said, Wikipedia comes across as clear-headed and prepared for the task ahead.

It’s helpful, at least, that Wikipedia’s election misinformation team is made up of humans. Facebook has been forced to shift much of its moderation work to AI of late — with fairly disastrous results. It's also promising that Wikipedia has put as many resources as it has toward this effort, given the organization's notorious lack of funding.

Wikipedia is, at this point, seen as a highly trusted source of information, so it only makes sense that the site would take the threat of misinformation seriously. That’s good both for the public and for the site’s reputation moving forward. Now if only better-funded platforms did as much to combat malicious actors and misinformation.