On Wednesday, Axios reported on research shared by the German Marshall Fund of the United States stating that conspiracy theories about Oregon’s wildfires are still rampant on Facebook. This news arrives days after Facebook announced that it would remove the theories, which generally blame “Antifa,” Black Lives Matter, or "leftist extremists" for the devastating wildfires affecting Oregon. Facebook spokesperson Liz Bourgeois said in a tweet that the company is using automation to take down relevant posts.
Nothing’s changed — The German Marshall Fund's Karen Kornbluh told Axios that 33 “Re-Open” Groups on Facebook have become hotbeds for conspiracy theories. On September 12, when Facebook announced it would remove the fire conspiracies, 11 of the 33 Groups circulated false claims that antifascists started the fires. Facebook also claimed the posts had started receiving warning labels and reduced visibility two days prior. Yet the content is still live on the platform, and researchers estimate that the number of times these posts have been viewed and engaged with hasn’t dipped in the past few days.
“There needs to be an approach focused on risk of widespread harm, as opposed to imminent harm,” Kornbluh told Axios. It probably wouldn’t hurt to get some actual people to assess the situation. The one-two punch of Casey Newton’s eye-opening report on Facebook moderators’ working conditions and the pandemic forcing everyone into their homes without adequate moderation resources has the company relying more and more on AI.
Now, this algorithmic dependence is not only allowing the spread of disinformation, but also draining resources from Oregon’s law enforcement, first responders, and 911 dispatchers. While distracting law enforcement from local peaceful protests might be considered a net win, clogging 911 lines as fires ravage the area is unconscionable. Both the Portland division of the FBI and the Douglas County Sheriff’s Office have publicly stated these conspiracies are false while urging the public not to waste local resources on them.
The radicalizing power of Facebook Groups and the platform's recommendation algorithm are hardly fresh-faced culprits in the dissemination of false information. But passing the buck to AI is the equivalent of a shrug emoji, especially when the stakes are this high.