
Could Outside Moderators Save Facebook From Fake News?

Someone has to step up to the plate.

by Joe Carmichael
Getty Images (Justin Sullivan) / Pexels / Photo Illustration

Like the majority of Americans, Mark Zuckerberg, the creator of Facebook, has been in the middle of an existential crisis since the 2016 election. Zuckerberg insists that Facebook is not a “media company,” despite the service’s obvious struggle to curb fake news and misinformation, which became so widespread in late 2016 that it may have contributed to Donald Trump’s win over Hillary Clinton. After a period of denial, Zuckerberg finally stepped up and acknowledged that the company had a problem with fake news. Finding a solution, Zuckerberg said, will not be easy. But Keith Bilous, a social media expert with years of experience moderating public forums, says it may be time for the company to consider getting some outside help.

“When Facebook doesn’t stand up and say, ‘We’re a media company,’” and instead says, “‘We’re a technology company that connects people,’ it’s difficult for them to really stand behind any editorial guidelines,” Bilous tells Inverse. He is the founder and CEO of icuc.social, which he says is “one of the world’s largest social media management companies.” His company serves as a moderator for major brands and news organizations, including the New York Times and NPR. Bilous thinks Zuckerberg’s reluctance to admit that Facebook is a media company is the crux of the problem. “Until someone wants to take that responsibility, I don’t know if it ever gets fixed.”

When you gaze into the void... etc etc.

Getty Images / Dan Kitwood

“Identifying the ‘truth’ is complicated,” Zuckerberg wrote four days after the election. “We must be extremely cautious about becoming arbiters of truth ourselves.” But if Facebook doesn’t want to assume responsibility for curbing misinformation itself, how can it address the problem? External solutions, like Daniel Sieradski’s Chrome extension that flags links from untrustworthy sites, depend on users adopting them, which the demographic most vulnerable to fake news probably won’t do. Facebook alone has the ability to change all of its users’ behavior, but to do so it would have to admit that it is a media company and assume the responsibilities that come with that admission.

The moment Facebook admits that it’s a media company, it can set editorial guidelines, which would be far more effective than the run-of-the-mill, “common sense” rules that Facebook currently feels comfortable enforcing. (In essence, content must remain PG-13.) “The moment you say yes is the moment you have to draw lines around it,” Bilous says. “What is their editorial stance? Are they a right-wing media company, or are they a left-wing media company?”

When icuc.social helps big news organizations like the New York Times moderate comments, it looks to each organization’s particular editorial standards. If Facebook were to admit that it is a media company, it would be able to do the same.

“Somebody would then have to say, ‘OK, here are our guidelines; here’s the kind of content we want and don’t want to allow on our site,’ and then they’d manage the content in relation to those sets of editorial guidelines,” Bilous explains. And if it also hired a public editor, its users would not be left in the dark when Facebook made such decisions.

The social network’s misinformation dilemma intensified after the company laid off the human editors responsible for curating the site’s trending topics. The algorithms that took their place proved far more susceptible to picking up fake stories. A human editor wouldn’t make the same mistake — they could vet any trending story and filter out blatant lies, like the viral “Denver Guardian” story that alleged Hillary Clinton was responsible for the death of an FBI agent involved in her email scandal. But for all its effects on the media, Facebook still doesn’t think it should bear the same responsibilities toward its users or readers that any other ethical media company does.

The social network has the weight of the online world on its shoulders. 

Wikimedia Commons

Zuckerberg says that’s because his company’s primary focus isn’t news or media, but social networking. Yet the Pew Research Center estimates that 62 percent of Americans get at least some of their news from social media sites like Facebook.

“People share and read a lot of news on Facebook, so we feel a great responsibility to handle that as well as we can,” Zuckerberg commented on his post about the election on November 12. “But remember that Facebook is mostly about helping people stay connected with friends and family. News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance.”

But a few days later, Zuckerberg took another pass. In a post on November 19, he laid out several steps the company would take to tackle fake news. The solution, he wrote, would start with “better technical systems” for “stronger detection” of fake stories. It would also include cutting off fake news sites’ ad revenue and, possibly, introducing third-party verification of stories and the flagging of suspected false ones.

Bilous admits that whatever solution Facebook does pursue will be arduous. “It takes time to vet a news story. It’s a big task to investigate whether news is real or not, because even some of the most absurd news could be real,” he says. The task would be easier if Zuckerberg fessed up to the fact that his social network has outgrown its shoes, but Bilous understands why he might never do so. “I don’t know if it’s in his best interest to do so, and I don’t know if it will ever be in his best interest to do so,” he says. For starters, media companies “are valued differently than technology companies.”

But unless Facebook somehow ceases to grow, this won’t be its last bad habit. It may get through fake news rehab, but it’s too popular to stay clean for long. “We never had this problem a few years ago, or five years ago, or ten years ago, because Facebook usage wasn’t what it is today,” Bilous says. “What’s going to be the problem that we’re talking about when Facebook’s serving up two billion people, instead of 1.6? Every 500 million users, there’s a new problem.”
