After several months of denial and vague plans to make changes, Facebook CEO Mark Zuckerberg announced Thursday that the site would finally crack down hard on fake news, hiring outside fact-checkers to determine the veracity of articles that go viral on the site.
“Today we’re making it easier to report hoaxes, and if many people report a story, then we’ll send it to third-party fact checking organizations,” Zuckerberg wrote in a post. “If the fact checkers agree a story is a hoax, you’ll see a flag on the story saying it has been disputed, and that story may be less likely to show up in News Feed.”
The first part of the crackdown rests on making fake news easier to report for everyday users. Adam Mosseri, the VP of News Feed at Facebook, writes that the company is “testing several ways to make it easier to report a hoax,” but it looks like it will be a standard option available when you click the upper right-hand corner of a post. When you do that, you’ll see this (on mobile).
If enough people flag a story, Facebook will send it to a group of independent fact-checkers, which is probably smarter than evaluating news articles in-house. If those fact-checkers determine the story is bullshit, it will be flagged as disputed, and a link to the fact-checkers’ “why this is bullshit” post will appear underneath it. Here’s what that would look like:
As you can see, you’ll still be able to share disputed posts, but a popup will warn you that what you’re about to post is most likely some hot baloney. Facebook’s outside fact-checkers will all follow the International Fact-Checking Network’s code of principles, which, hopefully, will mean none of the toxic sludge on Facebook slips through the cracks.
Perhaps most importantly, any article flagged as fake news will be cut off from Facebook’s advertising revenue stream. The platform has already done this for several bogus sites, but the new policy institutes that cutoff on a far wider scale.
Zuckerberg also said that part of the problem of fake news is just that people don’t read the articles.
“We’ve also found that if people who read an article are significantly less likely to share it than people who just read the headline, that may be a sign it’s misleading,” he wrote. “We’re going to start incorporating this signal into News Feed ranking.”
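The idea behind that signal can be sketched in a few lines: compare how often people share a link after clicking through with how often they share it after seeing only the headline. Everything below is a hypothetical illustration, not Facebook’s actual implementation, and the function name, inputs, and threshold are all assumptions.

```python
# Hypothetical sketch of the read-vs-share signal Zuckerberg describes.
# Facebook has not published how the real ranking signal is computed;
# the names and the drop_threshold here are illustrative assumptions.

def misleading_signal(headline_views, headline_shares,
                      click_throughs, reader_shares,
                      drop_threshold=0.5):
    """Return True when people who read the article share it at a much
    lower rate than people who saw only the headline -- a possible sign
    the headline oversells the story."""
    if headline_views == 0 or click_throughs == 0:
        return False  # not enough data to compare the two groups
    headline_rate = headline_shares / headline_views
    reader_rate = reader_shares / click_throughs
    if headline_rate == 0:
        return False  # nobody shared on the headline alone; nothing to flag
    # Flag when the share rate collapses after people actually read it.
    return (headline_rate - reader_rate) / headline_rate > drop_threshold

# 8% of headline-only viewers share, but only 1% of readers do:
print(misleading_signal(10_000, 800, 2_000, 20))  # clickbait-like pattern
# Readers share at a slightly higher rate than headline-only viewers:
print(misleading_signal(10_000, 300, 2_000, 70))
```

In practice a signal like this would presumably be one input among many to News Feed ranking, demoting a story rather than removing it, which matches Zuckerberg’s wording about “incorporating this signal into News Feed ranking.”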