Science

Facebook Just Launched Clever Method to Kill "Fake News"

Facebook is basically fighting its own algorithm now.

by Monica Hunter-Hart

As part of its campaign to fight fake news, Facebook rolled out a subtle new change to news feeds on Tuesday: a “related articles” widget will now appear before a user clicks on a story in their feed, instead of after.

This modifies a feature Facebook introduced in 2013, which showed three article suggestions once a user had finished reading a story and clicked back to Facebook. It’s a clever move in Facebook’s broader campaign against fake news. Facebook says it intends to “learn from the test, and apply what we learn to improve the product for everyone.”

The “related articles” widget was originally introduced to “help people discover new articles they may find interesting about the same topic,” in the words of Facebook’s announcement on Tuesday. The notice explained that the change will give users “easier access to additional perspectives and information, including articles by third-party fact-checkers.”

Facebook has come under criticism, especially over the last year, for allowing fake news and “filter bubbles” to proliferate on news feeds. Its algorithm is designed to give users more of what they already prefer to click on, so it’s easy for feeds to become political echo chambers and for untrustworthy publications to be read over and over again.
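To make that dynamic concrete, here is a minimal, purely hypothetical sketch of engagement-based ranking; the function names, scoring scheme, and example data are invented for illustration and do not reflect Facebook’s actual system.

```python
# Hypothetical illustration of engagement-based feed ranking (not Facebook's actual code).
# Stories resembling what a user already clicks on are scored higher, so the feed
# gradually narrows toward content that confirms existing preferences.

from dataclasses import dataclass


@dataclass
class Story:
    title: str
    topic: str


def rank_feed(candidate_stories, click_history):
    """Order candidate stories by how often the user clicked the same topic before."""
    topic_clicks = {}
    for story in click_history:
        topic_clicks[story.topic] = topic_clicks.get(story.topic, 0) + 1

    def engagement_score(story):
        return topic_clicks.get(story.topic, 0)

    # Higher predicted engagement floats to the top: the "echo chamber" effect.
    return sorted(candidate_stories, key=engagement_score, reverse=True)


if __name__ == "__main__":
    history = [Story("Tax cuts work", "politics-right")] * 5 + [Story("Cat video", "pets")]
    candidates = [
        Story("Why tax cuts fail", "politics-left"),
        Story("Tax cuts vindicated again", "politics-right"),
        Story("New kitten breed", "pets"),
    ]
    for s in rank_feed(candidates, history):
        print(s.title)
```

In this toy version, the opposing-viewpoint story always sinks to the bottom, which is exactly the pattern the “related articles” change is meant to counteract.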

Adjusting when “related articles” appear is likely to make a difference for those phenomena, even though it may seem like a small change. Humans possess an “anchoring bias,” as it’s called in psychology: a tendency to privilege the first piece of information we receive when making a decision. If we see our friend’s shared article first, we’re more likely to believe it than whatever we read next. If we initially see an article that was not curated by people within our social group, we’re more likely to be skeptical of the shared article when we read it. And that discourages users from letting shared content simply confirm their preexisting beliefs.

Facebook provided this example in its press release.

[GIF via Facebook]

Basically, Facebook is fighting against the tide of its own algorithm. Instead of allowing you to see only what you’re likely to prefer, Facebook wants you to also see what it prefers. Inverse reached out to Facebook for clarification on the formula it uses to determine which “related articles” will appear when users share a story, and how Facebook determines the reputability of sources. We’ll update when we hear back.
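Since Facebook hasn’t disclosed that formula, the following is only a speculative sketch of how related-article selection could balance topic relevance against source reputability; every name, weight, and data value here is an assumption, not Facebook’s method.

```python
# Speculative sketch only: one way "related articles" could be chosen by combining
# topic similarity with a source-reputability score (e.g., informed by third-party
# fact-checkers). None of this reflects Facebook's actual, undisclosed formula.

def pick_related_articles(shared_article, candidates, reputability, top_n=3):
    """Return up to top_n candidate articles, favoring same-topic, reputable sources."""
    def score(article):
        same_topic = 1.0 if article["topic"] == shared_article["topic"] else 0.0
        # Assumed weighting: relevance matters most, then source reputation (0.0 - 1.0).
        return 0.7 * same_topic + 0.3 * reputability.get(article["source"], 0.5)

    ranked = sorted(candidates, key=score, reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    shared = {"title": "Miracle cure found", "topic": "health", "source": "unknown-blog.example"}
    pool = [
        {"title": "Fact check: no evidence for cure", "topic": "health", "source": "factcheck.example"},
        {"title": "What the study actually says", "topic": "health", "source": "news.example"},
        {"title": "Celebrity gossip roundup", "topic": "entertainment", "source": "tabloid.example"},
    ]
    reputation = {"factcheck.example": 0.95, "news.example": 0.8, "tabloid.example": 0.2}
    for article in pick_related_articles(shared, pool, reputation):
        print(article["title"])
```

Under these invented weights, the fact-check and the reputable news story rank above everything else, which is the kind of outcome Facebook’s announcement describes, however its real system works.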

This rollout comes within the context of Mark Zuckerberg’s gradual reevaluation of his belief that Facebook should be a “neutral” platform, according to his recent series of interviews with The New York Times Magazine. For a long time, Zuckerberg insisted that fake news was not Facebook’s problem to solve, but he’s been slowly walking back that opinion by implementing changes like this one.

It also coincides with Google’s efforts to change its search engine algorithm to disadvantage fake news.

Facebook’s steps to combat misinformation may not yet be perfect, but the fact that it’s running tests like this is promising.