Facebook's New Algorithmic Trending Curators Are Spreading B.S.


On Friday, Facebook announced that it was doing away with its human-curated Trending feature, which critics alleged was editorially biased, in favor of its algorithms. Now, just days after the hands-off popularity guide's release, Facebook has embarrassed itself: The algorithms picked up and boosted a story about Fox News host Megyn Kelly with "no factual information…to back up [the] claim."

In other words, Facebook removed the humans from its Trending equation, and the unbridled algorithm ran wild. Humans, especially those who patrol (and troll) the internet, are fallible: They’ll share and popularize complete bullshit. If these fallible humans share bullshit enough, these now-independent algorithms will regurgitate the bullshit. Viral content is aptly named, for it spreads like a virus.

A story’s appearance on Facebook Trending is like giving a pneumonia patient immunosuppressants: The contagion redoubles, and the story is taken as factual. In 2016, a story counts as justified when a reputable source reports it, and under human supervision, Facebook Trending had become a somewhat reputable source.

The new Facebook Trending look: Gone are the short descriptions.

But now, Facebook’s algorithms are subject to the internet’s whimsical, partial, collective mind. Facebook’s announcement makes it sound as though humans still retain some oversight: The change, Facebook writes, will “make the product more automated and will no longer require people to write descriptions for trending topics.” Facebook “always hoped” to make such a change, but would have preferred that its algorithms first learn how to write summaries from its curators’ habits.

The Trending employees, who are no longer with the company (to put it euphemistically), suspected that they were hired to train the artificial intelligence. Given Facebook’s admission, it seems their suspicions were spot on: Facebook hired journalists to aggregate news, and machines learned their habits. Machines, though, unlike human editors, clearly don’t have great bullshit detectors.

Facebook claims that there are “still people involved in this process to ensure that the topics that appear in Trending remain high-quality,” but the Megyn Kelly story suggests otherwise.
