Study shows lies perform better on Facebook if they’re right-wing-flavored

Share left-leaning lies on Facebook and they bomb. Share right-wing ones and they get lapped up.


Naturally, how much engagement news content receives on Facebook varies greatly depending on the source, what it’s about, and who the audience is. But a recent study shows that it often depends on the partisanship of the publisher, too. New research has found right-wingers’ falsehoods thrive on Facebook, while lies from leftists tend to perform poorly.

New York University's Cybersecurity for Democracy Center studied 2,973 Facebook pages belonging to American news outlets and noticed the worrying pattern. As Wired points out, the study can't show that Facebook deliberately amplifies right-wing misinformation, because the company won't reveal the reach metrics needed to establish that. But because Facebook prizes engagement, it's clear that right-leaning falsehoods far outstrip left-wing ones in how enthusiastically users react to them and how widely they're spread.

The study might explain why Facebook has struggled to combat hoaxes and incorrect news about COVID-19 and the 2020 presidential election. Or at least explain some of it. The other explanation is that Facebook only acts to combat lies when it’s got a reason to, otherwise it’s all just more fodder for the News Feed mill.

The methodology — The NYU team analyzed each page's content for partisan rhetoric and reliability with the help of two rating organizations, Media Bias/Fact Check and NewsGuard. Using those ratings, the team sorted outlets into five categories: far left, somewhat left, centrist, somewhat right, and far right. The team also checked whether posts spreading false claims, such as medically unsound advice or information, were being flagged as false. To measure activity and engagement with posts, the team used CrowdTangle, Facebook's own analytics tool.

The research spanned August 2020 through January 2021 and involved in-depth analysis of how many likes and comments these posts received. What the team found was eye-opening. "Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week," the study notes, "while non-misinformation sources had an average of 259 weekly interactions per thousand followers."

Meanwhile, “Center and left partisan categories incur a misinformation penalty,” the team added, “while right-leaning sources do not.” By “misinformation penalty,” the researchers mean “a measurable decline in engagement for news sources that are unreliable.” In other words, left-leaning readers appear less susceptible to misinformation and tend to punish unreliable sources rather than reward them.

Facebook disagrees — Facebook has, of course, questioned the accuracy of the paper's findings. A spokesperson for the social network told Wired, “This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook. When you look at the content that gets the most reach across Facebook, it’s not at all as partisan as this study suggests.” Fair enough, but we'd like more than just Facebook's word for it.

Regardless of Facebook's statement, the implications are clear: the network steers people into rabbit holes where fact and falsehood become indistinguishable and irrelevant, and it fosters echo chambers that amplify conspiracy theories, medical misinformation, and hate speech, worsening political polarization in the process. When you reward engagement rather than legitimacy, outrageous lies will always come out on top. If it bleeds, after all, it ledes.