
How the "Knights of New" Became Fake News Pawns

Criticisms of Facebook aren't misplaced, but Reddit has problems as well.


Tim Weninger doesn’t read the internet because Tim Weninger doesn’t trust the internet. He trusts specific publications, never platforms, and never upvotes. His research into honest-to-God fake news has made him wary of the crowd and aware of how propagandists and opportunists can leverage social dynamics and simple math to foment confusion or make a buck. And, yes, he knew all this before the 2016 election because of a 2015 study he conducted on how social media affects the click rates of real and manufactured news stories. What he discovered was the potential for an epidemic of falsehoods.

Then the inevitable happened.

When they were first collected, the study’s findings were ahead of their time, looking at a problem that had yet to really hit the mainstream. Now, in the “fake news” era, the data and the method are both worth revisiting. Inverse spoke to Weninger about how news and fake news have evolved and what it felt like to see a problem coming and not be able to stop it.

Can you talk a bit about the research you’ve been conducting into news and social media?

What we are interested in is finding out how much online voting matters in swaying and influencing what news and media people see, and eventually how many people see that piece of media. The votes I have been looking at are on Reddit. We had a computer agent that went to Reddit every two minutes for several months and grabbed the newest post, regardless of content. Then the agent upvoted it, downvoted it, or did nothing as a control.

Four days later, we went back, collected that post, and looked at how many votes it had eventually received. Sometimes when we cast this vote (we call this vote injection), we did it immediately; other times we did it after thirty seconds, or a minute, or an hour. In general, we found that if I upvoted a post within the first hour, that post’s final score was significantly higher than if I hadn’t voted at all. If I downvoted the post, its final score was significantly lower than if I did nothing. Finally, posts that we upvoted were 25 percent more likely to reach the front page than posts we didn’t vote on.

We did the same thing for comments and found the opposite effect. Upvoting a comment had no effect on its final score, but downvoting dramatically hurt a comment’s final score on average.
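To make the protocol concrete, here is a minimal sketch of a vote-injection agent in Python, assuming the PRAW Reddit client. The credentials, the helper names (`inject_vote`, `final_score`), and the short loop are placeholders of my own; this is an illustration of the procedure Weninger describes, not the study’s actual code.

```python
# Sketch of the vote-injection agent described above, assuming the Python
# Reddit API wrapper (PRAW). Credentials and loop length are placeholders;
# the study ran every two minutes for months and revisited posts four days later.
import random
import time

import praw

reddit = praw.Reddit(
    client_id="CLIENT_ID",            # hypothetical credentials
    client_secret="CLIENT_SECRET",
    username="USERNAME",
    password="PASSWORD",
    user_agent="vote-injection-sketch",
)

def inject_vote():
    """Fetch the newest post on Reddit and randomly upvote, downvote, or skip it."""
    submission = next(reddit.subreddit("all").new(limit=1))
    treatment = random.choice(["up", "down", "control"])
    if treatment == "up":
        submission.upvote()
    elif treatment == "down":
        submission.downvote()
    return submission.id, treatment

def final_score(submission_id):
    """Re-fetch the post after the observation window (four days in the study)."""
    return reddit.submission(id=submission_id).score

# Treat the newest post every two minutes, recording ids and treatments so
# final scores can be collected later and compared across treatment groups.
observations = []
for _ in range(3):                    # a few iterations for illustration
    observations.append(inject_vote())
    time.sleep(120)
```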

The so-called ‘Knights of New,’ who are always on the lookout for the latest viral hit, are a pretty powerful subgroup, then.

That’s exactly right. When these users vote on new content, they drive the ranking bias effect: if you upvote something, it gains in popularity and more people see it. The more people who see it, the more people have an opportunity to vote on it; the more people who vote on it, the more people see it, and it becomes a kind of snowball effect.

Lots of other effects happen because of the importance of voting. For example, if you post something, you can upvote your own post and downvote competitors. People who vote early have a dramatic impact.

And that’s just one person’s vote. We didn’t test two people or three people or four people because that might really manipulate this live system, which would be unethical. Our single-vote injection had a bigger effect than we expected.
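The snowball Weninger describes is easy to see in a toy model. The simulation below is my own illustration, not part of the study: posts are ranked by score, viewers mostly look at the top-ranked posts, and some viewers vote. The visibility weights and vote probabilities are invented for the sketch.

```python
# Toy model of the ranking-bias feedback loop: posts are ranked by score,
# higher-ranked posts get more views, and views turn into more votes.
# All probabilities and weights here are invented for illustration.
import random

def simulate(n_posts=50, rounds=2000, injected_vote=1, seed=0):
    """Return the final score of post 0 after `rounds` simulated views."""
    rng = random.Random(seed)
    scores = [1] * n_posts                 # each post starts with its submitter's vote
    scores[0] += injected_vote             # the single "injected" early vote (0 = control)
    for _ in range(rounds):
        # Rank posts by score (random tie-break); viewers favor the top ranks.
        ranking = sorted(range(n_posts), key=lambda p: (scores[p], rng.random()), reverse=True)
        weights = [1.0 / (rank + 1) for rank in range(n_posts)]
        viewed = rng.choices(ranking, weights=weights, k=1)[0]
        if rng.random() < 0.3:             # some viewers vote, mostly upvotes
            scores[viewed] += 1 if rng.random() < 0.9 else -1
    return scores[0]

def average_final_score(injected_vote, trials=20):
    """Average post 0's final score over several random seeds."""
    return sum(simulate(injected_vote=injected_vote, seed=s) for s in range(trials)) / trials

# A single early upvote tends to snowball into a noticeably higher final score.
print("with early upvote:   ", average_final_score(1))
print("without early upvote:", average_final_score(0))
```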

When you see current events, what do you think? Does our political situation seem like just the continuation of what you’ve been studying? Or are things actually getting worse?

Your votes matter significantly. What we’ve been finding is that the things that may become popular aren’t always the ones that would become popular under different ranking systems. It’s important to understand the social bias effects and the ranking bias effects we see in social media.

I don’t want to get too much into the political ramifications of this, because I think it’s unwise for a lot of scientists. The ideas we’re talking about apply equally to liberals and conservatives, Yankees fans and Red Sox fans. For media in general, I think we have to come to terms with the fact that the idea of popularity has changed in the past five or ten years. It used to be, not that long ago, that there were professional journalists and mass communication experts and editors who decided what was important and who ought to hear about it. That was a handful of people at television stations and newspapers who decided what news and opinion was “fit to print.” Now that has flipped: we have CNN reporting on what’s trending on Twitter.

Now the people decide what is popular. This isn’t a bad thing, but it’s certainly different. It is critical that we understand thoroughly how this new paradigm differs from what used to happen five or ten years ago.

These ideas mostly center around genuine voters, but what about deliberately malicious bad actors? What impact do they have?

We don’t have any direct evidence that there are bad actors manipulating things, but even without evidence I can assume that it certainly happens. There has been some recent investigative journalism showing how people can buy votes and use them to promote a product, and people buy Twitter followers all the time. On Reddit, you can easily go and buy votes.

How can social networks minimize these issues?

In social media, there are several types of social influence effects. On Reddit, we have the ranking bias effect. On Facebook and Twitter, there are other network effects called cascading effects. For example, if someone popular retweets your tweet, it’ll get more views; then more people will retweet what popular people retweet. The key is to understand how these influences shape your system, whatever system it is. Whether it’s Reddit’s voting system or Twitter’s retweet system, we can hopefully control for these biases to find out what is truly trending, or what is truly an article or image of high quality.

Are you aware of any differences in how news propagates through largely anonymous networks versus largely friend-based ones?

There are big differences, actually. That’s why I study Reddit and not Facebook or Twitter. On Facebook and Twitter, you see what your friends have decided to share, which is fine: their liking or sharing is a type of personal recommendation that you trust. By following someone, you are telling the system that you trust that friend and the news and opinions they share.

On Reddit, and other social non-networks, it’s an anonymous system. There is no social network on Reddit, so the dynamics are completely different. There is no explicit personal recommendation of information on Reddit; there are only upvotes and downvotes. If more people upvote a post, more people see it. There is no identified person behind the sharing of that information, so users need to trust the anonymous crowd and its anonymous votes.

Generally, where do you see news distribution going in the future?

People need to understand that what you see on Facebook is not Facebook telling you that this is important. It’s your friends who are sharing what they like, but what they share is only what has been previously shared or submitted by others. These things are not always immune to manipulation. It is possible, and in fact likely, that a lot of the things we see have been purchased, which, whether we realize it or not, affects the people and information we trust.

The second thing is to help make social media more immune to certain social influence biases: to try to understand the dynamics that underpin our media distribution systems, and to provide people with high-quality content. Social media systems want the things they recommend to users to be of high quality, because that’s what keeps users coming back to their websites, but it’s not yet clear that they’re doing a good job. What they’re doing instead is sharing what is popular, and what is popular is not always of high quality.

It’s a matter of trying to bring popularity and quality, whatever quality means, closer together. How we do that, I don’t know. I do know that it’s going to be extraordinarily important in the future.

This interview has been edited for brevity and clarity.
