Dubbed a Russian “troll farm” by the US government, the Internet Research Agency (IRA) is a Russia-based organization that has conducted, and still conducts, covert social media campaigns in an effort to influence American politics. In 2016 alone, the IRA produced more than 57,000 Twitter posts, 2,400 Facebook posts, and 2,600 Instagram posts, all dedicated to swaying US public opinion on politics.
The idea that Russian bots like the IRA’s influenced Americans’ political opinions in the 2016 election captured the hearts and minds of people across the political spectrum. In 2017, former FBI agent Clint Watts testified to the Senate Intelligence Committee that bots designed to “socially engineer” swing voters in Midwestern states may influence their opinions of what is true and what is not.
But despite these bots’ potentially insidious efforts, a new report suggests Twitter bots run by the IRA aren’t actually that successful at changing American minds.
The paper, “Assessing the Russian Internet Research Agency’s impact on the political attitudes and behaviors of American Twitter users in late 2017,” was published Monday in the journal Proceedings of the National Academy of Sciences.
Do Russian bots influence Americans’ attitudes?
That finding runs counter to popular wisdom: The sheer scale of the IRA’s campaign to change the hearts and minds of Americans to suit its own agenda suggests it would have some effect, and experts and the country’s politicians tend to agree.
But scale and intent don’t mean that messages always influence public attitudes. We know from past research that political messaging doesn’t necessarily change people’s minds about issues or candidates they have already formed an opinion on.
In October and November 2017, researchers surveyed a total of 1,239 Republican and Democratic Twitter users about their political opinions.
They asked people to assess how “liberal” or “conservative” they were, whether they agreed or disagreed with conservative-leaning statements, and whether or not they would be unhappy if they had to socialize with a member of the opposing party or have them as an in-law.
The researchers also counted the number of political accounts each user followed before and after their two surveys, and calculated the ideological bias of each respondent’s Twitter network.
Part of what they were interested in was whether respondents interacted with any of the estimated 4,256 IRA Twitter accounts at any time between the October and November surveys. They then asked: Did those interactions change the users’ politics?
The majority of participants didn’t encounter IRA troll accounts — but a fifth did have some interaction. These interactions tended to be brief, however: They represented on average just 0.1 percent of their liking, mentioning, or retweeting.
None of the Twitter interactions resulted in any changes in political opinions, the researchers found.
Who is talking to Russian bots?
This may be down to who was most likely to encounter an IRA troll.
People with strong partisan beliefs were the most likely to encounter Russian troll accounts, the study found — and it’s hard to polarize a public that’s already significantly polarized.
This stems from a truth about human decision-making. A 2016 study explains that when people’s political beliefs are challenged, the parts of the brain linked to self-identity and negative emotions are activated. In turn, people double down on their opinion and are less likely to change their minds. This is also probably why so many people get into fights on Twitter with bots.
While this study shows that Russian bots are unlikely to influence political opinions of the already-converted, it doesn’t explicitly comment on the role of Russian influence on the 2016 election.
For one, the study sample is unrepresentative: It did not include independent voters, and participants were recruited to equally represent people who identify as “strong” or “weak” partisans. This group also used Twitter at least three times a week, more frequently than the average American. As a result, trolls may have a stronger influence over the sort of people excluded from this study, the researchers say.
"The American public may not be easily manipulated by propaganda."
But despite those limitations, the findings show that the American public “may not be easily manipulated by propaganda,” the authors write. What remains to be seen is whether these kinds of accounts will have any future effect. And other pernicious techniques used by the IRA may prove more influential than its Twitter accounts: According to Facebook, a number of IRA Instagram accounts have already been created in an effort to influence the 2020 election.
Abstract: There is widespread concern that Russia and other countries have launched social media campaigns designed to increase political divisions in the United States. Though a growing number of studies analyze the strategy of such campaigns, it is not yet known how these efforts shaped the political attitudes and behaviors of Americans. We study this question using longitudinal data that describe the attitudes and online behaviors of 1,239 Republican and Democratic Twitter users from late 2017 merged with nonpublic data about the Russian Internet Research Agency (IRA) from Twitter. Using Bayesian regression tree models, we find no evidence that interaction with IRA accounts substantially impacted 6 distinctive measures of political attitudes and behaviors over a 1-mo period. We also find that interaction with IRA accounts was most common among respondents with strong ideological homophily within their Twitter network, high interest in politics, and high frequency of Twitter usage. Together, these findings suggest that Russian trolls might have failed to sow discord because they mostly interacted with those who were already highly polarized. We conclude by discussing several important limitations of our study — especially our inability to determine whether IRA accounts influenced the 2016 presidential election — as well as its implications for future research on social media influence campaigns, political polarization, and computational social science.