Amid the 2020 pandemic, you might forget that 2019 had its own public health disaster. Washington State saw two measles outbreaks last year, bolstered by anti-vaccination sentiment that festered online.
Analysis of millions of Facebook pages during the 2019 measles outbreak revealed that groups promoting distrust of science, like anti-vaccination groups, are better positioned to reach undecided people than public health authorities are. These groups also use diverse, appealing narratives to sway people toward their worldview.
These findings were published Wednesday in Nature.
Fake news might only be a small part of the American news diet, but fringe theories about vaccination have pushed themselves to the center of online discourse. During the 2019 measles outbreak, anti-vaccine communities grew by as much as 300 percent, whereas no vaccine-positive community grew by more than 100 percent. Most vaccine-positive groups grew by around 50 percent, the study reveals.
First author Neil Johnson is a professor of physics who studies collective behavior on social media. He tells Inverse that the power and reach of these fringe groups came as a shock.
"What we thought would be fringe is actually the core," Johnson explains.
Johnson's work is focused on measles, but he is now following coronavirus conspiracies, including the anti-coronavirus-vaccine sentiment. The parasitic tendencies that made anti-vaccine ideas propagate so successfully aren't unique to anti-vaxxers — they apply to many other fringe ideas related to the pandemic.
"We've been monitoring this every day, every hour, every minute," he says. "It is like a perfect storm."
The center of the universe – Johnson's study provides a map of how anti-vaxxer groups (technically called clusters) reach undecided people on Facebook.
You can think of one individual Facebook page (like the homepage of a group) as a cluster of people. When one of these clusters links to another page, a bridge forms between the two. People in one group are more likely to see content from the other in their feed.
In Johnson and his colleagues' map (seen below), green represents the undecided clusters. These are often new-parent groups, or people who are interested in vaccines but don't yet know how they feel about them. The red clusters are hard-core anti-vaxxers. The blue are public health messengers, like the Gates Foundation, that promote science and vaccines.
Among 100 million Facebook users interested in vaccines, the anti-vaxx clusters (red) tend to sit closer to the center of the network than the blue public health messengers. Because of that positioning, the anti-vaxx groups were able to reach more undecided people than those promoting real science.
This means that, while the blue pro-science groups in the graph above may look like they are reaching the green groups of undecideds, their peripheral position in the network blunts their impact. Those groups control only a few battlefields, while the red anti-vaxx groups sit at the center of it all.
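The network logic behind this finding can be sketched with a toy graph. In the sketch below (a minimal illustration with hypothetical page names, not the study's actual data or method), pages are nodes, links between pages are edges, and closeness centrality, the inverse of a node's average distance to every other node, stands in for how well positioned a cluster is to reach the rest of the network:

```python
from collections import deque

# Hypothetical toy network: nodes are Facebook pages (clusters), edges are
# links between pages. The layout is illustrative only.
edges = [
    ("anti_1", "undecided_1"), ("anti_1", "undecided_2"),
    ("anti_2", "undecided_2"), ("anti_2", "undecided_3"),
    ("anti_1", "anti_2"),
    ("pro_1", "undecided_3"),  # pro-science cluster sits on the periphery
]

# Build an undirected adjacency list
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def closeness(graph, node):
    """Closeness centrality: (number of other reachable nodes) divided by
    the sum of BFS distances to them. Higher means more central."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        cur = queue.popleft()
        for nb in graph[cur]:
            if nb not in dist:
                dist[nb] = dist[cur] + 1
                queue.append(nb)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

for page in ["anti_1", "pro_1"]:
    print(page, round(closeness(graph, page), 3))
```

In this toy layout the anti-vaxx cluster scores a higher closeness than the peripheral pro-science cluster, mirroring the paper's point that centrally placed clusters can reach undecided groups in fewer hops.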
Misinformation and coronavirus – Johnson says there are a few key reasons that anti-vaxxers tend to occupy a more central position in the network. One is that their ideology is promoted by many medium-sized groups, none of which attracts much individual attention, rather than by one or two large groups.
The way anti-vaccine groups deliver their message also adds to their appeal, Johnson explains.
Established medical groups and agencies tend to offer just one "flavor" of a message — that vaccines are safe and save lives. It's correct but "it’s a kind of vanilla message," says Johnson.
Anti-vaccine messaging is varied, so it can pull at many different strings of mistrust: mistrust of the government, mistrust of science, or simply intense concern over the health of a child.
"It almost sounds like the ideal ice cream parlor," Johnson says. "Go in and you can find any flavor you want."
Eventually, one flavor hits a soft spot that makes someone sympathetic to the anti-vaccine message. The variety also gives these groups an endless number of ways to link into seemingly unrelated communities.
Those widely appealing flavors are even more at play during the pandemic, says Johnson, because every group you can possibly think of is discussing Covid-19. That provides endless inroads for groups promoting their own agendas — from reopening states early to xenophobic messages related to coronavirus — to reach even more people.
"Some of [these groups] are things like pet lovers or dog walkers," says Johnson. "They usually talk about dogs or sport; now suddenly they're talking about Covid-19. Somehow that page got linked into these reds [the fringe theory groups red in his model]."
That establishes a link, and ultimately people who would never think to seek out false information are unwittingly confronted with it.
Stopping conspiracy theories – Importantly, Johnson also says that there are ways to break up these communities.
Johnson is currently investigating an anti-vaxxer community in Canada (he did not name the group) that has a CDC banner posted at the top of the forum.
"That's almost like a red flag to them. It's the CDC that they don't trust," says Johnson.
He proposes that the best way to break these communities up is from the inside. If you can point out ways in which members of these communities disagree with one another, he says, it can dissuade people from feeling that the community is actually serving the flavor they're looking for. The community becomes less of a refuge.
Now is a particularly good time to show would-be conspiracy theorists that they may not totally agree with their online communities, says Johnson. The present environment is already riddled with mistrust and fear; realizing that the group you're unwittingly joining contributes to that environment may dissuade further membership.
Public health agencies will still need to reach these vulnerable communities, but getting their message across may start with stopping fringe groups from controlling the discourse. Especially now that every group from business owners to dog walkers is talking about Covid-19, the balance could tip either way.
Abstract: Distrust in scientific expertise is dangerous. Opposition to vaccination with a future vaccine against SARS-CoV-2, the causal agent of COVID-19, for example, could amplify outbreaks, as happened for measles in 2019. Homemade remedies and falsehoods are being shared widely on the Internet, as well as dismissals of expert advice. There is a lack of understanding about how this distrust evolves at the system level. Here we provide a map of the contention surrounding vaccines that has emerged from the global pool of around three billion Facebook users. Its core reveals a multi-sided landscape of unprecedented intricacy that involves nearly 100 million individuals partitioned into highly dynamic, interconnected clusters across cities, countries, continents and languages. Although smaller in overall size, anti-vaccination clusters manage to become highly entangled with undecided clusters in the main online network, whereas pro-vaccination clusters are more peripheral. Our theoretical framework reproduces the recent explosive growth in anti-vaccination views, and predicts that these views will dominate in a decade. Insights provided by this framework can inform new policies and approaches to interrupt this shift to negative views. Our results challenge the conventional thinking about undecided individuals in issues of contention surrounding health, shed light on other issues of contention such as climate change, and highlight the key role of network cluster dynamics in multi-species ecologies.