"Be both skeptical and forgiving of new research on Covid-19."


What the Sturgis superspreader report actually says about Covid-19

In August, thousands attended the Sturgis Motorcycle Rally. A controversial report on the event claims it can be linked to 266,796 new Covid-19 cases.

In August, thousands attended the Sturgis Motorcycle Rally in South Dakota for ten days of “riding, food, and music.” Approximately 395,453 vehicles passed through, a 7.5 percent decrease from 2019.

The cause of the decrease hasn’t been officially explained, but the rally took place in the middle of the coronavirus pandemic. As Smash Mouth lead vocalist Steve Harwell put it during the band’s concert at the rally, “Fuck that Covid shit.”

Harwell’s statement is the opening salvo of a disputed report that brands the Sturgis rally a “superspreader” event. Released this week by San Diego State University’s Center for Health Economics & Policy, the report, which is not peer-reviewed, estimates that 266,796 new Covid-19 cases throughout America could be linked to the 460,000-person event.

It was an attention-grabbing data point, and the report, which was published Saturday, quickly went viral. Initial reactions oscillated between some who described the attendees as “stupid” and others who branded the numbers “fake news.”

From there, the controversy surrounding the report started to build. The estimate drew criticism from government officials and scientists. But what the report may truly represent, experts tell Inverse, is a fundamental misunderstanding between the public and researchers over how the process of scientific inquiry works, set against a backdrop of incredibly high stakes and slippery, uncertain data.

In the wake of the event, the South Dakota Department of Health has only found 124 cases linked to the rally through contact tracing, while a survey of health departments by The Washington Post indicates that at least 260 cases across 11 states are connected to it.

Meanwhile, South Dakota Governor Kristi Noem called the report “grossly misleading.” The rally’s organizers also put out a statement saying the report is “blatantly faulty.” Leading scientists, like Ashish Jha, dean of the Brown University School of Public Health, said the report’s methodology didn’t “pass the sniff test.”

“Working papers” like this report, the equivalent of a “pre-print” in the medical literature, are assumed to have some issues before they finish undergoing peer review. But the pressure to publish Covid-19 research, and fast, has pushed researchers to present reports like this one to the public as soon as possible. In the best case, the research helps people make better choices and guides further science. In the worst, it can erode public trust in science if the research is judged to be flawed.

In between lies a situation that Yotam Ophir, an assistant professor at the University at Buffalo who specializes in health and science communication, describes as “truly sad.”

“By not communicating the preliminary nature of the findings carefully enough, the scientists allowed the debate to shift away from what really matters — the fact that the rally was utterly and ethically irresponsible,” Ophir tells Inverse.

What the study claims, and what it can’t — The working paper took anonymized cell phone data and compared it to CDC case counts to show that, a month after the rally, Covid-19 cases in Meade County, the western South Dakota county that is home to the city of Sturgis, increased by approximately 6 to 7 cases per 1,000 residents. It also found that the counties which contributed the highest inflows of rally attendees experienced a 7 to 12.5 percent increase in cases.

The report’s authors argue the rally was a “superspreader” event, driving 266,796 new cases and generating public health costs of $12.2 billion. This price tag is based on an estimated cost of $46,000 per non-fatal Covid-19 case severe enough to require hospitalization, applied across the estimated new cases.

"... the scientists allowed the debate to shift away from what really matters — the fact that the rally was utterly and ethically irresponsible."

There are two important points to note here. The first is that the report is primarily an economics paper, not a piece of epidemiological research. The second is that the actual cost of Covid-19 hospital stays can vary immensely, and the authors note that “this is by no means an accurate accounting of the true externality cost of the event,” but argue it’s useful as a ballpark estimate of what superspreader events can cost.

Kevin Griffith is an assistant professor of health policy at Vanderbilt University School of Medicine who was not involved in writing the working paper. He tells Inverse that, in his assessment of the report, “Sturgis likely led to thousands of new cases and some deaths, but if the rally led to 250,000 new cases we would see it more clearly in the raw data.”

Ashley O’Donoghue, an economist at the Center for Healthcare Delivery Science at Beth Israel Deaconess Medical Center who recently co-authored the paper “Super-spreader businesses and risk of Covid-19 transmission,” offers a similar evaluation, drawing attention to issues with the researchers’ control group.

“It’s probably very likely that there was spread of Covid at the event based upon what we know about the transmission of the disease, but it’s probably less than 250,000 cases,” O’Donoghue tells Inverse. She was not involved in the report.

“I think that the way their control group is constructed leads me to believe that their results are an overestimate,” she says.

Gene Haheim sells face shields during the 80th Annual Sturgis Motorcycle Rally on August 7, 2020 in Sturgis, South Dakota.

Getty Images

The authors of the Sturgis report used a synthetic control method, which O’Donoghue says is one of the preferred ways to estimate the causal effect of an event when studying questions related to Covid-19. She explains that it allows researchers to construct a control group that closely resembles the treatment group, with the only difference being exposure to the event in question, in this case the Sturgis Motorcycle Rally.

“The problem is that the control group needs to be really carefully selected, and if it’s not actually a good control group, your estimates won’t be correct,” she explains. “With this study, the selection of the control group is likely what’s causing some overestimation.”
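To make the method concrete, here is a minimal sketch of how a synthetic control is typically constructed. The county labels, case rates, and weights below are hypothetical illustrations, not data or code from the Sturgis report.

```python
# Minimal synthetic control sketch with hypothetical data (not from the report).
import numpy as np
from scipy.optimize import minimize

# Pre-rally weekly case rates (per 1,000 residents) for the "treated" county
# and a small pool of candidate control counties -- all numbers are made up.
treated_pre = np.array([1.0, 1.4, 1.9, 2.3])   # e.g., the rally county
controls_pre = np.array([
    [0.8, 1.1, 1.6, 2.0],                      # control county A
    [1.5, 1.9, 2.4, 2.9],                      # control county B
    [0.5, 0.7, 1.0, 1.2],                      # control county C
]).T                                           # shape: (weeks, counties)

n_controls = controls_pre.shape[1]

def pre_period_gap(weights):
    # How far the weighted mix of control counties sits from the treated
    # county before the event; the optimizer makes this as small as possible.
    return np.sum((treated_pre - controls_pre @ weights) ** 2)

# Weights are non-negative and sum to one, so the "synthetic" county is a
# convex combination of the control counties.
fit = minimize(
    pre_period_gap,
    x0=np.full(n_controls, 1 / n_controls),
    bounds=[(0, 1)] * n_controls,
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1},
)
weights = fit.x

# After the event, compare what actually happened in the treated county to
# the synthetic county built from the same weights (again, made-up numbers).
treated_post = np.array([3.5, 4.8])
controls_post = np.array([
    [2.2, 2.6],
    [3.1, 3.6],
    [1.3, 1.5],
]).T
synthetic_post = controls_post @ weights
estimated_effect = treated_post - synthetic_post

print("control weights:", np.round(weights, 2))
print("estimated effect (cases per 1,000):", np.round(estimated_effect, 2))
```

The estimate is only as good as the control pool: if the counties available for weighting differ from the treated county in ways that also affect case growth, which is O’Donoghue’s concern here, the gap between observed and synthetic outcomes reflects those differences as well as the event itself.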

There are likely differences between the treatment group (people who attended the rally) and the control group (people who did not) that cause issues with the analysis, she says. For example, with the intention of avoiding spillover effects, the report excludes counties that border Meade County, as well as neighboring states, from its control pool. O’Donoghue notes that while this “would usually make sense,” the nature of Covid-19 makes such a restriction difficult to get right and can lead to overestimation of the rally’s effect.

“Normally, a researcher would conduct what’s called ‘robustness checks’ in economics — or in other fields often called ‘sensitivity analyses’ — where you would try using different control groups and changing other things in your models to show your results hold,” O’Donoghue explains. “This paper only does a little bit of this, likely because it’s still a working paper, but that’s something we’d want to see a lot more of.”

Griffith also points to the lack of re-testing of models in the report, which allows for what he calls “untested assumptions.” The creation of population models inherently means large numbers of assumptions are made, and re-testing allows researchers to see how much their results change if they vary their assumptions.
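In the same spirit, a robustness check can be sketched as re-running the estimate while varying one assumption at a time, for instance dropping each candidate control county in turn and seeing whether the estimated effect moves much. The code below is a hypothetical illustration of that leave-one-out idea, not a re-analysis of the report.

```python
# Hypothetical leave-one-out robustness check: re-fit the synthetic control
# after dropping each candidate control county and compare the estimates.
import numpy as np
from scipy.optimize import minimize

def fit_weights(treated_pre, controls_pre):
    """Find non-negative weights summing to one that best match the
    treated county's pre-event trend."""
    n = controls_pre.shape[1]
    fit = minimize(
        lambda w: np.sum((treated_pre - controls_pre @ w) ** 2),
        x0=np.full(n, 1 / n),
        bounds=[(0, 1)] * n,
        constraints={"type": "eq", "fun": lambda w: w.sum() - 1},
    )
    return fit.x

# Made-up pre- and post-event case rates (per 1,000 residents).
treated_pre = np.array([1.0, 1.4, 1.9, 2.3])
treated_post = np.array([3.5, 4.8])
controls_pre = np.array([[0.8, 1.1, 1.6, 2.0],
                         [1.5, 1.9, 2.4, 2.9],
                         [0.5, 0.7, 1.0, 1.2]]).T
controls_post = np.array([[2.2, 2.6],
                          [3.1, 3.6],
                          [1.3, 1.5]]).T

baseline = treated_post - controls_post @ fit_weights(treated_pre, controls_pre)
print("baseline effect:", np.round(baseline, 2))

# Drop each control county in turn; if the estimate swings wildly, the
# result depends heavily on which counties make up the control pool.
for dropped in range(controls_pre.shape[1]):
    keep = [i for i in range(controls_pre.shape[1]) if i != dropped]
    w = fit_weights(treated_pre, controls_pre[:, keep])
    effect = treated_post - controls_post[:, keep] @ w
    print(f"without control county {dropped}:", np.round(effect, 2))
```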

There are three other overarching issues with the way the study was conducted, Griffith says. One is that there is no good data on individual rally participants. In an ideal world, he says, we would know everyone who attended, whom they came into contact with, and whether they tested positive for Covid-19. Instead, the report operates on aggregated state- or county-level data, which is mostly composed of non-participants. Because we know rates of Covid-19 testing vary widely from place to place, Griffith says, “it’s very difficult to determine if a higher number of confirmed cases is due to more testing or greater spread of the disease.”

Another issue: The researchers used cell phone pings from non-Sturgis residents entering Meade County and compared that to CDC Covid-19 case count data, ultimately concluding that the rally caused disease spread both locally and in the home counties of people who traveled to Sturgis and then returned. But case counts in counties where Sturgis participants were coming from were already increasing, so the comparison may be unfair, he says.

Finally, the data has to be viewed through the lens of partisanship, Griffith says. Studies indicate that Republicans are far less likely to perceive Covid-19 as a threat, and in turn are less likely to socially distance and to support mask-wearing. South Dakota leans heavily Republican.

“States and counties whose residents were more likely to attend Sturgis also tend to be more conservative, so we can’t be sure if it’s truly the rally that spread the disease or geographic differences in other individual behaviors,” Griffith explains.

The difficulty of conducting research during Covid-19 — O’Donoghue and Griffith both have empathy for the report’s authors. The question they are trying to answer is very important — can large events become superspreading events with health and economic repercussions? — and their attempt was a serious one. But they are also trying to answer that question very quickly.

“If it takes two years to get results, that doesn’t help anyone,” Griffith observes. “But going fast has trade-offs, especially in terms of study rigor.”

Conducting research during Covid-19 is extremely challenging, and many of the issues observed with the paper’s methodology reflect these problems. The pandemic has created “a gold rush” for many academic researchers, Griffith says, and everyone's competing to publish as soon as possible.

“Unfortunately, that time pressure is at odds with careful, high-quality research,” he says.

Meanwhile, the lack of nationwide, randomized testing at the population level “makes it impossible to know the true extent of the disease’s spread,” Griffith says. O’Donoghue also notes that because testing practices vary so much across locations and over time, it can be difficult to know whether what researchers measure reflects what is really happening. Using mortality data may be more reliable, she says, but the lag between an event like Sturgis and any subsequent deaths makes that analysis more difficult.

A woman crosses downtown Deadwood, South Dakota during the Sturgis Motorcycle Rally on August 8, 2020.

Getty Images

Ultimately, the controversy offers researchers a moment to rethink and adjust how they distribute research, O’Donoghue says.

“In normal times, it is very common for an economics working paper at this stage to be distributed to your network while the methodology is being refined, to get feedback from others,” she explains. Experts understand the difference between a “working paper” and research that has been peer-reviewed. But that difference isn’t always clear to the media or the public, who are closely watching in-progress research related to Covid-19.

“Practices that may have been normal for us before as we’re working on drafts probably need to be rethought more,” she says. “I think that’s something that many of us are learning and trying to adapt to throughout this pandemic.”

Ophir, the expert in science communication, notes that while making preliminary findings available for the public can be necessary, it also “opens the door for additional uncertainty, suspicion, and potentially distrust.”

“If the uncertainty of the findings is not communicated well with the public, people will expect what they read in news articles to be definite and conclusive,” he says. “If it will turn out to be wrong, their trust in scientists and the scientific endeavor will be harmed.”

Moving forward — The Sturgis report is, in some ways, a case study in what happens when three factors collide: research not ready for public consumption, puzzlement over the scientific process, and eagerness for the latest details of the biggest news story in the world. And preventing similar cases of confusion and controversy in the future depends on both researchers and the public.

Ophir observes that, unfortunately, “information during developing crises is inherently incomplete.” Ideas are suggested, supported, tested, and revised over and over again — this is the healthy process of scientific inquiry. But it’s also anxiety-provoking for a public eager to understand an ever-changing situation.

"Be both skeptical and forgiving of new research on Covid-19."

“The biggest takeaway from the Sturgis case is that the scientific community should be extremely cautious and transparent when publishing results that did not go through the rigorous process of anonymous peer review,” he says.

“By not communicating the state of the study clearly, the scientists open the door for valid criticism that may give the average reader the impression that their results are worthless and unreliable. In reality, their reliability and validity are awaiting further scrutiny.”

Readers, for their part, need to better understand that the scientific process takes time and findings published “during an ongoing crisis should be taken with a grain of salt,” Ophir says.

Skepticism toward the conclusiveness of any one discovery is healthy, he adds, but cynicism is not. We should still listen to and trust experts. Ophir advises readers who want to make sure they understand a situation to check the original source of information, assess that source’s expertise, and maintain a diverse media diet.

Griffith also notes that readers should know that while one study is good, more are better: when multiple teams come to the same conclusion using a variety of methods and data, we can be fairly confident the results are real. Typically, a study like the Sturgis report would go “through many months of criticism, revision, and refinement,” he says, but researchers are racing to produce and publish research that can mitigate the health effects of Covid-19.

“Be both skeptical and forgiving of new research on Covid-19,” Griffith says. “It feels cliche, but we live in turbulent times and medical researchers are struggling just like everyone else.”

