Turns out, the idea of “fake news” will likely be around forever. After all, it’s a phenomenon as old as communication — and human nature — itself.
While it’s easy to blame the spread of today’s misinformation on digital platforms like Facebook and Twitter in the wake of the 2016 presidential election, a new study by Pew implies that tech alone won’t fix the problem.
The obvious question to ask after last year’s election and the Brexit vote, among other global events, was: “What’s going to happen with information online?” Lee Rainie, the study’s author, tells Inverse.
The binary question, asking whether the online information environment will improve over the next 10 years or not, split the study’s expert respondents right down the middle.
Titled “The Future of Truth and Misinformation Online,” the study canvassed opinions from tech experts on a variety of topics including “fake news.” It found that while half are optimistic the coming decade will see a reduction in false and misleading narratives online, “Others think the dark side of human nature is aided more than stifled by technology.”
The study’s conclusion, Rainie admitted, is that human nature likely won’t change, owing to many factors, including the tech and information environments we expose ourselves to.
Given that the idea of “fake news” predates digital technology — starting with Gutenberg’s printing press giving witchcraft and folklore a new lease on life, all the way to yellow journalism and propaganda in the 20th century — Rainie says humans typically need time to adjust to new platforms, despite “the bad actors” like ads and algorithms having an advantage.
It took a couple hundred years for opponents of older forms of fake information to develop procedures and protocols for journalists and media watchdogs. In the case of technology, however, rolling out new products and policies won’t stop some people from labeling things fake news.
“Everyone’s at war over the meaning of fake news for their own purposes,” Rainie says. “It’s an anchoring term for this phenomenon.” But the main non-tech way to combat it, he says, is to first clean up the system as much as we can by restoring high-quality journalism. That also means education.
“What we need is a new public education curriculum on literacy and how to be a smart researcher,” Rainie said.
Mark Marino, director of the Humanities and Critical Code Studies Lab at USC, agreed, telling Inverse that while we can’t predict how “fake news” will play out — hence the Pew study’s sharp division — tech ultimately won’t make it go away. He also stressed that as new outlets and reading habits emerge, it’s now more important than ever to teach audiences how to consume information online.
“Many have been looking towards tech solutions to the problem, but that seems like a dead end,” Marino told Inverse. “That would not only involve restrictions on media, but also cuts out the heart of the problem—which is the need to cut through consumers’ political ideology and help them develop techniques of verifying information and develop healthy information diets.”
Right now, Marino noted, public education strategies include literacy, research, and critical reading. “Perhaps another requisite would be some form of verification and information digestion education.”
As Tom Rosenstiel, director of the American Press Institute and senior fellow at the Brookings Institution, put it in the study: “Whatever changes platform companies make, and whatever innovations fact checkers and other journalists put in place, those who want to deceive will adapt to them. Misinformation is not like a plumbing problem you fix. It is a social condition, like crime, that you must constantly monitor and adjust to.”