YouTube provides a buffet of content to its one billion users. That content isn’t always information, even if it appears to be at first glance. Through its recommendation algorithm, the video-sharing website has promoted videos that propagate conspiracies and misinformation. And according to an analysis published Thursday in Frontiers in Communication, the majority of videos that touch on climate change inaccurately represent established science.
In the study, corresponding author Joachim Allgaier, Ph.D., reveals that searching YouTube for terms related to climate and climate engineering pulls up videos containing actual science less than half of the time. Furthermore, some scientific terms, like “climate modification,” have been hijacked by conspiracy theorists, so that searches provide only non-scientific content.
Allgaier is a senior researcher at RWTH Aachen University and a YouTube user himself. He recognizes the importance of the platform as an easy place to share information with thousands upon thousands of people and believes many YouTubers are already doing a great job informing people and inspiring them to understand things better.
“A science YouTuber I recently had a talk with said that from her point of view YouTube, for many people, is the new Wikipedia,” Allgaier tells Inverse.
However, as things stand, that may not be enough to drown out competing content from science deniers.
In the first part of the study, Allgaier used ten search terms (including “climate hacking,” “global warming,” and “climate modification”) to find and analyze 200 videos, the oldest of which was uploaded in September 2008 and the newest in October 2018. They ranged in length from just 37 seconds to over two hours. While he searched, he used an anonymization tool to avoid personalized results.
His search revealed that 107 out of the 200 videos supported world-views that oppose the scientific consensus. Within that group, 16 videos outright denied human-caused climate change, while 91 videos pushed conspiracy theories about climate engineering and climate change.
Meanwhile, 89 videos did support the scientific consensus that humans caused the climate crisis. The last four walked the line and included climate scientists discussing climate topics with climate change deniers.
But when it came to viewership, the videos that supported science (with 16,941,949 views) received slightly more clicks than those that opposed science (with 16,939,655 views). That’s largely because the videos that supported science were clips from mainstream, popular shows and channels, like Last Week Tonight with John Oliver and National Geographic.
Amateur clips presented stiff competition. For example, while John Oliver’s clip on climate change received 4.9 million views, a conspiracy video titled “Something unseen is happening worldwide (2017-2018)” by a user named THAT IS IMPOSSIBLE received 5.3 million views. (At the time of publication, the viewership is closer to 5.4 million views.)
Certain search terms seem to have been “hijacked” by conspiracy theorists. The majority of videos that came up during a search for “geo-engineering,” “climate modification,” and to a lesser degree “climate engineering” promoted the chemtrails conspiracy theory. That widely disproven idea posits that the condensation trails of airplanes are deliberately laden with toxins and other harmful substances in order to achieve large-scale weather modification or mind control.
“Some of the people creating these videos are explicitly asking their followers to use the ‘geoengineering’ term and not ‘chemtrails’ to distribute these videos because the latter would lead people to the explanation that it is a conspiracy theory, if they Google the term,” Allgaier explains.
While scientists aren’t capable of geoengineering yet, Allgaier says that “in a few years’ time when it might be necessary to have an actual societal debate about whether or not geoengineering methods should be applied” we’re going to turn to a YouTube full of conspiracy theories.
“This is a matter of concern that should be taken seriously by the scientific community and civil society as a whole,” he emphasizes.
Allgaier argues that because YouTube knows how to ensure its algorithms prioritize scientifically accurate videos, it should do so. For its part, YouTube announced in January that it would take a “closer look” at how it can reduce the spread of videos that come close to, but don’t quite cross, the line of violating its policies.
Currently, YouTube suggests new videos to users on the basis of what they’ve already watched, taking into account how long a person has watched a video and how often a video is viewed until the end. It also factors in metrics like a video’s “likes” and “dislikes.”
In practice, the algorithm can steer people who watch a mainstream video toward a conspiracy theory video on a related topic, then suggest more and more conspiracy theory videos. After criticism over its role in online movements like Pizzagate, YouTube has tried to make some changes, like adding Wikipedia-linked fact checks along with its search results.
But while the company intends to make changes to its algorithm, it will not delete any conspiracy theory videos from the platform. People who want to watch can still find them by searching for the videos, an approach YouTube says “strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”
So, if scientifically inaccurate and conspiracy-spreading videos are still online, what is there to do? Allgaier thinks one solution could be for scientists and science advocates to create more videos and flood YouTube with accurate information.
He also hopes that the academic research community will start to take the societal impact of platforms like YouTube more seriously.
“There are just so many things we don’t know,” Allgaier says. “Who is creating and distributing what kind of content on YouTube? How do people make sense of the videos they find? What do they think is credible, and what is not?”
The study’s abstract reads as follows: The online video-sharing website YouTube is extremely popular globally, including as a tool for information on science and environmental topics. However, little is known about what kind of information users find when they search for information about climate science, climate change, and climate engineering on YouTube. This contribution presents results from an exploratory research project that investigates whether videos found on YouTube adhere to or challenge scientific consensus views. Ten search terms were employed to search for and analyze 200 videos about climate and climate modification topics, which are contested topics in online media. The online anonymization tool Tor was used to randomize the sample and to avoid personalization of the results. A heuristic qualitative classification tool was set up to categorize the videos in the sample. Eighty-nine of the 200 videos in the sample support scientific consensus views about anthropogenic climate change, and in four videos climate scientists discuss climate topics with deniers of climate change. Unexpectedly, the majority of the videos in the sample (107 videos) support worldviews that oppose scientific consensus views: 16 videos deny anthropogenic climate change and 91 videos propagate straightforward conspiracy theories about climate engineering and climate change. Videos supporting the scientific mainstream view received only slightly more views (16,941,949 in total) than those opposing the mainstream scientific position (16,939,655 in total). Consequences for the public communication of climate change and climate engineering are discussed in the second part of the article. The research presented in this contribution is particularly interested in finding out more about strategically distorted communications about climate change and climate engineering in online environments and in critically analyzing them.