It’s a bit like pulling weeds: false and misleading information keeps cropping up in conversations about the causes and consequences of the climate crisis and possible solutions to it, severely hampering constructive debate. Tackling each piece of misinformation individually is labor-intensive and time-consuming, so we ask: what else can be done to improve the situation in the long term?
In this issue of our series, “Study of the Month”, we explore this question, discussing the publication “Structured expert elicitation on disinformation, misinformation, and malign influence: Barriers, strategies, and opportunities” by Kruger et al. (2024).
Kruger, A., Saletta, M., Ahmad, A., & Howe, P. (2024). Structured expert elicitation on disinformation, misinformation, and malign influence: Barriers, strategies, and opportunities. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-169
This post was originally published in German on klimafakten.de: https://www.klimafakten.de/kommunikation/studie-des-monats-092025-wie-sich-falschinformation-wirksam-bekaempfen-laesst
What question does the study address?
The terms ‘misinformation’ and ‘disinformation’ have become ubiquitous in public debates about climate change and climate policy in the internet age. Misinformation refers to false information spread without intent to deceive; disinformation to false information spread deliberately in order to mislead audiences.
Fact-checking — the correction of false claims — and so-called pre-bunking (short for “pre-emptive debunking”) — addressing false claims before they circulate — help in specific cases but do not solve the problem at a structural level.
In the study presented here, Kruger et al. ask what would need to change in order to get the spread of false information under control in the long run. To this end, they tackle several sub-questions:
- What are the biggest challenges and barriers in combating mis- and disinformation?
- How can these be overcome?
- Which strategies are most effective?
- And which should be prioritized in the coming years?
What methodology was used, and why is it reliable?
The authors employed a variant of the “Delphi method”: a structured, multi-round survey used to consolidate expert knowledge. The method is commonly applied across many research fields and is particularly useful for gathering expertise that is not yet well captured in the scientific literature.
First, an open-ended survey was conducted to map the breadth of expertise. The answers were summarized by the research team and discussed in an online forum among survey participants.
After this round of deliberation, a second survey followed. Its aim was to use statistical measures to assess which strategies enjoyed broad expert consensus and to rank them by policy priority.
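The paper does not publish its analysis code, but the minimal Python sketch below illustrates one common way consensus is quantified in a second Delphi round: the median rating per strategy indicates its priority, and the interquartile range (IQR) indicates how closely the experts agree, with a small IQR usually read as broad consensus. The strategy names, the 1–7 rating scale, and all numbers are hypothetical placeholders, not data from the study, and the sketch is an illustration of the general approach rather than the authors’ actual analysis.

```python
# Illustrative sketch only: hypothetical ratings, not data from Kruger et al. (2024).
# Per-strategy median = priority; interquartile range (IQR) = degree of consensus.
import statistics

# Hypothetical second-round ratings on a 1-7 priority scale
ratings = {
    "Platform regulation":      [7, 6, 7, 6, 7, 5, 6],
    "Media literacy education": [6, 7, 6, 6, 5, 7, 6],
    "Fact-checking programs":   [5, 4, 6, 5, 4, 5, 6],
}

def summarize(scores):
    """Return the median and interquartile range for one strategy."""
    q1, _, q3 = statistics.quantiles(scores, n=4)  # quartile cut points
    return statistics.median(scores), q3 - q1

# Rank strategies by median priority; a narrower IQR suggests stronger agreement
for name, scores in sorted(ratings.items(),
                           key=lambda kv: statistics.median(kv[1]),
                           reverse=True):
    med, iqr = summarize(scores)
    print(f"{name:26s} median={med:.1f}  IQR={iqr:.2f}")
```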
The result is a well-grounded snapshot of expert opinion, enriched with numerical data. But, as Dr. Mike Farjam (University of Hamburg) points out, it is “strongly shaped by the participating experts.” In this case, participants came mainly from psychology and computer science, as well as a diverse group of practitioners. However, communication scholars and policymakers were not represented.
What are the key findings, and why are they relevant for climate communication?
The study reports that experts agreed that “a wide range of strategies to counter mis- and disinformation could be effective” (p. 3). Most also emphasized the need for further research to compare different interventions.
Nevertheless, many saw regulation of social media as the most important current strategy. One proposal was the establishment of a kind of media council tasked with promoting transparency about how algorithms spread information and holding platforms accountable. The hope is that such measures would incentivize platforms to act against mis- and disinformation.
Rated as almost equally important was strengthening public awareness, media literacy, and critical thinking skills, especially in schools and other educational settings.
The study also highlights barriers to combating mis- and disinformation. A major one is the present lack of public trust in media and political institutions, which is often deliberately fueled by political actors spreading false information. Other challenges include political polarization and cognitive or emotional biases. For example, people tend to accept information more readily when it confirms their existing beliefs (the “confirmation bias”). Such biases are often unconscious and deeply rooted in the human psyche, making them hard to counter.
What are the practical takeaways?
According to the study, many approaches can help fight mis- and disinformation. But experts stressed that more research is needed to evaluate interventions.
Given this, Prof. Michael Brüggemann, Professor of Communication Science, Climate and Science Communication at the University of Hamburg, and long-time scientific advisor to Klimafakten, recommends that communication campaigns should be “well evaluated and ideally designed in collaboration with research institutions.” This would ensure that campaigns build on the latest research and at the same time generate valuable data for future studies.
Raising awareness and strengthening skills
The study underscores that awareness, media literacy, and critical thinking are key defenses against false information. Strengthening these skills can help individuals recognize misinformation and avoid spreading it inadvertently.
In practice, this means that communication projects should not only focus on climate content itself, but also explain why climate misinformation circulates, how it spreads, and the mechanisms it exploits. This approach, known as “inoculation,” can also include raising awareness of cognitive and emotional biases. At the same time, care must be taken not to stigmatize audiences or deepen existing polarization.
Such skills training extends beyond individual campaigns — schools, for example, also play an important role. Promoting better education is therefore a vital task for climate communicators.
Building trust and advancing regulation
Experts see a lack of trust in political, scientific, and media institutions as one of the most important barriers to fighting mis- and disinformation. Climate communicators should therefore also address which organizations and actors can be trusted in which contexts, and (importantly) why this is the case.
Since political actors often spread mis- and disinformation themselves, it is particularly important not to call for blind trust in politics. Instead, individuals require the tools to critically assess the motives and trustworthiness of different actors.
Social media are a significant battleground for misinformation. The experts surveyed called for stronger regulation to ensure transparency about algorithms and hold platforms accountable. Climate communicators should advocate for such regulation at the political level.
In discussions of the study within Prof. Brüggemann’s research group, it was also noted that journalists can play an important role — for example, by challenging politicians on false claims during interviews or press conferences and by correcting these claims publicly.