Does Fact-Checking Really Work?

[Image: two pigeons fighting. Who will win, misinformation or correction effects? Photo by shraga kopstein on Unsplash]

Fact-checking is often praised as an effective tool against misinformation and deep-seated misconceptions. Journalists and independent organizations verify statements to make public discourse more objective. But how effective is this practice really? Can a simple correction truly rectify ingrained false assumptions or even simple misinformation?

This post is part of the series “Studies of the month” in cooperation with klimafakten.de. In the series, members of the climate communication lab (PI: Prof. Dr. Michael Brüggemann) share their expertise on noteworthy recent studies.

Which questions do this month’s studies address?

To understand the impact of fact-checking, we are looking at two major research papers in this edition:

Chan & Albarracín (2023): A comprehensive meta-analysis on “correction effects.” It examines how people respond to corrections when they have been presented with science-related misinformation (including about climate change) shortly beforehand.

Dan & Coleman (2025): An experimental study from Germany focusing specifically on correcting existing misconceptions. It chiefly compares whether video-based fact-checks are more effective than traditional text formats.

Which methodology was used, and why is it reliable?

The first study (Chan & Albarracín, 2023) uses a meta-analysis to statistically summarize results from 75 individual studies. This approach is highly reliable because it helps eliminate “noise” in the data and identifies patterns that persist across different studies.

The second study (Dan & Coleman, 2025) stands out due to its experimental design and high-quality “stimuli”—the texts and videos that participants engage with. Through random assignment to video, text, or control groups, the researchers could draw causal conclusions about which medium best supports the correction of false knowledge.

Furthermore, it is particularly relevant because it addresses deep-rooted misconceptions rather than just debunking a single, isolated statement.

What are the key findings, and why are they relevant for climate communication?

To understand the results, it is helpful to differentiate between two effects: the “Misinformation Effect” (how beliefs change in response to false info) and the “Correction Effect” (how accurate information influences beliefs).

The meta-analysis by Chan and Albarracín (2023) primarily asks whether the correction effect is strong enough to counter the misinformation effect. The core finding is somewhat sobering: on average, the misinformation effect outweighs the subsequent correction. This means a correction often cannot fully undo the damage caused by the initial lie. However, several factors influence how effective a correction is:

For instance, Chan and Albarracín highlight that “polarization [regarding the topic at hand] makes the correction less effective, presumably because the recipients feel threatened in their identity and therefore mentally argue against the correction” (2023, p. 1518).

On the other hand, a particular characteristic of the misinformation itself seems to help make corrections more effective: when the misinformation triggers “negative” feelings—such as uncertainty, sadness, or anger—recipients appear to be more receptive to the correction.

According to Chan and Albarracín (2023), it makes no difference whether the respondents are from the USA or another country, or whether the misinformation concerns politics, the environment, or other topics—with one exception: corrections perform worse for health topics. Similarly, whether the correction is highly detailed or whether it aligns with the respondent’s existing beliefs does not seem to make a difference.

Videos Work Better

In contrast to the sobering results of the meta-analysis, Dan and Coleman (2025) show that deeply ingrained misconceptions can indeed be corrected. This discrepancy can be partly explained by the fact that in this study, the misinformation was not repeated; instead, the effects of the accurate information (the “correction” effect, see above) were measured.

A key insight from the work of Dan and Coleman (2025) is that videos are more effective than text-only information. According to the authors, “videos are more realistic and therefore more credible; they are also easier to process” (p. 794). Those who previously held false knowledge or were uncertain benefited most from the video format.

According to the authors, a primary reason for the success of videos is their so-called “processing fluency.” Because information from videos is received simultaneously through the eyes and ears, the brain processes the content more smoothly than when actively decoding text. This lower cognitive load ensures that new, correct information is accepted more quickly and anchored in the memory.

What can be derived from the study for practical application?

Although fact-checking directly after exposure to misinformation does not completely cancel out its negative effect, research shows that correction effects are remarkably stable across both countries and topics. Refutations work not only for new information but can also effectively overwrite existing misconceptions in a person’s memory.

The meta-analysis clearly shows that it is counterproductive to repeat misinformation in too much detail, as this can unintentionally reinforce it. Providing a detailed account of the correct information is far more effective than a detailed deconstruction of the error.

Conveying information through both visual and auditory channels helps increase what is known as “processing fluency,” allowing the brain to process content more smoothly. This lower cognitive barrier ensures that facts are accepted more quickly and remembered better than when they are presented as walls of text.

Dr. Anne Reif (University of Hamburg) adds that “misinformation is often found on video platforms (e.g., Instagram, TikTok, YouTube), and therefore corrective content is needed there as well, providing viewers, for example, with arguments they can use in personal conversations with others.”

When polarization in discourse is successfully reduced, the target group’s willingness to accept scientific facts increases, even when those facts contradict their own worldview. Chan and Albarracín (2023), for instance, suggest asking recipients to think of acquaintances with different political views as a way to reduce emotional polarization.

Finally, fact-checking unfolds its greatest impact where genuine ignorance or uncertainty prevails. Ideally, these target groups should be reached instead of addressing people who already hold correct views or who can recognize misinformation as such themselves. According to Dr. Michael Brüggemann (University of Hamburg), there is, for example, “a great deal of ignorance regarding basic questions of climate policy, and fact-checking can be very effective here.”

Further Reading

Chan, M. S., & Albarracín, D. (2023). A meta-analysis of correction effects in science-relevant misinformation. Nature Human Behaviour, 7(9), 1514–1525. https://doi.org/10.1038/s41562-023-01623-8

Dan, V., & Coleman, R. (2025). “I’ll Change My Beliefs When I See It”: Video Fact Checks Outperform Text Fact Checks in Correcting Misperceptions Among Those Holding False or Uncertain Pre-Existing Beliefs. Communication Research, 52(6), 778–802. https://doi.org/10.1177/00936502241287870
