A “psychological vaccine”: Why prebunking is the best way to fight misinformation
- Misinformation is rampant and exacerbated by social media.
- Fact-checking and debunking are implemented on several social media platforms, but the impact of such regulation is questionable.
- Recent research suggests that “prebunking” — through “psychological inoculation” aimed at increasing people’s ability to identify common misinformation techniques — is part of the solution.
Misinformation is everywhere — and it always has been. Gossip, whether true or false, has existed since humans could communicate. Politically motivated “fake” news has been a part of American journalism since the Founding Fathers created free speech protections.
What’s different is that social media apps like Facebook, TikTok, and Twitter have exacerbated its spread. A recent review by the World Health Organization, for example, found that social media can spread myths worldwide more quickly than they can be debunked. Moreover, whether in the form of objectively false statements, cherry-picked facts, or manipulated narratives, many people believe this misinformation and share it. And it affects real-world behavior, ranging from policy preferences to health decisions to vigilantism.
So what can we do about it?
Perhaps the most common tactic is to fact-check and debunk false or misleading information. But a recent study by Dr. Jon Roozenbeek and colleagues in partnership with Google’s Jigsaw laboratory adds to a growing line of research suggesting that prebunking may be more effective. Specifically, the team set out to “inoculate” people against misinformation before it could even take hold.
Fact-checking and debunking are not enough
Given the prevalence and real-world impact of misinformation, media companies and governments have taken steps to monitor and regulate social media platforms. Several platforms actively police what is shared on their sites. Pinterest, for example, outright bans anti-vaccination posts. And other major platforms like Google, Facebook, and YouTube use fact-checkers to flag and label questionable material or to promote more fact-based information.
But fact-checking doesn’t always work, and the efficacy of such regulatory efforts is questionable. First, especially for political topics, it can be difficult to define a bright-line rule for what qualifies as misinformation. Take the case of media that highlights certain facts while ignoring other relevant information. Reasonable people may disagree about whether this is merely “spin” or misleading enough to be regulated.
Second, social media is vast, and misinformation spreads faster than truth, especially when it inspires fear or contempt. Thus, even when misinformation is clearly identified, it is simply impossible to keep up with all of it or to reach everyone exposed to it. Misinformation is also sticky: even after it has been debunked, its effects linger. And many people simply don’t believe fact-checkers, deferring instead to their prior beliefs, gut feelings, or social groups.
Even for those who consciously accept that misinformation is false, it is difficult to fully “unring the bell.” The brain’s default is to accept most information as accurate. Thus, barring something that triggers more thoughtful evaluation when first heard — like incompatibility with one’s prior beliefs or incoherence — we automatically integrate misinformation into our broader “mental models” of how events unfolded or how the world works. Once established, such mental models are hard to change.
Additionally, memory is flawed; people struggle to remember which information is true and which is false, especially when the false information is plausible or seems familiar. Debunking may even highlight or remind people of misinformation, perversely increasing its influence.
Prebunking: A “psychological vaccine”
Given the challenges of debunking, the last decade has seen a revival of research in prebunking. Specifically, “psychological inoculation” — essentially exposing people to small doses of misinformation and encouraging them to develop mental resistance strategies — has shown promise in reducing the belief in and spread of misinformation.
The concept of psychological inoculation was proposed by William McGuire over 60 years ago. Described as a “vaccine for brainwash,” it is deliberately analogous to medical inoculation. The goal is to expose people to forms of misinformation that are (1) too weak to be persuasive but (2) strong enough to prompt the person to critically evaluate the evidence and consider counter-arguments. Ultimately, the person develops an arsenal of cognitive defenses and becomes resistant to similar misinformation.
Though originally developed to counter persuasion between individuals, inoculation has more recently been applied to social media and fake news. Most of this work, however, has focused on specific issues — for example, forewarning international leaders that Russia was likely to spread fake information to justify its invasion of Ukraine in 2022, or notifying people about false claims circulating about mail-in voting. While potentially effective, issue-specific prebunking is difficult to scale, because specific misinformation cannot always be anticipated in advance.
A universal psychological vaccine
Thus, the aforementioned experiment by Roozenbeek and colleagues aimed to inoculate people not against specific fake news, but against common techniques and tropes used to manipulate and misinform.
The team first created non-partisan, 90-second videos (they’re catchy and available here) about five common manipulation strategies: (1) emotional language (the use of fear, anger, and other strong emotions to increase engagement); (2) incoherence (the use of multiple arguments about the same topic that cannot all be true); (3) false dichotomies (presenting sides or choices as mutually exclusive when they are not); (4) scapegoating (singling out individuals or groups to take unwarranted blame); and (5) ad hominem attacks (attacking the person who makes the argument, rather than the argument itself). (Their video about emotional language is embedded below.)
Each video relies on psychological inoculation principles: forewarning viewers of the misinformation, offering counterarguments to it, and presenting fairly innocuous examples. The goal was to show how each technique could be used on social media to influence people.
Lab testing involving more than 5,400 participants found that the videos increased viewers’ ability to recognize manipulation techniques in (fictitious) social media posts about various topics. The videos also increased viewers’ ability to identify untrustworthy information and reduced their intent to share manipulative content.
Prebunking in the real world
But would short videos be effective in the real world? To answer that question, the researchers took their inoculation videos to YouTube. Specifically, the emotional language and false dichotomy videos were used as advertisements. Within a day of viewing the ads, some users were presented with a news headline that implemented a manipulation technique (e.g., for false dichotomies, “We need to improve our education system or deal with crime on the streets”) and asked to identify the technique.
Over 11,400 people viewed the videos and answered the follow-up quiz. As expected, people who had watched the videos were more likely to correctly identify the manipulation technique.
No single psychological inoculation — no matter how catchy, educational, and persuasive — is likely to stop all misinformation. Even in Roozenbeek and colleagues’ study, the ability to identify manipulation techniques on YouTube increased by only about 5%. And these effects may decline over time.
But ongoing efforts to improve awareness about misinformation may strengthen people’s ability to essentially self-prebunk. In a media landscape saturated with ever-changing fake news, “broad spectrum” psychological vaccines that target common misinformation techniques can be a part of the solution.