A recently released study finds that pre-bunking is an effective technique against the spread of online disinformation and misinformation.
The study, published in Science Advances and led by Cambridge researchers in partnership with Jigsaw, a research unit within Google, describes pre-bunking as a process that seeks to refute manipulation tactics, false claims, and lies before they go viral.
According to the researchers, pre-bunking is like a vaccination, while fact-checking is like treating the symptoms of an illness.
“So, think about when you get a vaccine. It has a microdose of the virus. It’s not the whole virus, but it’s like a little piece of it that your body can recognize.
“It’s the same thing in a pre-bunking video; we show you a little clip of the propaganda, so that you can recognize the manipulation tactics going forward,” Beth Goldberg, who is the head of research at Jigsaw, told the International Fact-Checking Network.
The study exposed millions of YouTube users to 90-second clips that explained manipulation techniques like fearmongering, scapegoating and playing into emotions.
The users were then asked to complete follow-up surveys at later dates, which tested their ability to determine whether a manipulation technique had been used.
The five manipulation categories depicted in the videos were emotional language, incoherence, false dichotomies, scapegoating and ad hominem attacks.
The experiments revealed that users who viewed the emotional language video clips were 1.5 to 1.67 times more likely than the control group to recognize the manipulation technique in the future.
The users who watched the false dichotomies video clips were nearly twice as likely as the control group to recognize the technique.
Similarly, users who watched the incoherence video were more than twice as likely to identify the technique at a later date.
According to Jigsaw, users’ ability to recognize manipulation techniques increased by 5 per cent on average after watching the videos on YouTube.
The lead author, Jon Roozenbeek, a postdoctoral fellow with the Social Decision-Making Lab at Cambridge, said, “Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated.
“The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education and different personality types. This is the basis of a general inoculation against misinformation.”
Goldberg said, “We had a control group and a treatment group and could actually see whether people paid attention to our ad, and we found that this seems like a pretty useful approach overall.
“The next step will be, ‘How does that affect sharing of misinformation?’ We haven’t gotten there yet. But at least, we know that we’re effectively teaching people these concepts.”
Roozenbeek added that anyone who wishes to fund a YouTube campaign that measurably reduces susceptibility to misinformation across millions of users can do so at a minuscule cost per view.