“Prebunking” false information with short videos could nudge people to be more critical of it, suggests a new study from researchers at the University of Cambridge and Google’s Jigsaw division. The study is part of ongoing work in the field of mis- and disinformation, and it’s encouraging news for researchers hoping to improve the online information ecosystem — albeit with many caveats.
The Jigsaw and Cambridge study — which also involved researchers from the University of Bristol and the University of Western Australia, Perth — is one of several attempts to “inoculate” or “prebunk” people against disinformation instead of debunking it after the fact. Published in Science Advances, it describes the impact of a video series about tactics commonly used to spread false information, including scapegoating, false dichotomies, and appeals to emotion.
The roughly 90-second videos didn’t discuss specific false narratives or whether a given piece of information was factual. They typically used absurd or funny examples drawn from pop culture, like Family Guy or Star Wars. (Anakin Skywalker’s claim that “if you’re not with me, then you’re my enemy” is a classic false dichotomy.) The goal was to highlight red flags that might otherwise short-circuit people’s critical evaluation of a social media post or video, then to see whether that awareness translated into wider recognition of those tactics. Avoiding factual claims also meant viewers weren’t judging whether they trusted the source of those facts.
“We wanted to remove any of the possible politicization that has sort of been confounding the question,” says Jigsaw head of research and development Beth Goldberg.
Prebunking has been promoted as an anti-misinformation strategy for years, especially after research suggested that fact-checking and corrections may not change people’s minds and can even backfire. (Some of this research is disputed.) But as with other countermeasures, researchers are still in the early stages of measuring its effectiveness, particularly on social media.
Here, the study found encouraging results. In five controlled studies involving 5,000 participants recruited online, people watched either one of the prebunking videos or a neutral video of similar length. They were then shown fake social media posts, some of which used the tactic featured in the video. People who had seen the prebunking videos were significantly better overall at judging whether these posts used the manipulation tactic, and they were significantly less likely to say they’d share them.
The team also conducted a larger study, of around 22,000 people, on the Google-owned platform YouTube. They purchased ad space to show the prebunking videos ahead of randomly chosen videos, then followed up within 24 hours with questions similar to those described above, testing people’s ability to recognize manipulation tactics. As before, the viewers outperformed a control group, this time with a longer gap after watching the video: the median was about 18 hours.
Future research will push that timeline further to see how long the effects of the “inoculation” last. Jigsaw also wants to test videos that address specific topics, like false narratives about refugees in Europe. And because this research was conducted in the US, future studies will need to test whether the videos resonate with other audiences. “The framing around self-defense — someone else is trying to manipulate you, you need to equip yourself and defend yourself — really resonates on both sides of the political aisle” in the US, says Goldberg. “You can really see that tapping into this American individualism.” That appeal won’t necessarily translate to audiences elsewhere.
Interestingly, the study’s results seemed independent of people’s predisposition toward conspiracy theories or political polarization. In the controlled studies, participants took surveys evaluating these and other qualities, but their scores didn’t correlate with their performance. “I would have anticipated that a high conspiracy mentality means that you would be bad at discerning things like fear-mongering,” says Goldberg.
One possible explanation is that the study stripped out the signals, like specific sources or political topics, that trigger conspiratorial or polarized thinking. Another is simpler: “I think in part, we were paying folks to pay attention,” says Goldberg.
Cambridge researchers have previously published findings suggesting that prebunking might work, including a study built on a pandemic-themed game called Go Viral! The recent study demonstrates the potential of shorter, simpler interventions. But it also comes with significant limits. Even within the study, some videos were more effective than others: the one on scapegoating and another on incoherence, for instance, didn’t change people’s willingness to share posts using those tactics. And outside this particular experiment, the group is still evaluating how long people retain the lessons they’ve learned.
And the group is still far from testing whether prebunking will make people critically evaluate information they want to believe from sources they like — which is how a lot of false information spreads across social media. “The Holy Grail will be: can we actually measure, in the moment, if you’re able to apply that prebunking lesson and recall it a week later when you see Alex Jones using emotional language?” says Goldberg. “I’m not sure that we will get significantly closer in the near term.” But for now, the work opens the door to more research on whether a misinformation vaccine makes sense.
Source: https://www.theverge.com/2022/8/24/23317286/google-jigsaw-cambridge-study-prebunking-disinformation-research