Explaining disinformation? There’s a far better way to do that!
By Zdeněk Rod
The disinformation problematique has seriously influenced policy, academic, and public debates. The phenomenon became especially significant after Russia's illegal annexation of Crimea in 2014, which has since tremendously reshaped the global security environment. Disinformation, along with many other hybrid instruments, has been flourishing in European societies, and consequently, the people consuming it have grown increasingly hostile to their democratic settings, their governments, NATO, and the EU. The primary aim of disinformation is the intentional polarization of society, which results in the inability to solve societal issues.[i] Russian disinformation operations, for example, have amplified anti-NATO and anti-migration sentiment, as Russia's disinformation activities have become a well-integrated tactical manifestation of its strategic outlook.[ii] Although NATO states have spent billions of dollars on defence systems and arsenals, precisely aimed disinformation still has the ability to divide societies at minimal economic cost. This is the result of not having resilient societies. It is therefore fitting that resilience is one of NATO's core tasks as outlined in the NATO 2030 agenda. But the question remains: how can NATO effectively enhance resilience against disinformation?
Since 2014 we have seen a rapid increase in the number of disinformation campaigns; some observers even speak of 'hybrid warfare' with regard to disinformation.[iii] In response, the vast majority of states have focused on fact-checking, critical thinking, and massive strategic communications campaigns. Yet disinformation does not seem to be in retreat; it is omnipresent in our lives. Social as well as traditional (offline) media disseminate disinformation to the general public. Public opinion polls illustrate that most citizens of NATO member states consider at least some disinformation to be true and relevant. There have been many attempts to map disinformation sources and tackle them rationally, e.g., the project Demagog in the Czech Republic and Slovakia. Governmental institutions are also trying to tackle disinformation, e.g., the Czech Ministry of Interior has tried to uncover COVID-19 disinformation. Why have these approaches not worked? Traditional approaches are mostly based on fact-checking and rational rebuttal. This is critically important, but it is not efficient, as humans are not rational. Therefore, in contrast to traditional approaches, this article argues that a behavioural approach, which targets disinformation from a different perspective, can be a useful tool. Academic scholarship[iv] and recent de-polarization studies[v] have shown that behavioural approaches can effectively counter disinformation; however, this depends on how we approach behavioural nudging.[vi]
The behavioural approach is still a relatively new topic in the research on countering disinformation. Therefore, this article seeks to shed light on the novelty of this topic and analyse a behavioural approach that can be used for formulating disinformation policy in NATO and other international organisations.
(Ir)rationality as human nature?
Human behaviour is largely non-conscious rather than strictly rational. This is not to say that people are irrational: most of their decisions are made subconsciously, and most of these are perfectly functional, though some are inefficient and might be considered irrational. Subconscious decision-making relies on heuristics, so-called 'mental shortcuts' used when problems appear familiar and no additional information seems necessary. Heuristics often produce optimal and very efficient decisions, but they sometimes lead to systematically suboptimal ones known as cognitive biases, which typically violate the rules of logic. This has been demonstrated by behavioural economists such as Nobel laureates Daniel Kahneman and Richard Thaler.[vii] Manipulators can exploit these biases to increase the efficiency of their messaging. Therefore, factual argumentation that is not adequately anchored in an understanding of cognitive biases, however critically important, has limited impact.[viii]
What, then, is the most appropriate, specifically tailored approach to countering disinformation that exploits the human subconscious? Based on the discussion above, the so-called behavioural approach, which combines artificial intelligence (algorithms, natural language processing (NLP), and social listening), neuroscience (functional magnetic resonance imaging (fMRI), facial coding, skin resistance[ix]), and behavioural science (heuristics and biases), has the ability to counter disinformation narratives.
What does the behavioural approach mean?
The behavioural approach combines natural language processing, social listening tools, behavioural science, and neuroscience. This unique combination can support the creation of strategies and mechanisms for preventive and reactive action that limit the potential of different types of disinformation in the relevant contexts, in online and offline media and beyond.
The behavioural approach is a reaction to disinformation campaigns whose creators understand how to manage the subconscious and exploit the predictability of human behaviour. They create simple but sophisticated emotional mechanisms driving both the credibility and the rapid dissemination of disinformation (i.e., conspiracy theories and hoaxes), and, in the end, they are able to alter human decisions and behaviour. The responses to disinformation in recent years, largely grounded in political science, have demonstrated minimal capability to address and suppress disinformation campaigns. For behavioural approaches to succeed, humans' emotional and irrational sides need to be tapped into in order to exploit the predictability of irrationality. In combination with behavioural science, modern technologies can be utilized to identify sources and create 'nudges' that address both the content and the dissemination of hoaxes created in 'hybrid war', all of this in the relevant contexts and touchpoints.[x] It can be argued that behavioural science can promote the truth and thereby overcome the Bourdieusian dictum[xi] that 'if there is any truth, it is that truth is subject to struggle'.
The creators of disinformation are fully aware of how to exploit human cognitive biases. We argue that a so-called asymmetric approach can be used to combat this. The vast majority of Western countries, however, use traditional disciplines to counter disinformation, such as factual argumentation; let us call this the symmetric approach. Strategic observation suggests that when a symmetric approach meets an asymmetric one, there is a relatively low chance that the symmetric approach will prevail. An analogous situation can be found in military strategy, which struggles to deal with asymmetric threats when deploying symmetric warfare initiatives. The analogy is, of course, imperfect, but it conveys the main message.
Given the nature of symmetric and asymmetric approaches, the behavioural approach can be seen as a new tool that asymmetrically counters malicious disinformation by utilizing and targeting the very mechanisms that prey upon the cognitive biases inherent in human nature. To give an example, traditional approaches generally explain the need for vaccination in the following way: 'Getting vaccinated against COVID-19 can lower your risk of getting and spreading the virus that causes COVID-19. Vaccines can also help prevent serious illness and death. All steps have been taken to ensure that vaccines are safe and effective for people ages 5 years and older.'[xii] The behavioural approach would rephrase this in the following way: 'By getting vaccinated you protect not only yourself, but also your loved ones.'[xiii] It is essential to be very careful when drawing up such slogans: a message such as 'If you don't get vaccinated, you can die' can provoke pushback and end up being ignored.
Cognitive biases: How do they work?
Before discussing how the behavioural approach could be applied, it is vital to briefly outline how cognitive biases work in order to understand the complex nature of this issue. The selection of cognitive bias names and some of the examples below is based on Sarvas and Kolomaznik's research project "COVID disinformation and negative impact on the ability to prevent pandemic", which tackles this particular phenomenon:
1) Herd Mentality
- Herd mentality occurs when people blindly copy and follow others, e.g., politicians, gurus, or celebrities. In doing so, they are influenced by emotion rather than by independent analysis. There are four main types of herd mentality: self-deception, heuristic simplification, emotion, and social bias.
2) The Backfire Effect
- The backfire effect refers to strengthening a belief even after it has been challenged. The Backfire Effect has a similar foundation to Declinism, in that we do not like to change our opinions.
3) Declinism
- Declinism examines humans’ tendency to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
4) In-Group Bias
- In-group bias refers to the unfair favouring of someone from one’s own group. You might think that you’re unbiased, impartial, and fair, but we all succumb to this bias, having evolved to be this way.
5) Belief Bias
- We judge an argument’s strength not by how strongly it supports the conclusion but by how plausible the conclusion seems in our own minds.
6) Messenger effect
- Authorities, celebrities, and politicians are considered more trustworthy than others.
7) The False Consensus Effect
- This is the tendency people have to overestimate how much other people agree with their own beliefs, behaviours, attitudes, and values. For example:
• Thinking that other people share your opinion on controversial topics
• Overestimating the number of people who are like you
• Believing that the majority of people share your preferences
8) The Availability Heuristic
- The tendency to estimate the probability of something happening based on how many examples readily come to mind, e.g.:
• After seeing several news reports of car thefts in your neighbourhood, you might start to believe that such crimes are more common than they are.
• You might believe that plane crashes are more common than they really are because you can easily think of several examples.
9) The Optimism Bias
- A tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.
The biases outlined above offer several key takeaways for countering disinformation. If decision-makers understand how cognitive biases work, they can apply a specifically tailored approach to any domain influenced by disinformation narratives.
How could the behavioural approach be applied?
When using the behavioural approach, we need to apply a so-called game-changing approach consisting of several steps that governments should take to counter disinformation. This is a standardized approach utilized by the UK's Behavioural Insights Team (BIT) and its branches worldwide.[xiv] The process traditionally has three steps:
i) Utilize modern technologies and analytical tools
Artificial intelligence, for instance, is able to detect disinformation because it can recognize dissemination patterns. Social media monitoring and scraping tools – used, for instance, by the tech company Gerulata Technologies[xv] – or more advanced analytical tools developed for marketing, such as the software of Signal Analytics,[xvi] can be used to predict which disinformation narratives will achieve maximum reach.
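To make the idea of pattern-based detection concrete, the toy sketch below flags near-duplicate posts that appear in bulk, one crude signal of coordinated amplification. This is only an illustration: the function names, similarity threshold, and sample posts are the author's assumptions, and production tools like those mentioned above rely on far more sophisticated NLP than stdlib string matching.

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """Crude text similarity via stdlib difflib (threshold is an assumption)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def flag_coordinated_posts(posts, min_cluster=3):
    """Greedily group near-duplicate posts; clusters reaching min_cluster
    items suggest coordinated amplification worth an analyst's attention."""
    clusters = []
    for text in posts:
        for cluster in clusters:
            if similar(text, cluster[0]):  # compare to the cluster's first post
                cluster.append(text)
                break
        else:
            clusters.append([text])
    return [c for c in clusters if len(c) >= min_cluster]

# Hypothetical feed: three near-identical copies of one claim plus an unrelated post.
feed = [
    "Vaccines alter your DNA, share before it's deleted!",
    "Vaccines alter your DNA - share before it's deleted!!",
    "vaccines alter your DNA, share before its deleted",
    "Local weather looks fine today.",
]
flagged = flag_coordinated_posts(feed)  # one cluster of three similar posts
```

In practice a flagged cluster would only be a trigger for human review; the behavioural approach then decides whether and how to respond in the relevant touchpoints.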
ii) Anchor in behavioural science and nudging
Regarding disinformation connected to COVID-19 vaccination, there is a wide range of evidence that behavioural nudges can increase vaccination rates.[xvii] Nudges (i.e., small, slight, gentle pushes) can have a significant behavioural impact. This is why several governments have created Nudge Units to encourage energy savings, ecological behaviour, and education for the underprivileged. In countering disinformation, Nudge Units depart from approaches that rely on critical thinking alone; they reverse course and utilize the latest technology in combination with behavioural science. Nudges need to be deployed in the moments that matter (online, offline, in shops, on public transportation). Last but not least, this needs to be done ethically.
The primary example of nudge theory in practice can be seen in the UK, whose government created the Behavioural Insights Team (i.e., the 'nudge unit'), which has provided decision-makers with policy recommendations on countering disinformation in the COVID-19 era. The UK government uses insights about mental processes to change behaviour through coaxing and positive assertion. Rather than forcing people to do things, nudging tweaks the environments in which people make choices – for instance, by requiring people to opt out of organ donation rather than opt in.[xviii]
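The evaluation logic behind nudge trials such as the vaccination megastudy cited above can be sketched as a simple A/B comparison. The counts below are hypothetical, and the two-proportion z-test is just one standard way to check whether a nudged message outperformed a factual control:

```python
from math import sqrt

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Two-proportion z-statistic: did variant B's uptake exceed variant A's?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    return (p_b - p_a) / se

# Hypothetical trial: plain factual message (A) vs 'protect your loved ones' nudge (B),
# 1,000 recipients each; successes = people who booked a vaccination.
z = two_proportion_z(success_a=420, n_a=1000, success_b=468, n_b=1000)
significant = abs(z) > 1.96  # difference detectable at the conventional 5% level
```

A Nudge Unit would run many such variants at once and keep only the framings that measurably move behaviour, which is precisely why the slogan wording discussed earlier matters.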
iii) Target those most vulnerable to disinformation in specific touchpoints
Touchpoints are the places where people interact with information, products, or advertising (e.g., a bus stop, a tram, TV, a shop, an office). By covering all touchpoints, this approach targets online and offline media, public transportation, stores, and relevant events, i.e., the moments when information is relevant. For example, people travelling to work, when there is a higher chance of getting infected, usually pay more attention to information about reducing the risk of infection when entering a tram, subway, or bus than when at home.
Final takeaways
Although the behavioural approach seems promising, it is important to acknowledge studies, such as the article published by the Association for Psychological Science,[xix] which say that a 'nudge' may not be enough to counter disinformation and argue that the effects of the behavioural approach might be overestimated. Furthermore, given the novelty of the topic, there is still a lack of empirical evidence that could deliver clear and tangible results. However, that does not mean that the behavioural approach and the nudging strategy would not work; it is worth enriching current disinformation analyses with new approaches. The first step in promoting and analysing the behavioural approach would be to create more nudging units in NATO member states. Additionally, the capacity-building potential of the behavioural approach could be supported by NATO institutions such as the NATO Strategic Communications Centre of Excellence in Riga and the NATO Public Affairs Office, which is responsible for the NATO Public Affairs Handbook.[xx] Last but not least, the behavioural approach should be used not only by NATO member states to counter disinformation within the NATO area but also in NATO psychological operations[xxi] and NATO military information operations[xxii] to influence an adversary's army and public. For instance, a behavioural approach might be used during civil-military cooperation (CIMIC) operations to target disinformation that negatively influences locals and may obstruct NATO CIMIC activities.
As mentioned above, given that factual argumentation does not always work, it is time to consider new approaches. If more NATO states and institutions create nudging units, the exchange of best practices and lessons learned would help determine whether the behavioural approach should become the dominant approach or merely complement traditional ones.
Acknowledgments
I would like to thank two disinformation experts, Stefan Sarvas and Tomas Kolomaznik, who helped broaden my horizons in the disinformation problematique.
About the Author
Zdeněk Rod is currently a PhD candidate in International Relations, focusing on conflict management, at the Department of Politics and International Relations of the University of West Bohemia in Pilsen, where he has been involved in several research projects. In his PhD thesis, he focuses on implementing security-development nexus approaches in post-conflict environments. He is also interested in the People’s Republic of China’s role in world affairs and in hybrid warfare. He has studied at universities in Ljubljana, Budapest, and Brussels, and has conducted several research visits, for instance at the NATO CIMIC Centre of Excellence in The Hague. He also works as a Minister Counsellor at the Czech Ministry of Defence and as a research fellow at the Centre for Security Analysis and Prevention in Prague.
Notes
[i] Stefan Sarvas, “Dezinformácie: cesta do pekiel nefunkčnej spoločnosti,” infosecurity.sk, 6 September 2021, https://infosecurity.sk/dezinfo/dezinformacie-cesta-do-pekiel-nefunkcnej-spolocnosti/.
[ii] U.S. Department of State, GEC Special Report: Pillars of Russia’s Disinformation and Propaganda Ecosystem (Washington: GEC, 2020), 5.
[iii] Arsalan Bilal, “Hybrid Warfare – New Threats, Complexity, and ‘Trust’ as the Antidote,” NATO Review, 2021, https://www.nato.int/docu/review/articles/2021/11/30/hybrid-warfare-new-threats-complexity-and-trust-as-the-antidote/index.html;
Flemming Hansen Splidsboel‚ Russian hybrid warfare: A study of disinformation (Danish Institute for International Studies, 2017).
[iv] Philipp Lorenz-Spreen and Stephan Lewandowsky et al., “How behavioural sciences can promote truth, autonomy and democratic discourse online,” Nature 4 (2020), https://www.nature.com/articles/s41562-020-0889-7;
Vartika Pundir and Elangbam Binodini Devi eds‚ “Arresting fake news sharing on social media: a theory of planned behavior approach,” Management Research Review 44, no. 8 (2021): 1108–1138;
Siddharth Ramalingam, “How To Fight Fake News With Behavioral Science,” The Decision Lab, 2022, https://thedecisionlab.com/insights/policy/how-to-fight-fake-news-with-behavioral-science/.
Zach Bastick, “Would you notice if fake news changed your behavior? An experiment on the unconscious effects of disinformation,” Computers in Human Behavior 116 (March 2021), https://doi.org/10.1016/j.chb.2020.106633.
[v] Alexandra Chesterfield, Alison Goldsworthy, and Laura Osborne, Poles Apart: Why People Turn Against Each Other, and How to Bring Them Together (London: Penguin UK, 2021).
[vi] Richard H. Thaler and Cass R. Sunstein, Nudge: The Final Edition (London: Penguin Books, 2009).
[vii] Ibid; Daniel Kahneman, Thinking, Fast and Slow (New York: Random House, 2011);
Richard H. Thaler, Misbehaving: The Making of Behavioral Economics (New York: W. W. Norton & Company, 2015).
[viii] Ramalingam, “How To Fight Fake.”
[ix] Skin resistance, previously known as galvanic skin response (GSR), is an older term that has seen less frequent use recently. Newer skin resistance tests focus more on emotional arousal; combining them with measurements of brain or facial reactions shows that a combination of several methods is essential.
[x] Context is essential: the same behavioural approach uses biases differently in different contexts. Touchpoints are places of interaction between people and products or between people and information. The relevant touchpoints are those where there is a higher chance of influencing people; not all touchpoints are equal, as some people are more receptive than others.
[xi] Pierre Bourdieu, Theory of Behaviour (Prague: Karolinum), 4.
[xii] CDC, “Benefits of Getting a COVID-19 Vaccine,” 2022, https://www.cdc.gov/coronavirus/2019-ncov/vaccines/vaccine-benefits.html.
[xiii] Katherine L. Milkman and Mitesh S. Patel eds, “A megastudy of text-based nudges encouraging patients to get vaccinated at an upcoming doctor’s appointment,” PNAS 118, no. 20 (2021), https://www.pnas.org/content/118/20/e2101165118.
[xiv] David Halpern, Inside the Nudge Unit: How Small Changes Can Make a Big Difference (London: WH Allen, 2015);
Moreover, other similar conceptual frameworks are applied, e.g., in Eric Singler, Nudge marketing English Version: Winning at Behavioral Change (London: Pearson, 2015).
[xv] Gerulata, https://www.gerulata.com.
[xvi] Signal Analytics, http://www.signal-inc.com.
[xvii] Hengchen Dai and Silvia Saccardo et al., “Behavioural nudges increase COVID-19 vaccinations,” Nature 597 (2021), https://www.nature.com/articles/s41586-021-03843-2;
Md. Abul Kalam and Thomas P. Davis et al., “Exploring the behavioral determinants of COVID-19 vaccine acceptance among an urban population in Bangladesh: Implications for behavior change interventions,” PLoS One 16, no. 8 (2021), https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0256496;
Amnon Maltz and Adi Sarid, “Attractive Flu Shot: A Behavioral Approach to Increasing Influenza Vaccination Uptake Rates,” National Library of Medicine 40, no. 6 (August 2020), https://pubmed.ncbi.nlm.nih.gov/32772634/.
[xviii] Tony Yates, “Why is the government relying on nudge theory to fight coronavirus?” Guardian, 13 March 2020, https://www.theguardian.com/commentisfree/2020/mar/13/why-is-the-government-relying-on-nudge-theory-to-tackle-coronavirus.
[xix] APS, “A ‘Nudge’ May Not Be Enough to Counter Fake News Online,” 11 June 2021, https://www.psychologicalscience.org/news/releases/2021-june-nudge-fake-news.html.
[xx] NATO, “Public Affairs Handbook,” 2020, https://shape.nato.int/resources/3/website/PA_handbook.pdf.
[xxi] NATO, “NATO Allied Joint Doctrine For Psychological Operations,” 2007, https://info.publicintelligence.net/NATO-PSYOPS.pdf.
[xxii] NATO, “NATO Military Policy For Information Operations,” 2018, https://shape.nato.int/resources/3/images/2018/upcoming%20events/MC%20Draft_Info%20Ops.pdf.