NATO's response to the COVID-19 infodemic: The role of CIB simulations

This article focuses on NATO’s ability to counter COVID-19 disinformation, and in particular on the approach taken by Quirks. Quirks, a department of the Dutch counter-disinformation platform DROG, specialises in war-gaming and Coordinated Inauthentic Behaviour (CIB) simulations. DROG is best known for the viral game Bad News, which research by the University of Cambridge has shown to increase players’ ability to recognise disinformation. Bad News embodies the basic tenet of all DROG’s projects and products: our preventative approach, or prebunking.

When it comes to addressing disinformation, there has always been a large focus on content and narratives. However, this approach leaves those involved several steps behind. Instead, DROG focuses on the techniques and tactics used in the creation of disinformation. This makes it possible to take preemptive steps against these factors, which ultimately helps safeguard democratic societies as a whole. In November 2020 DROG conducted a webinar series on the COVID-19 infodemic for the Dutch general public, set in the fictional world of Botteborg. This article examines the role that CIB simulations and interactive webinars of this kind can play in combating the COVID-19 infodemic, and how NATO could in turn use and provide them to replay, research, resolve, and rebuild.

 

Introduction

Disinformation is a term that has become more prominent over the last few years. (2016 elections, anyone?) However, the concept itself has been around for centuries. Though we can now hear children in the playground scream accusations of “fake news” at one another, the idea behind the term first took root in ancient Greece.[i] Starting off as a term used to describe propaganda, its definition quickly expanded to include a wide variety of actions undertaken by individuals to (sometimes purposefully) mislead others, becoming more all-encompassing.

Fake news as a term came to the forefront after Craig Silverman of Buzzfeed used it in a tweet in 2014, and it later became more popular during the 2016 US elections.[ii] However, Silverman has since come to regret his use of the term, stating: “I should have realized that any person, idea, or phrase - however neutral in its intention - could be twisted into a partisan cudgel”.[iii] This twist happened with the help of none other than former President of the United States Donald Trump. In the last five years few individuals have gained a greater following by using the term “fake news” than Trump.[iv] During his time in office the term was used on a regular basis to discredit news sources and win votes. However, his use of it ultimately resulted in large-scale polarisation, within and outside the US. A telling quote from Silverman’s article on the term states: “[...] today a phrase or image can come to mean anything you want it to, so long as you have enough followers, propagators, airtime, attention - and the ability to coordinate all of them”.[v]

The key word in that last sentence is coordinate. It ties into the focus of this article on Coordinated Inauthentic Behaviour (CIB) simulations in response to the COVID-19 infodemic. In this context CIB is defined as “coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation”. This can be either domestic and unrelated to governments, or aimed at assisting government actors.[vi]

This article will argue in favour of the approach that the Dutch counter-disinformation platform DROG takes by focusing on preemptively countering the tactics and techniques underlying the infodemic. Specifically, providing tools that contribute to prebunking, grounded in inoculation theory, and creating CIB simulations will effectively improve policy and increase citizens’ resilience against disinformation.

Before continuing, a brief note on the difference between misinformation and disinformation is in order. For this we take the same approach as Bernd Carsten Stahl and rely on the Oxford English Dictionary. The dictionary defines misinformation as “wrong or misleading information”,[vii] whereas disinformation is “the dissemination of deliberately false information”, a definition that often focuses on the role of governments.[viii] In the latter, the act of misleading is intentional. As mentioned at the start, this article mainly focuses on disinformation.

The article now continues with a brief description of DROG, Quirks, and our reasons for taking an alternative yet effective approach to countering disinformation preemptively.

 

Why you should avoid the narrative approach

DROG, well known in the field of anti-disinformation for its award-winning game Bad News, has developed an effective approach to countering disinformation. In discussions relating to disinformation, many parties have chosen to rely on fact-checking mechanisms. DROG argues that this approach puts the focus on the narratives surrounding disinformation rather than on the actual tactics and techniques that create it. By relying on fact-checking, those involved will never be ahead of the game; they will always be several steps behind the creators of disinformation. DROG therefore chooses an alternative approach that focuses on the tactics and techniques of disinformation and on building societal resilience against them in a preemptive manner.

This particular approach is where the aforementioned game Bad News comes into play. In this game individuals of all ages are placed in the shoes of the “bad guy” and encouraged to create their own disinformation. By actively participating in the game, players gain more followers and can earn six separate badges related to specific tactics of disinformation (e.g., impersonation, polarisation, and several others).[ix] The game, a preventative solution based on inoculation theory, was researched by J. Roozenbeek and S. van der Linden of Cambridge University. Inoculation theory argues that individuals can build “mental antibodies”, similar to the antibodies produced in response to vaccines that use a weakened dose of a virus.[x] As Roozenbeek and van der Linden state: “[...] by preemptively exposing people to a weakened version of a (counter)-argument, and by subsequently refuting that argument, attitudinal resistance can be conferred against future persuasion attempts”.[xi] Further research from Cambridge University concluded that playing Bad News effectively inoculated civilians against disinformation, thereby increasing their resilience going forward.[xii] This process of creating “mental antibodies” is also called “cognitive immunity” and leads to what Roozenbeek and van der Linden call prebunking (i.e., “preventative strategies against the spread of disinformation”).[xiii]

However, the Bad News game is not the only solution DROG offers. In addition to serious gaming, DROG has a wide variety of programmes, ranging from the monitoring and analysis of disinformation to the replaying of disinformation datasets in CIB simulations, which test and improve (international) policy where necessary.[xiv] The latter is offered by Quirks in a sandbox environment built within Mastodon, an existing social network that functions similarly to Twitter. This safe training and testing environment provides insights that would otherwise be challenging to gather. The approach is discussed further in the solution section of this paper.

 

The COVID-19 infodemic: What it means and why it matters

The outbreak of COVID-19 has led to the creation of an enormous amount of disinformation on an international scale. A majority of the misleading information has been spread via social media. Almost everyone has received a message or encountered a post via social media stating that “a friend of a friend’s cousin said x, y, z” about the virus or the way in which it is spread. The messages so far have caused stress, uncertainty, and long-term damage to public health.[xv] The World Health Organization (WHO) defines an infodemic as: “too much information including false or misleading information in digital and physical environments during a disease outbreak. It causes confusion and risk-taking behaviors that can harm health”.[xvi]

Examples of misleading information include claims that the virus was created in a lab in Wuhan, China, and that the 5G network can cause COVID-19.[xvii] Another telling example is the set of theories surrounding Bill Gates, who is supposedly using the COVID-19 vaccine to insert microchips into its recipients.[xviii] This is a lot of disinformation to tackle, and research relating to COVID-19 fact-checking has once again demonstrated the inefficiency of fact-checking mechanisms at large.[xix] Measures taken by Facebook to fact-check information did not stop conspiracies from spreading on its platform. Even after the false information has been debunked, affected individuals are likely to continue believing it.[xx]

To help combat the COVID-19 infodemic and support public resilience, DROG and its serious gaming department adapted the scenario of the original Bad News game to include storylines on virus conspiracies, and even created a variant called Go Viral! in collaboration with Cambridge University and the UK Cabinet Office.[xxi] Interestingly, van der Linden and Roozenbeek argue that if a large number of people play either of these games, “social herd immunity” can be achieved.[xxii] This means that the positive impact of the games extends well beyond the specific individuals who actively participated.

Though this method has proven highly successful, DROG provides several other solutions to combat disinformation, and the COVID-19 infodemic in particular. These alternatives are addressed in the final section of this paper and include the role of a multi-stakeholder approach and its benefits for improving and testing policy on countering disinformation.

 

Our Solution

As discussed throughout this article, DROG offers various solutions for countering disinformation. The aforementioned Quirks in particular focuses on CIB simulations. A simulation of a disinformation campaign gives a deeper understanding of the causes and effects of specific events in real time. The simulations take place in Mastodon, an existing social media platform on which we have based our sandboxed environment. This enables us to upload existing datasets on a variety of topics and gives participants the opportunity to experience and learn from them in a safe space.
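To make this concrete, the sketch below shows one way such a dataset replay could be wired up against a self-hosted Mastodon instance using the open-source Mastodon.py client. It is a minimal illustration only, not a description of Quirks’ actual infrastructure: the instance URL, access token, and dataset columns are hypothetical placeholders.

```python
# Minimal sketch (not Quirks' actual setup): replaying a dataset of archived
# posts into a sandboxed, self-hosted Mastodon instance through the public
# Mastodon REST API, using the Mastodon.py client.
# The instance URL, access token, and dataset columns are hypothetical.
import csv
import time

from mastodon import Mastodon  # pip install Mastodon.py

sandbox = Mastodon(
    access_token="REPLACE_WITH_SANDBOX_ACCOUNT_TOKEN",
    api_base_url="https://sandbox.example.org",  # isolated training instance
)

with open("campaign_dataset.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # assumed columns: text, delay_seconds
        # Re-post each archived message as a toot in the sandbox.
        sandbox.status_post(row["text"], visibility="public")
        # Pace the replay so participants experience the campaign unfolding over time.
        time.sleep(float(row.get("delay_seconds", 1)))
```

Because the instance is self-hosted and isolated, nothing posted during such a replay can leak into real social networks, which is what makes the environment a safe space for training and testing.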

The simulations are useful both for the general public and for policymakers who focus on disinformation. Policymakers can test current and future policies to ensure efficacy and make adjustments where necessary. Past events can also be replayed to gather new insights and develop hands-on tools to combat disinformation. Another direction the simulations can take is war-gaming. Participants can be divided into blue and red teams, taking on the roles of distributors of disinformation (red) and agents countering disinformation (blue). As in the Bad News game, temporarily taking on the role of the bad guy offers a new perspective that can improve the quality of the anti-disinformation mechanisms used. This approach may be of particular interest to NATO given its use of war-gaming to improve its day-to-day work.

However, Quirks’ use of simulations is not only useful for high-level policymakers but can also be directed towards the general public. In November 2020 we organised a webinar series with the US Embassy in The Hague, specifically on the COVID-19 infodemic. In the webinars participants were given a personalised passport to our fictional digital world of Botteborg, which was experiencing a pandemic similar to the one we currently face. With this passport participants could access Mastodon and complete specific assignments that, amongst other things, required them to flag any accounts or toots (the Mastodon equivalent of tweets) that they deemed to contain false information. These assignments were completed with the help of a storyline they could follow in a four-part series on YouTube. Each part focused on a different area in which disinformation often comes to the forefront, such as viral messaging and state actor activity. This set-up, a simulation administered in smaller doses, was inspired by both the Bad News and Go Viral! games and follows the inoculation theory used in both. By providing this type of simulation, the general public gets to practise recognising disinformation in a safe space that mirrors real life.
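For illustration only: participants flagged content through Mastodon’s normal web interface, but the same action can also be expressed against Mastodon’s public report API. The sketch below assumes the same hypothetical sandbox instance and token as above, and the trigger phrase is invented.

```python
# Illustration only: the "flag a toot" action performed by Botteborg participants,
# expressed against Mastodon's public report API via the Mastodon.py client.
# The instance URL, token, and trigger phrase are hypothetical.
from mastodon import Mastodon  # pip install Mastodon.py

participant = Mastodon(
    access_token="REPLACE_WITH_PARTICIPANT_TOKEN",
    api_base_url="https://sandbox.example.org",  # the same sandboxed instance
)

# Scan recent toots on the sandbox's public timeline for suspect content.
for status in participant.timeline_public(limit=40):
    if "miracle cure" in status["content"].lower():  # hypothetical trigger phrase
        # File a report against the posting account, attaching the toot itself,
        # so facilitators can later review which items participants flagged.
        participant.report(
            status["account"]["id"],
            status_ids=[status["id"]],
            comment="Flagged during the Botteborg exercise: suspected disinformation.",
        )
```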

Another DROG activity that closely relates to NATO is a joint project with the Atlantic Forum that uses peer education in European classrooms to teach teenagers how to recognise disinformation using the Bad News approach and to increase their resilience.[xxiii] This project is based on Under Pressure, another serious game from Bad News, DROG’s gaming department.

Going forward, Quirks will continue DROG’s collaboration with parties such as Cambridge University. It also intends to continue developing and collaborating with partners such as Microsoft, the US State Department, and the European Commission. This public-private partnership approach provides space for community-driven research and creates an ecosystem in which cutting-edge initiatives can grow from a holistic perspective, whilst ensuring that all bases are covered when it comes to countering disinformation and providing the right solutions.

 

Conclusion

This article has examined the COVID-19 infodemic and the countering of disinformation from the unique perspective and approach offered by the Dutch counter-disinformation platform DROG. It has looked at the role serious games can play in today’s society and how the games Bad News and Go Viral! have successfully made individuals more resilient against disinformation.[xxiv] This approach, introducing participants to the tactics and techniques of disinformation in small doses so that they are better equipped to recognise it in the long run, is linked to inoculation theory and is something DROG likes to call prebunking.

To answer the underlying question of this paper: for NATO to respond successfully to this ongoing infodemic, measures similar to the ones offered by DROG need to be implemented. By shifting focus away from narratives towards the technical side, and by including individuals at all levels, from high-level officers to the general public, NATO can have a positive impact in preemptively combating the infodemic.[xxv]

As argued in this paper, DROG believes that the only answer to the infodemic we are currently facing is to take preventative measures. DROG offers these in the form of serious games, simulations, and monitoring and analysis tools. The public-private partnership approach and the network DROG has established around it help it succeed in this goal.

 

Notes

[i] Peter S. Field, “Fake news was a thing long before Donald Trump - just ask the ancient Greeks”, The Conversation, February 25, 2021, https://theconversation.com/fake-news-was-a-thing-long-before-donald-tru...

[ii] Craig Silverman, “I helped popularize the term fake news and now I cringe”, Buzzfeed News, December 31 2017,  https://www.buzzfeednews.com/article/craigsilverman/i-helped-popularize-....

[iii] Silverman, “I helped popularize”.

[iv] Mike Wendling, “The (almost) complete history of fake news”, BBC, January 22, 2018, https://www.bbc.co.uk/news/blogs-trending-42724320.

[v] Silverman, “I helped popularize”.

[vi] “What is CIB?”, December 2020 Coordinated Inauthentic Behaviour Report, Facebook, accessed March 3, 2021, https://about.fb.com/news/2021/01/december-2020-coordinated-inauthentic-...

[vii] Bernd Carsten Stahl, “On the Difference or Equality of Information, Misinformation, and Disinformation: A Critical Research Perspective”, Informing Science Journal, Volume 9 (2006): 86.

[viii] Stahl, “On the Difference”, 86.

[ix] “Bad News”, Bad News, accessed March 3, 2021, https://www.getbadnews.com.

[x] Jon Roozenbeek and Sander van der Linden, “The Fake News Game: actively inoculating against the risk of misinformation”, Journal of Risk Research (February 2018): 2.

[xi] Ibid., 2.

[xii] Ibid., 7.

[xiii] Melisa Basol, Jon Roozenbeek, and Sander van der Linden, “Good News about Bad News: Gamified Inoculation Boosts Confidence and Cognitive Immunity Against Fake News”, Journal of Cognition 3, no. 1 (2020): 1.

[xiv] DROG recently released two new games: one called Harmony Square that focuses on disinformation surrounding elections and another game called Go Viral that addresses COVID-19 conspiracies.

[xv] Sander van der Linden, Jon Roozenbeek, and Josh Compton, “Inoculating Against Fake News About COVID-19”, Frontiers in Psychology, 11:566790 (October 2020).           

[xvi] “Overview”, The COVID-19 Infodemic, WHO, accessed March 3, 2021, https://www.who.int/health-topics/infodemic/the-covid-19-infodemic#tab=t....

[xvii] van der Linden, Roozenbeek, and Compton, 2.

[xviii] Ibid., 2.

[xix] Ibid., 2. 

[xx] Ibid., 2.

[xxi] Ibid., 4; “Go Viral!”, Bad News, accessed March 3, 2021, https://www.goviralgame.com/en.

[xxii] Ibid., 4.

[xxiii] This is specifically based on the project called Under Pressure, for more information check: https://www.getunderpressure.com/.

[xxiv] van der Linden, Roozenbeek, and Compton, 2020.

[xxv] Furthermore, local contexts need to be taken into account before implementing any strategies. Quirks will always argue in favour of working with local partners.

Image source: https://www.nato.int/cps/fr/natohq/news_180639.htm?selectedLocale=en

Evangeline Verstraelen

Evangeline Verstraelen is the Program Director of Quirks, the war-gaming and simulation department of the Dutch counter-disinformation platform DROG. The platform, located in The Hague, takes a multi-stakeholder and multidisciplinary approach to countering disinformation. It focuses on the tactics and techniques underlying disinformation rather than on the narratives. Using this approach, DROG stays ahead of the game instead of, quite literally, dealing with disinformation after the fact. In addition to these simulations, DROG also has gamified inoculation interventions under the header Bad News. More recent games have focused on electoral disinformation and the COVID-19 infodemic. Furthermore, DROG’s Forensic Journalism monitoring unit monitors elections and is currently focusing on the Dutch elections of March 2021. Everything we do is analysed using our “black box” method and collected in our inventory.
