Building Societal Resilience to Disinformation

By Besnik Toverlani

The spread of disinformation is one of the greatest threats to democracies and their populations, especially as technological development opens new loopholes for its dissemination. It has therefore become increasingly important to address the issue through a whole-of-society approach that builds society’s resilience to disinformation. Disinformation is named as part of the new strategic environment in NATO’s 2022 Strategic Concept, which describes it as one of the methods authoritarian actors use to challenge NATO’s values, interests, and way of life.[i] Building resilience increases the stability of societies and supports healthy democratic processes and good governance, which is why NATO’s Resilience Committee has formed the Civil Communications Planning Group (CCPG) and the Civil Protection Group (CPG), both of which address disinformation in their areas of work.[ii] This article will first explain the general importance and possible impact of disinformation. It will then discuss the legal loopholes that can be misused to permit its spread, beginning with the Universal Declaration of Human Rights of 1948, and the limits of warning labels on pages likely to spread disinformation. It will further examine how rapidly false news travels and the means by which it spreads, explaining the logic behind deploying bots and their success in amplifying disinformation. The notion of Synthetic Media, and how easy it has become to manipulate information, will then be addressed with reference to recent disinformation campaigns, focusing on the Russia-Ukraine war. Finally, the importance of pre-empting disinformation will be explained, and the opportunities for building societal resilience to it will be outlined.

The rise and risks of disinformation

The progressive evolution of media and the ever-increasing speed at which information spreads globally are as helpful as they are worrying. For example, in 2012, an earthquake with its epicentre in Costa Rica shook the ground and was felt all the way up to Managua, Nicaragua. The seismic wave took 60 seconds to travel from the epicentre to Managua, yet the first internet post about the earthquake appeared within 30 seconds, spreading the news not only to the people of Managua but “all over the world” before the tremor itself arrived.[iii] Though such information can be helpful to people in affected areas, it is important to acknowledge that in the modern world, disinformation spreads no slower than information.

Disinformation is defined as the deliberate distribution of false information; it is distinct from propaganda, which is information spread deliberately to elicit a desired response from the public.[iv] In practice, however, disinformation is often used in a manner that qualifies it as propaganda. Some sources describe propaganda as well-organized disinformation, so it is not rare for the two to be conflated. Weedon, Nuland, and Stamos distinguish between chaotic disinformation and well-organized disinformation, arguing that the latter is powerful enough to disrupt governance and political campaigns in countries.[v] Chomsky and others view propaganda as a conventional political tool, pointing, for example, to its use under Woodrow Wilson to turn a largely pacifist American public into a war-mongering one.[vi]

To understand the level of threat that disinformation can pose and has posed, consider that Michael Morell, former acting director of the CIA, referred to Russia’s interference in the 2016 U.S. election as “the political equivalent of 9/11.”[vii] In his report to the Attorney General on that interference, Robert Mueller explains how Russia meddled in the 2016 election through diverse operations: chiefly targeted disinformation campaigns on social media, hacking operations, and other operations carried out in person.[viii] The Russian operation’s own term was “information warfare”: combining information that supported one candidate with information that disparaged the other.[ix] Other examples of disinformation campaigns directed at NATO and its Allies include the claim that Canadian troops introduced COVID-19 to Latvia; a forged letter from NATO’s Secretary General claiming that NATO would withdraw its troops from Lithuania because of COVID-19; and claims that COVID-19 was actually created in NATO laboratories as a biological weapon.[x]

In NATO’s Strengthened Resilience Commitment, disinformation is listed among the threats and challenges to NATO resilience, as it is “aimed at destabilising our societies and undermining our shared values; and attempts to interfere with our democratic processes and good governance.”[xi] Sources of disinformation are numerous, including rumours, governments, media, vested interests, and NGOs.[xii] With the development of technology and widening access to the internet, disinformation is deployed as a means to ends that are not necessarily political, and it is important to acknowledge the impact its spread can have on society. The proliferation of information channels and the ease of access to them have limited the power of state-held media in democratic states; political powers and non-state actors therefore increasingly use social media and other independent media to advance their personal and political agendas. Yet even where the risk is acknowledged, there are limits on what a government can do to fight the spread of propaganda.

Freedom of expression and legal implications

Disinformation can be seen as one of the main enemies of democracy; yet democratic systems also contain loopholes that allow its spread and that constrain what state and non-state actors can do to stop the rapid circulation of fake news. After all, “the internet does not adhere to typical standards of the truth, scientific inquiry, and evidence-based news and information.”[xiii]

The Universal Declaration of Human Rights (UDHR) of 1948 was crucial to the spread of democracy throughout the world. The quest for equal recognition of human rights aligned closely with the main elements of democracy. Although Article 21 of the UDHR does not mention democracy by name, it treats democratic principles as fundamental to human rights. It can therefore be said that the Declaration, even though not legally binding in itself, lays the foundation for appropriate human behaviour, listing democracy as the standard to be achieved worldwide.

However, increased focus should be placed on Article 19 of the UDHR, which protects freedom of expression. The article states:

Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

As Article 19 makes clear, all people should be able to hold an opinion, express it, and spread it.[xiv] Nowhere, however, does the article require that opinion to be true or accurate. The right is limited only by the International Covenant on Civil and Political Rights (ICCPR), which permits restrictions solely to respect the rights and reputations of others and to protect public order and national security. Furthermore, a report presented at the 43rd Session of the Office of the High Commissioner for Human Rights (OHCHR) indicated that states were using censorship, ostensibly to stop electoral rigging and the influence of outside actors on their elections. In their 2017 joint declaration, the special rapporteurs on freedom of expression had identified such measures as a threat to democracy, holding that “general prohibitions on the dissemination of information based on vague and ambiguous ideas, including ‘false news’ or ‘non-objective information’, are incompatible with international standards for restrictions on freedom of expression [...] and should be abolished.”[xv] In other words, there is a risk in addressing disinformation too broadly: a generalization of the term could itself harm democracy, which serves “disinformants” by limiting the legitimate actions governments can take against the problem.

Methods of spreading disinformation

In a 2020 study of individuals’ cognitive biases, Menczer and Hills determined that one of the most important factors aiding the spread of disinformation is information overload. Through several experiments conducted by their OSoMe team, they found that because attention spans are short and no one can process everything presented in a newsfeed, people tend to read, and believe, only the few items that appear at the top of the feed: the information served to them first.[xvi]

Through OSoMe, Menczer and Hills also established the role of social media bots in accelerating the spread of disinformation. Platforms tend to prioritize heavily liked posts, placing them at the top of users’ newsfeeds ahead of posts with fewer likes. This mechanism was exploited in the 2016 election as well: OSoMe estimated that almost 15 percent of active accounts on Twitter were actually bots, used to amplify disinformation through retweets, shares, and likes.[xvii] Behavioural scientist Caroline Orr calls this trend “the illusion of popularity,” which has made, and still can make, people believe a claim simply because it appears popular.[xviii] One of the most recent examples is the widely shared picture of an alleged explosion near the Pentagon on 22 May 2023, which spread through multiple bot accounts and was reposted by official foreign media pages on Twitter before it was identified as AI-generated.
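To make these two mechanics concrete, the following is a minimal Python sketch of how an engagement-ranked feed, combined with users’ limited attention, lets a small bot network manufacture the “illusion of popularity.” It is an illustration only: the like counts, noise level, bot count, and attention span are all invented parameters, not figures from the OSoMe research.

```python
import random

# Toy model of an engagement-ranked feed with limited attention.
# All numbers here (likes, noise, bot count) are invented for illustration.
random.seed(1)

posts = [{"likes": random.randint(0, 60), "false": False} for _ in range(20)]
fake_post = {"likes": 0, "false": True}  # the fabricated story starts unnoticed
posts.append(fake_post)

N_BOTS = 100   # size of a hypothetical bot network
ATTENTION = 3  # items a user actually reads before scrolling away
N_USERS = 1000

def users_exposed():
    exposed = 0
    for _ in range(N_USERS):
        # Rank by engagement, with mild per-user variation in the feed.
        feed = sorted(posts,
                      key=lambda p: p["likes"] + random.gauss(0, 10),
                      reverse=True)
        # A user is "exposed" if the fabricated story lands in the
        # handful of items they actually read.
        if any(p["false"] for p in feed[:ATTENTION]):
            exposed += 1
    return exposed

print("users exposed before bot amplification:", users_exposed())
fake_post["likes"] += N_BOTS  # each bot "likes" the fabricated story once
print("users exposed after bot amplification: ", users_exposed())
```

In this toy model the fabricated story is almost never seen while its like count is organic, but once a hundred bots have liked it, it outranks genuine posts in nearly every user’s feed; the ranking algorithm itself does the spreading.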

Deepfakes, cheapfakes, or “shallow fakes” all fall into the category of Synthetic Media: media created or altered through Artificial Intelligence, or manipulated through other technology, analogue or digital.[xix] Such media spreads mainly through social media, after which it is often picked up by official media channels. The difference between the two lies in the technology: deepfakes use advanced machine learning, such as face swaps, through which an individual can “appear” in places they have never been, or lip syncing, through which a given speech can be altered or an entirely new one generated by AI. Cheapfakes are far easier to create and include slowing recorded speech, cutting videos or reordering sentences to alter the meaning of a speech, and accelerating footage. Combined with the rapid spread of false information, this poses a grave challenge to a society that is increasingly susceptible to false information and finds it ever harder to differentiate between what is true and what is not. As such, social media today is not only a form of connection for people around the world but also a conduit for fake news spread for the benefit of malign individuals or groups.
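To illustrate how low the barrier to a cheapfake is, the sketch below slows a clip to 75 percent speed, the same trivial edit behind widely reported “slowed speech” cheapfakes. It assumes the open-source ffmpeg tool is installed; the file names are hypothetical.

```python
import subprocess

# Illustrative sketch: produce a "slowed speech" cheapfake with ffmpeg.
# "speech.mp4" and "slowed.mp4" are hypothetical file names.
SPEED = 0.75  # play back at 75% of the original speed

subprocess.run([
    "ffmpeg", "-i", "speech.mp4",
    # Stretch the video timestamps and slow the audio tempo to match,
    # preserving the speaker's pitch so the edit is harder to notice.
    "-filter_complex",
    f"[0:v]setpts=PTS/{SPEED}[v];[0:a]atempo={SPEED}[a]",
    "-map", "[v]", "-map", "[a]",
    "slowed.mp4",
], check=True)
```

A single command, freely available software, and no machine learning at all: that is what separates cheapfakes from deepfakes, and why they remain the more common form of manipulation.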

Difficulties in the war against disinformation

In a 2018 study conducted at MIT, Sinan Aral, professor and co-author of the study, concluded that “falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude.”[xx] The study also found that a false story reaches an audience of 1,500 people about six times faster than a true story. Consequently, even when false information is debunked, there is no guarantee that the correction will reach the same audience the falsehood initially did. It has thus become ever more difficult for the average citizen to distinguish fake news from real news, especially with the advance of Synthetic Media.

Furthermore, it is important to understand that simple debunking, one of the most frequently used techniques, is not by itself an effective approach to building resilience. According to Nathan Walter, a Northwestern University professor who studies the correction of misinformation through meta-analysis, even where systems are in place to counter disinformation by debunking or correcting falsehoods, they do not entirely eliminate its effect: false information continues to reside in an individual’s memory, and the impact of a fact-check may fade over time, leaving us with the memory of the false information we absorbed first. His analysis also established that corrected information cannot be assumed to reach the exact audience the false information reached.[xxi] Where debunking is nevertheless the chosen approach, it is effective only when paired with an alternative explanation: one that is not merely plausible but also explains why the initial information was thought to be true.

Since the Russian invasion of Ukraine, Russian disinformation has increased significantly, aiming to cause confusion, build support for Russian objectives, and undermine the legitimacy of the Ukrainian response. This is done through a combination of official and unofficial channels, especially Twitter, TikTok, Instagram, YouTube, and Facebook, as well as comments placed in other media outlets. Such disinformation campaigns were prevalent even before the war. Professor Scott Radnitz explains that “Russia and other post-Soviet states are also prone to claim a ‘provocation,’ which frames any military action as a justified response rather than a first move.”[xxii] Considering the enormous flow of online information about the conflict, Justin Pelletier, professor at the Rochester Institute of Technology, writes: “This underscores how difficult it is to be certain of the truth with a high volume of fast-changing information in an emotionally charged, high-stakes situation like warfare.”[xxiii]

At the same time, social media platforms such as Twitter have begun placing warning labels on accounts considered likely to spread disinformation; however, “these rules do not apply to government-controlled accounts not labelled as media, such as foreign embassies,”[xxiv] which increases the possibility of such accounts serving as hubs for disinformation. One example: 75 Russian government accounts posted 1,157 tweets between 25 February and 3 March 2022, the majority of them about Ukraine and focused on justifying the invasion.

Building Societal Resilience to Disinformation

Building resilience to disinformation requires a “whole-of-government and a whole-of-society” approach. The UN General Assembly resolution on Global Media and Information Literacy Week encourages states to “develop and implement policies, action plans and strategies related to the promotion of media and information literacy, and to increase awareness, capacity for prevention and resilience to disinformation and misinformation, as appropriate.”[xxv]

One of the pillars of building societal resilience to disinformation is the promotion and enhancement of media and information literacy. In a 2021 Ipsos survey of participants from 11 European countries, only 9 percent of respondents had ever taken part in any form of learning program designed to teach the use of online tools for distinguishing true from false information.[xxvi] Education on media literacy should be a daily and constantly progressing endeavour; however, in order to teach society, we first need trained, professional educators. Programs such as Stanford’s Civic Online Reasoning project, which helps educators teach students the techniques fact-checkers employ to judge the trustworthiness and credibility of online sources, are therefore a basic requirement for beginning to build resilience to disinformation.[xxvii] Given the diversity of our societies, media literacy should reach, as far as possible, all of society, which requires diverse ways of teaching it beyond schools and universities alone. The Latvian Government, as an example of a whole-of-government and whole-of-society approach, has increased the number of stakeholders in its guidelines and opened the way for most of its ministries to take part in raising media literacy.[xxviii] In a NATO setting, NATO’s 2030 Young Leaders recommended Alliance-wide exercises to increase the efficiency of reactions to disinformation, to further help identify it, and to train non-specialised personnel to aid identification at the local level.[xxix]

Another method that can be used to increase society’s resilience to disinformation is pre-bunking. Rather than debunking false information after the fact, pre-bunking anticipates the techniques that will be used to spread disinformation and informs people about them in advance; this is also called “inoculation theory.”[xxx] Jon Roozenbeek and colleagues conducted seven studies using a set of inoculation videos describing the manipulation techniques commonly used to spread disinformation. They found that watching such videos improved subjects’ ability to identify the manipulation techniques often used in spreading false information. The videos were shown as advertisements on platforms such as YouTube, Twitter, TikTok, and Meta.[xxxi] Another approach to pre-bunking comes from the Dutch platform DROG, whose Bad News game casts the player as a creator of disinformation. This method can be particularly effective because it places individuals in a position from which they can understand more deeply the causes, effects, and methods of spreading disinformation. Through the game, players developed “mental antibodies,” building attitudinal resistance against future attempts to persuade them.[xxxii] Governments could apply inoculation theory, after establishing the disinformation methods common in their territories, to build resilience and help populations that are otherwise highly vulnerable to disinformation identify false information.[xxxiii]
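As a rough illustration of the pre-bunking format, the sketch below shows a minimal “spot the manipulation technique” quiz of the kind an inoculation intervention might use: the reader is exposed to a weakened dose of a technique and immediately given a corrective explanation. The headlines, technique labels, and explanations here are entirely invented; real interventions such as the Roozenbeek videos or the Bad News game are far richer.

```python
# Hypothetical inoculation-style quiz: show a "weakened dose" of a
# manipulation technique, ask the user to name it, then explain it.
# All headlines, labels, and explanations below are invented examples.

ITEMS = [
    ("Experts HATE this! The truth THEY don't want you to see!",
     "emotional language",
     "Exaggerated emotion is used to bypass critical thinking."),
    ("Thousands of accounts agree: this story must be true.",
     "false consensus",
     "Popularity can be manufactured by bots and says nothing about truth."),
    ("Either you share this warning, or you don't care about your family.",
     "false dilemma",
     "Only two options are presented when many more exist."),
]

score = 0
for headline, technique, explanation in ITEMS:
    print(f'\nHeadline: "{headline}"')
    guess = input("Which manipulation technique is this? ").strip().lower()
    if guess == technique:
        score += 1
        print("Correct!", explanation)
    else:
        # The corrective feedback is the "inoculation" step.
        print(f"This was '{technique}'.", explanation)

print(f"\nYou spotted {score}/{len(ITEMS)} techniques.")
```

The design choice mirrors the theory: pre-exposure plus explanation is intended to build recognition of the technique itself, so that resistance transfers to new false claims rather than to one debunked story.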

Building societal resilience to disinformation can drastically lower its impact on society and the state. It is, however, an ongoing process that requires continuous learning and active engagement. A collective approach among governments, non-governmental organizations, academia, journalists, disinformation experts, and others is needed to ensure a sustained effort against disinformation campaigns. The strategies and policies developed by NATO and its Allies should include active engagement in all areas, including, but not limited to, increasing media literacy, raising awareness, fact-checking, inoculation or pre-bunking, and, when needed, accurate debunking. At a time when disinformation has the power to disrupt entire societies and undermine trust, implementing such policies benefits not only the individual but also the well-being of society and democracy.

 

About the Author

Besnik Toverlani is a 2nd Lieutenant in the Kosovo Security Force and an MSc graduate from Rochester Institute of Technology in Kosovo, where he focused his studies on the historical use of narrative related to Kosovo and Serbia. Currently, his main interests are disinformation, propaganda, and hybrid warfare.

 

Notes

[i] NATO, “NATO 2022 Strategic Concept,” 29 June 2022, https://www.nato.int/cps/en/natohq/topics_56626.htm

[ii] NATO, “NATO Resilience Committee,” 2022, https://www.nato.int/cps/en/natolive/topics_50093.htm

[iii] Markham Nolan, “How to separate fact and fiction online,” 11 December 2012, YouTube, https://www.youtube.com/watch?v=sNV4yIyXXX0

[iv] Johns Hopkins University, “Guides: Evaluating Information: Propaganda, Misinformation, Disinformation,” 2023, https://guides.library.jhu.edu/evaluate/propaganda-vs-misinformation

[v] J. Weedon, W. Nuland, and A. Stamos, “Information operations and Facebook,” 2017, https://i2.res.24o.it/pdf2010/Editrice/ILSOLE24ORE/ILSOLE24ORE/Online/_Oggetti_Embedded/Documenti/2017/04/28/facebook-and-information-operations-v1.pdf

[vi] Noam Chomsky, Media Control – The Spectacular Achievements of Propaganda (New York: Seven Stories Press, 2002).

[vii] Michael Morell and Suzanne Kelly, “Fmr. CIA Acting Dir. Michael Morell: ‘This Is the Political Equivalent of 9/11,’” The Cipher Brief, 11 December 2016, https://www.thecipherbrief.com/fmr-cia-acting-dir-michael-morell-political-equivalent-911-1091

[viii] Robert Mueller, “Report On The Investigation Into Russian Interference In The 2016 Presidential Election,” U.S. Department of Justice, March 2019, https://www.justice.gov/archives/sco/file/1373816/download

[ix] Ibid.

[x] NATO, “NATO’s Approach to countering disinformation: a focus on COVID-19,” July 2020, https://www.nato.int/cps/en/natohq/177273.htm

[xi] NATO, “NATO Resilience Committee,” 2022, https://www.nato.int/cps/en/natolive/topics_50093.htm

[xii] Stephan Lewandowsky, Ullrich K. H. Ecker, and John Cook, “Misinformation and Its Correction: Continued Influence and Successful Debiasing,” Psychological Science in the Public Interest 13, no. 3 (September 2012): 106–131, https://doi.org/10.1177/1529100612451018

[xiii] Anya Schiffrin, “Disinformation and Democracy: The internet transformed protest but did not improve democracy,” Journal of International Affairs 71, no. 1 (2017): 117–126, https://www.jstor.org/stable/26494367

[xiv] Universal Declaration of Human Rights, 1948.

[xv] Office of the High Commissioner for Human Rights, “Joint Declaration on Freedom of Expression and ‘Fake News,’ Disinformation and Propaganda” (2017).

[xvi] Filippo Menczer and Thomas Hills, “Information Overload Helps Fake News Spread, and Social Media Knows It,” Scientific American (2020), https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

[xvii] Ibid.

[xviii] Caroline Orr, “Pro-Trump & Russian-Linked Twitter Accounts Are Posing As Ex-Democrats In New Astroturfed Movement,” Arc Digital, 5 July 2018, https://medium.com/arc-digital/pro-trump-russian-linked-twitter-accounts-are-posing-as-ex-democrats-in-new-astroturfed-movement-20359c1906d3

[xix] Department of Homeland Security, “Increasing Threat of Deepfake Identities,” 2021, https://www.dhs.gov/sites/default/files/publications/increasing_threats_of_deepfake_identities_0.pdf

[xx] Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (2018): 1146–1151, https://doi.org/10.1126/science.aap9559

[xxi] Nathan Walter and Sheila T. Murphy, “How to unring the bell: A meta-analytic approach to correction of misinformation,” Communication Monographs 85, no. 3 (2018), https://doi.org/10.1080/03637751.2018.1467564

[xxii] Scott Radnitz, “What are false flag attacks – and did Russia stage any to claim justification for invading Ukraine,” The Conversation, 14 February 2022, https://theconversation.com/what-are-false-flag-attacks-and-did-russia-stage-any-to-claim-justification-for-invading-ukraine-177879

[xxiii] Justin Pelletier, “Intelligence, information warfare, cyber warfare, electronic warfare – what they are and how Russia is using them in Ukraine,” The Conversation, 1 March 2022, https://theconversation.com/intelligence-information-warfare-cyber-warfare-electronic-warfare-what-they-are-and-how-russia-is-using-them-in-ukraine-177899

[xxiv] Timothy Graham and Jay Daniel Thompson, “Russian government accounts are using a Twitter loophole to spread disinformation,” The Conversation, 15 March 2022, https://theconversation.com/russian-government-accounts-are-using-a-twitter-loophole-to-spread-disinformation-178001

[xxv] UN General Assembly Resolution 75/267, “Global Media and Information Literacy Week,” 25 March 2021.

[xxvi] Ipsos, “Online media literacy: Across the world, demand for training is going unmet,” 15 March 2021, https://www.ipsos.com/en-uk/online-media-literacy-across-world-demand-training-going-unmet

[xxvii] Stanford, “Civic Online Reasoning,” https://cor.stanford.edu/

[xxviii] Alina Clay, “Assessing the significance of media literacy in Latvia: A critical tool of societal resilience,” Latvijas Arpolitikas Instituts, 3 April 2018, https://www.lai.lv/viedokli/assessing-the-significance-of-media-literacy-in-latvia-a-critical-tool-of-societal-resilience-687

[xxix] NATO 2030 Young Leaders Group, “NATO 2030: Embrace the change, guard the values,” https://www.nato.int/nato_static_fl2014/assets/pdf/2021/2/pdf/210204-NATO2030-YoungLeadersReport.pdf

[xxx] Jon Roozenbeek, Sander van der Linden, Beth Goldberg, Steve Rathje, and Stephan Lewandowsky, “Psychological inoculation improves resilience against misinformation on social media,” Science Advances 8, no. 34 (24 August 2022), https://doi.org/10.1126/sciadv.abo6254

[xxxi] Supantha Mukherjee, “Google to roll out anti-disinformation campaign in some EU countries,” Reuters, 25 August 2022, https://www.reuters.com/business/media-telecom/google-roll-out-anti-disinformation-campaign-some-eu-countries-2022-08-24/

[xxxii] Evangeline Verstraelen, “NATO’s response to the COVID-19 infodemic: The role of CIB simulations,” Atlantica, 1 April 2021, https://www.atlantic-forum.com/atlantica/natos-response-to-the-covid-19-infodemic-the-role-of-cib-simulations

[xxxiii] Jon Roozenbeek, Sander van der Linden, Beth Goldberg, Steve Rathje, and Stephan Lewandowsky, “Psychological inoculation improves resilience against misinformation on social media,” Science Advances 8, no. 34 (24 August 2022), https://doi.org/10.1126/sciadv.abo6254

Image: https://www.nato.int/docu/review/articles/2021/08/12/countering-disinformation-improving-the-alliances-digital-resilience/index.html
