Combating Disinformation: Policy Analysis and Recommendations for the 21st Century Information Environment
Information in the 21st century spreads faster and farther than ever before. The combined powers of social media, the 24-hour news cycle, and the internet at large allow organizations to spread information quickly, cheaply, and with limited oversight to previously unimaginable audiences. These same tools have significantly increased the effectiveness of disinformation campaigns, and there is significant evidence that foreign actors have leveraged them to conduct such campaigns against Western liberal democratic nations. There is also evidence suggesting that current disinformation campaigns may be reducing public faith in democratic institutions, exacerbating existing social tensions, and unduly influencing the outcome of elections. Given the increasing effectiveness of these campaigns, fighting disinformation is more important than ever. National security organizations in the United States and across the world are taking steps towards combating disinformation with varying degrees of success. An analysis of current US policy reveals several gaps in its counter-disinformation posture. To address these deficiencies, this paper offers three key policy recommendations that would significantly strengthen the ability of the US to counter the threat of disinformation in the 21st century information environment.
By Alexander G. Fremis*
“[Dezinformatsiya is] like cocaine. If you sniff it once or twice, it may not change your life. If you use it every day though, it will make you an addict – a different man.”[i]
Yuri Andropov
Soviet leader and former Chairman of the KGB
Introduction
In today’s world of social media, cell phones, and “fake news”, disinformation is a significant concern for policymakers in the US and across the world. The internet, and especially social media, have introduced a new, low-cost, highly effective medium of communication. Individuals and organizations can now reach potentially millions of people at the click of a button and at little or no financial cost. While the quick and open sharing of information and ideas across the world has many positive aspects, evidence exists that foreign actors and terrorist groups have leveraged these same social media platforms to propagate disinformation in Western liberal democratic nations.[ii], [iii] The 2014 Russian annexation of Crimea, Brexit, and the 2016 United States presidential election have sparked significant debate about disinformation among national security scholars and policymakers in the US and internationally.[iv] While some attempts have been made at establishing effective counter-disinformation policies, both current US domestic counter-disinformation policy and US participation in international counter-disinformation efforts exhibit notable weaknesses. As such, this paper presents three key recommendations to US national security policymakers for addressing these deficiencies and improving the ability of the US to respond to these new threats in the 21st century information environment.
What is Disinformation?
While no universally accepted definition exists for disinformation, based on available academic literature and government reporting, it can be generally defined as the deliberate dissemination of false or misleading information in pursuit of strategic, political, or social objectives through the distortion of reality.[v], [vi], [vii] Disinformation can take many forms and has historically been promulgated through a wide variety of media, including theatre performances, movies, print articles, and academic research.[viii] In recent years, social media and the internet have revolutionized propaganda and disinformation and now arguably constitute the primary medium through which they are spread.[ix]
Although the study of disinformation has seen renewed interest from academics and national security policymakers in the wake of the 2016 presidential election, the US and the West as a whole have been the targets of disinformation campaigns for many years. In fact, significant evidence exists suggesting that disinformation was a tool of Russian and Soviet foreign policy throughout the 20th century. “Dezinformatsiya,” as it is known in Russian, was a well-recognized tactic of Soviet national security and intelligence organizations such as the KGB and played a key role throughout the Cold War.[x] These campaigns were leveraged by the Soviet bloc in the hopes of “changing the perception of reality of every [Westerner] to such an extent that, despite the abundance of [true] information, no one is able to come to sensible conclusions in the interest of defending themselves, their families, their communities, and their country.”[xi]
Although no single universally accepted definition exists, disinformation is distinct from other forms of information operations. While the terms are often used interchangeably, disinformation differs from misinformation in that disinformation requires the promulgator(s) to know that the information they are sharing is false.[xii], [xiii] As such, false information shared by ignorant third parties is categorized as misinformation due to the lack of malicious intent. Furthermore, disinformation is related to, yet distinct from, historical propaganda and psychological operations. Propaganda is defined by NATO as “Information, especially of a biased or misleading nature, used to promote a political cause or point of view.”[xiv] Although disinformation and propaganda operate within the same information environment and share an inherently biased and misleading nature, this definition does not capture the full effects of disinformation, which go beyond ideological persuasion. Psychological operations (psyops) are defined as “Planned activities using methods of communication and other means directed at approved audiences in order to influence perceptions, attitudes and behavior, affecting the achievement of political and military objectives.”[xv] While partially applicable, this definition again fails to capture the strategic implications of disinformation within the information environment. Furthermore, (dis)information in the 21st century is inherently difficult to control and cannot be effectively contained within “approved audiences.” On the contrary, as will be discussed later, social media and the internet at large have transformed disinformation in the 21st century into an inherently borderless threat. Within the context of these definitions, disinformation is best conceptualized as sitting at the intersection of propaganda and psyops. That is to say, disinformation campaigns are, at their core, psychological operations conducted within the information environment.
Disinformation campaigns have evolved significantly over the course of the Cold War and into the 21st century, and no two campaigns are exactly alike. That said, as outlined by former senior Soviet bloc intelligence officer Ion Pacepa[xvi] and others,[xvii], [xviii] effective disinformation campaigns share several key features. Firstly, they require a “kernel of truth” that provides the foundation on which the deception is built. Secondly, the disinformation must appear to originate from a trusted (ideally Western) source. It is not hard to imagine that the average European or American would be more sceptical of information reported by known foreign sources, whereas information that comes from organizations seemingly based in Western Europe or the United States would be more readily accepted. In the age of social media, this criterion is easier to achieve than ever given the relative anonymity provided by the internet. Thirdly, and arguably most importantly, disinformation campaigns require “useful idiots” [sic] who will erroneously accept the distorted facts as reality and share them amongst their colleagues, friends, and family. In the age of social media, these “useful idiots” are more effective than ever given the tendency of provocative posts to “go viral” and the ability of organizations to leverage bots to boost online content viewership. Anyone with a moderately active Facebook account simply needs to open their newsfeed and begin scrolling to see this third pillar of disinformation in action today.
Countering Disinformation in the American Context
“How often does it occur that information provided you on morning radio or television, or in the morning newspaper, causes you to alter your plans for the day, or to take some action you would not otherwise have taken, or provides insight into some problem you are required to solve?”[xix]
Neil Postman,
20th century writer and cultural theorist
The US is currently facing a new chapter of foreign subversive propaganda and disinformation. Recognizing the difficulties experienced in protecting the integrity of the 2016 presidential election,[xx] the US has taken steps in recent years to modernize and improve its ability to respond to disinformation. These include the establishment and expansion of the Global Engagement Center, investigations carried out by the US security and intelligence community, and cooperation with social media industry leaders. Internationally, the US has also played a part in counter-propaganda and counter-disinformation initiatives in conjunction with NATO.[xxi] This largely amounts to token US participation in the London and Brussels declarations – both of which identified disinformation as a significant threat facing the West.[xxii], [xxiii]
Apart from token responses to international declarations, US government messaging and counter-disinformation activities today are coordinated through the Global Engagement Center (GEC).[xxiv] Operating under the US State Department, GEC represents the only domestic US department/agency specifically mandated to counter disinformation, propaganda, and other malicious influence activities within the information environment. First established by executive order in 2016 under the Obama administration, GEC has seen its mandate and legislative direction amended several times. Most recently, the National Defense Authorization Act for Fiscal Year 2019 (NDAA 2019)[xxv] established GEC’s current mandate, from which its lines of effort are derived. Specifically, this act identifies GEC’s mandate as follows:
direct, lead, synchronize, integrate, and coordinate efforts of the Federal Government to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation efforts aimed at undermining or influencing the policies, security, or stability of the United States and United States allies and partner nations.[xxvi]
While establishing and maintaining a lead agency to coordinate a whole-of-government approach to disinformation is undoubtedly important, there are significant limitations on GEC’s operational capabilities. Most notably, GEC does not conduct any public messaging itself. This contrasts with GEC’s predecessor organization, the Center for Strategic Counterterrorism Communications (CSCC). As its name suggests, CSCC was established in 2011 in response to the significant emphasis placed on propaganda by terrorist organizations such as al Qaeda and Daesh and an operational requirement for the US government to push back in the information environment.[xxvii] While CSCC’s effectiveness was hampered by a relatively small budget and limited personnel,[xxviii] it was able to contest the previously extremist-dominated information environment through its own messaging, with limited but arguably real success.[xxix], [xxx] The importance of this capability is echoed by the experiences of former Under Secretary of State for Public Diplomacy and Public Affairs Richard Stengel in his 2019 memoir Information Wars.[xxxi] As such, the absence of a messaging capability represents a significant gap in GEC’s ability to counter disinformation and to shape information operations in the 21st century.
The US justice, national security, and intelligence communities have also played an increasingly significant role in combating disinformation and propaganda since the 2016 US presidential election. While much information concerning current national security threats and investigations is classified, some FBI investigations[xxxii] as well as the 2017 to 2019 Special Counsel investigation led by Robert Mueller[xxxiii] have been (partly) released to the public. While the outcomes of these investigations are either still up for debate or have had arguably limited effect, they set important precedents for combating disinformation. Although not necessarily tied to official government policy, setting such precedents could reasonably be expected to catalyse future policy changes. While there is no formula for fighting disinformation in liberal democratic societies, bringing it into the open is arguably a step in the right direction.
In addition, since the 2016 presidential election, the US government and the international community have made efforts to engage with private industry on countering disinformation. Namely, the US government has engaged social media and internet companies such as Facebook, Google, and Microsoft across a variety of fronts aimed at combating disinformation on their platforms.[xxxiv] Examples include Facebook’s fact-checking initiative as well as its initiative to provide verified identity, location, and history information for pages and groups.[xxxv] Social media industry giants have also taken meaningful steps towards eliminating the use of bots on their platforms.[xxxvi] While these initiatives are a critical step in the right direction, the current US administration has not solidified this progress into legislation or created regulatory measures aimed at social media. This contrasts with nations across the world that have done so to varying degrees.[xxxvii] Although internet regulation is a complex issue unto itself, given the threat of disinformation being spread on social media sites, it may be reasonable for the United States to legislate or regulate social media corporations. Doing so while respecting protections for free speech would, of course, require significant discussion beyond the scope of this paper.
The Way Forward?
Moving forward, there are several initiatives US policymakers could pursue to fill the previously mentioned gaps in the US government’s ability to counter propaganda and disinformation. To that end, the following three policy recommendations are offered. Together, they would facilitate a more assertive and robust US posture against the threat of disinformation.
1. Issue a National Counter-Disinformation Strategy and give GEC the ability to carry out messaging itself
As previously mentioned, GEC today has neither the mandate nor the operational capability to disseminate information on its own. Although its legislative predecessor, CSCC, had this exact capability, it was not passed on to GEC when its mandate was last updated in NDAA 2019. The ability to counter a foreign actor’s propaganda and disinformation with truthful information is a key aspect of securing the information environment.[xxxviii] While future NDAAs could be an appropriate vehicle for making these changes, other, more meaningful policy tools are available. That is to say, the creation of a national US counter-disinformation strategy, whether through an act of Congress or through an executive order, would not only be an effective vehicle for effecting such changes but would also send a strong message about the US government’s commitment to combating disinformation and propaganda in the 21st century.
2. Target and degrade the ability of foreign actors to make use of bots in partnership with social media industry leaders
The US security and intelligence community should continue to identify and target foreign actors conspiring to spread malicious disinformation. More specifically, rather than maintaining its current reactive posture, the community should leverage relevant intelligence to degrade the ability of foreign actors to spread disinformation before they are able to do so. While there would be many legal and operational challenges, one of the most readily available avenues is targeting internet bots. As previously mentioned, internet bots play a critical role in spreading disinformation on social media. Degrading foreign actors’ use of bots would significantly impair their ability to leverage social media and the internet to spread disinformation. This should be accomplished through cooperation with social media industry leaders to remove bots from their platforms. Those same industry leaders should also coordinate their aforementioned fact-checking efforts. Both initiatives, pursued in partnership with industry, would yield significant benefits in the fight against disinformation. As previously mentioned, regulating the social media industry in a similar fashion to other mass-media corporations may hold some benefits for combating disinformation; however, there remain significant questions outside the scope of this paper concerning how such regulations would be enacted.[xxxix]
3. Increase partnerships with the international community and leverage pre-existing partnerships
Since the challenges experienced during the 2016 presidential election, the US government and governments from around the world have begun establishing partnerships to combat disinformation and propaganda. Although US foreign policy has become less stable since that election, international partnerships remain a significant avenue and potential force multiplier in combating disinformation. The NATO Strategic Communications Center of Excellence[xl] is one such example, although there is little publicly available information concerning the United States’ specific contributions to it. By combining its efforts with those of other Western nations, the US would magnify the effectiveness of its counter-disinformation policy. The principle of international cooperation has formed the basis for much of the West’s collective prosperity from the end of the Second World War to the present. As previously mentioned, disinformation is an inherently borderless threat facing all Western liberal democratic nations. As such, the international community must be a part of any response to it.
Conclusion
With the 2020 US presidential election campaign in full swing, combating disinformation is more important than ever. Foreign state and non-state actors have made use of new tools such as social media and the internet at large to increase the effectiveness of their disinformation and propaganda campaigns. While different disinformation campaigns carry unique messages in pursuit of their respective strategic, social, or economic goals, they all share the overall goal of reducing the stability and security of the West. Beginning in the aftermath of the troubled 2016 US presidential election, the US has made some efforts to enhance its ability to respond to propaganda and disinformation along domestic and international lines of effort. That being said, by acting on the recommendations put forth in this paper, US policymakers could further enhance their ability to counter the threats posed by disinformation and propaganda. Disinformation and propaganda are inherently difficult to address in Western liberal democratic nations given our foundational tenets of openness and free expression of ideas. As such, Western nations combating disinformation are challenged to do so in a way that does not undermine the liberal democratic traditions at the foundation of our societies.
About the Author
Alexander Fremis* currently serves as an Officer in the Canadian Army. He has previously served internationally as part of the NATO enhanced Forward Presence Battlegroup Latvia as well as on domestic disaster-assistance operations within Canada.
Alexander graduated in 2017 from the Regular Officer Training Program at the Royal Military College of Canada with a BA in Psychology and a minor in Political Science. He is currently a graduate student at Wilfrid Laurier University pursuing a Master of Public Safety specializing in National Security. His primary research and professional interests include national security, intelligence, and the threat disinformation poses to 21st century Western liberal democracies.
Alexander can be contacted at Fremis.Alexander@gmail.com
*The opinions expressed in this work are solely those of the author and do not necessarily represent the views of any organization(s) affiliated with the author.
Notes
[i] I.M. Pacepa and R. Rychlak, Disinformation - Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism (Washington, DC: WND Books, 2013).
[ii] Neil MacFarquhar, “A Powerful Russian Weapon: The Spread of False Stories,” Atlantic Council, 26 August 2016, accessed 28 September 2020, https://www.atlanticcouncil.org/blogs/natosource/a-powerful-russian-weap....
[iii] Joshua Kurlantzick, “How China Ramped Up Disinformation Efforts During the Pandemic,” Council on Foreign Relations, 10 September 2020, accessed 28 September 2020, https://www.cfr.org/in-brief/how-china-ramped-disinformation-efforts-dur....
[iv] Government of the United States of America, “Assessing Russian Activities and Intentions in Recent US Elections,” Office of the Director of National Intelligence, 6 January 2017, accessed 28 September 2020, https://www.dni.gov/files/documents/ICA_2017_01.pdf.
[v] Pacepa & Rychlak, Disinformation - Former Spy Chief.
[vi] Richard Stengel, Information Wars: How We Lost the Global Battle against Disinformation & What We Can Do about It, First ed. (New York, NY: Atlantic Monthly Press, 2019).
[vii] The New York Times, “Operation Infektion: How Russia Perfected the Art of War,” YouTube, 21 January 2018, accessed 28 September 2020, https://www.youtube.com/watch?v=tR_6dibpDfo&t=1916s.
[viii] Pacepa & Rychlak, Disinformation - Former Spy Chief.
[ix] Government of the United States of America, “GEC Special Report : Pillars of Russia's Disinformation and Propaganda Ecosystem,” Homeland Security Digital Library, August 2020, accessed 29 September 2020, https://www.hsdl.org/?search=&searchfield=title or summary&all=Disinformation and Propaganda&collection=public&submitted=Search.
[x] “Deception Was My Job,” Yuri Bezmenov, YouTube, 11 April 2013, accessed 1 October 2020, https://www.youtube.com/watch?v=jFfrWKHB1Gc.
[xi] “Psychological Warfare Subversion & Control of Western Society,” Yuri Bezmenov, YouTube, 23 February 2011, accessed 1 October 2020, https://www.youtube.com/watch?v=5gnpCqsXE8g&t=1553s.
[xii] Pacepa & Rychlak, Disinformation - Former Spy Chief.
[xiii] The New York Times, “Operation Infektion”.
[xiv] North Atlantic Treaty Organization (NATO), “NATO Glossary of Terms,” 2019, accessed 27 October 2020, https://www.jcs.mil/Portals/36/Documents/Doctrine/Other_Pubs/aap6.pdf.
[xv] Ibid.
[xvi] Pacepa & Rychlak, Disinformation - Former Spy Chief.
[xvii] “Deception Was My Job,” Yuri Bezmenov.
[xviii] The New York Times, “Operation Infektion.”
[xix] GoodReads, “Neil Postman,” n.d., accessed 25 October 2020, https://www.goodreads.com/quotes/8710374-how-often-does-it-occur-that-in....
[xx] United States of America, Department of Justice, Report on the Investigation Into Russian Interference in the 2016 Presidential Election, Robert Mueller, March 2019.
[xxi] North Atlantic Treaty Organization (NATO), “NATO Strategic Communications Center of Excellence,” NATO StratCom, 2020, accessed 1 October 2020, https://www.stratcomcoe.org/.
[xxii] North Atlantic Treaty Organization, “Brussels Summit Declaration,” 30 August 2018, accessed 28 October 2020, https://www.nato.int/cps/en/natohq/official_texts_156624.htm.
[xxiii] North Atlantic Treaty Organization, “London Summit Declaration,” 4 December 2019, accessed 28 October 2020, https://www.nato.int/cps/en/natohq/official_texts_171584.htm.
[xxiv] Government of the United States of America, “Global Engagement Center,” Global Engagement Center, n.d., accessed 1 October 2020, https://www.state.gov/bureaus-offices/under-secretary-for-public-diploma....
[xxv] United States Congress, House of Representatives, National Defense Authorization Act for Fiscal Year 2019, Washington, DC, Government of the United States of America, 2018.
[xxvi] Ibid.
[xxvii] Stengel, Information Wars: How We Lost.
[xxviii] Ibid.
[xxix] Ibid.
[xxx] Elise Labott, “State Department Releases Graphic Anti-ISIS Video,” CNN, 8 September 2014, accessed 1 October 2020, https://www.cnn.com/2014/09/05/world/state-department-anti-isis-video/in....
[xxxi] Stengel, Information Wars: How We Lost.
[xxxii] Government of the United States of America, “Russian National Charged with Interfering in US Political System,” United States Department of Justice, 19 October 2018, accessed 29 September 2020, https://www.justice.gov/opa/pr/russian-national-charged-interfering-us-p....
[xxxiii] United States of America, Department of Justice, Report on the Investigation Into Russian Interference in the 2016 Presidential Election, Robert Mueller, March 2019.
[xxxiv] “Transcript of Mark Zuckerberg's Senate Hearing,” The Washington Post, 10 April 2018, accessed 1 October 2020, https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-....
[xxxv] Facebook, “Working to Stop Misinformation and False News,” n.d., accessed 1 October 2020, https://www.facebook.com/formedia/blog/working-to-stop-misinformation-an....
[xxxvi] “Massive Networks of Fake Accounts Found on Twitter,” BBC, 24 January 2017, accessed 1 October 2020, https://www.bbc.com/news/technology-38724082.
[xxxvii] Government of the United States of America, “Government Responses to Disinformation on Social Media Platforms: Comparative Summary,” Library of Congress, 24 July 2020, accessed 1 October 2020. https://www.loc.gov/law/help/social-media-disinformation/compsum.php.
[xxxviii] Stengel, Information Wars: How We Lost.
[xxxix] Jonathan Wareham, “Should Social Media Platforms Be Regulated?” Forbes, 10 February 2020, accessed 29 October 2020, https://www.forbes.com/sites/esade/2020/02/10/should-social-media-platfo....
[xl] NATO, “NATO Strategic Communications Center of Excellence.”
Image credit: https://www.nato.int/cps/en/natohq/177273.htm