The Gray Scare: The new colors of Russia's covert cyber effort in the US elections & beyond
By Shannon Welch
The 2020 US presidential election is widely described as a pivotal moment for NATO. Its outcome will shape the US relationship with its NATO allies and, with it, the defense strategies of every partner country. Given what is at stake, pro-Russian actors have flourished in the run-up to the vote, heightening concerns of another large-scale misinformation campaign, an echo of the hack-and-dump operation that preceded the 2016 election.
On October 15, 2020, the United States Department of Justice charged six Russian intelligence officers with multiple counts of cyber espionage in response to an aggressive worldwide hacking campaign that caused mass disruption and billions of dollars in damage across NATO member and partner countries.[i] The campaign ran from 2015 until November 2019, affecting the French elections, Ukrainian infrastructure, Georgian government entities, and hospitals in over 20 countries, including the US.[ii]
The indictment echoes the hack-and-leak methods used against the US and other NATO countries ahead of the 2016 US presidential election. The additional rules social media companies have introduced since then have not blunted these techniques; instead, they have grown more malicious. As the DOJ stated, the five-year campaign was the “most disruptive and destructive series of computer attacks ever attributed to a single group”.[iii]
A standard Russian misinformation campaign seeks to disrupt foreign policymakers’ decisions by generating unrest and uncertainty within democracies. It does so by eroding confidence in the target country’s government, media, and democratic system,[iv] pandering to both sides of the political divide with far-right, far-left, and deliberately false “unbiased” content. This explosion of information overwhelms the population’s ability, and will, to distinguish between fact and fake news.[v]
A recent three-year study published in Human Communication Research[vi] demonstrates that both the extreme right and extreme left contribute to the spread of this misinformation. In a Facebook sample from the study, “those self-identified as extremely conservative—7 on a scale of 1 to 7—accounted for the most fake news shared, at 26%. In the Twitter sample, 32% of fake news shares came from those who scored a 7. But those who scored a 1, identifying as extremely liberal, also shared fake news frequently, accounting for 17.5% of shares on Facebook and 16.4% on Twitter.”[vii]
A 2018 RAND report[viii] sorted Russian misinformation methods into three categories: white (explicit) media, gray (uncertain) media, and black (covert) media. Each form targets a different level of media literacy and is used to influence a different slice of the population. These campaigns have been operating for years, but the uncertainty surrounding the US presidential election has given them fresh opportunities to spread misinformation.
Method One: White Media
White media is a form of information sharing in which state-sponsored outlets publish pro-Russia reports or misinformation about real events. It is called ‘explicit’ media because it can be traced directly back to the Russian government. White media spans all formats, including TV, radio, newspapers, and podcasts. Examples include the RT network, Radio Rossii, Vedomosti, and Rossiyskaya Gazeta.
These networks and outlets are often owned directly by the Kremlin or by private companies loyal to it. That ownership allows censorship throughout the organization over which stories are run and what language is used to report them, helping to “spin the story” in the government’s favor.
Russian networks like RT do this by catering to both the far left and the far right in US-centered news. They bring in ‘experts’ from the US, usually writers whose extreme views and bias have kept them from publishing on more reputable outlets. Anti-globalization advocates and hard-left socialists have been drawn to RT’s confrontational headlines, and the network has increased its output as interest has grown, pursuing what amounts to a new Cold War-style directive.[ix]
Since the Ukraine crisis began, Russian white media has reinforced the pro-Kremlin and xenophobic tone of its broadcasts with aggressive dismissals of and attacks on Western influence. Russia is a consistent target of condemnation from media freedom watchdogs in the US and Europe.
Russian journalists risk physical attack or even murder if they dig too deeply into sensitive topics such as corruption, organized crime, or human rights violations. According to data published by the Committee to Protect Journalists, an international watchdog group, 58 journalists have been killed in Russia since 1992.[x] On average, seven journalists go missing and seven are jailed there each year.[xi] This creates a climate of fear within the Russian journalism community, pushing reporters to keep producing stories that fit the Kremlin narrative, whether they agree with it or not, for their own safety.
Method Two: Gray Media
Gray media is the hardest to pin down with a single definition, as it constantly transforms and rebrands itself. It is a combination of conspiracy theory publishers, “thought leaders”, and data-dumping websites, and it is customarily spread via social media platforms.
Gray media matters because of its ability to shape content and deepen divisions within already polarized populations. This is often the “fake news” found on Facebook and Twitter, as the websites are deliberately formatted to look like reputable news sites. They may have famous faces attached to them, like former White House Chief Strategist Steve Bannon (Breitbart) or radio host Alex Jones (InfoWars), media figures with cult followings that grow as more false information spreads. Gray media has spiked during the Trump Administration, as President Trump and White House officials have repeatedly retweeted false information from both white and gray media sources.[xii]
Gray media also includes websites hosting dumped confidential information, a mixture of real and fake government documents, emails, or communications. These data-dumping sites rely on black media to create an aura of reputability; black and gray media work hand in hand to create and amplify confusion and mistrust in the target population. The 2016 leak of Hillary Clinton’s emails is the best-known example of gray and black media working together in a campaign targeting a presidential election.
Method Three: Black Media
Black media includes trolls, honeypots, social media aggressors, and hackers, and it is used to feed the formation of gray, and sometimes white, media. These operations can be carried out by individuals or by government entities; some individuals are even recruited as non-governmental agents to execute them. Black media is difficult to trace back to a single person or entity, because precautions such as proxies keep the users undisclosed. A prominent example is the group tracked as FANCY BEAR, also reported under the names Pawn Storm and Sofacy Group.[xiii]
The group best known to NATO is FANCY BEAR (also known as APT28), which has been operating since 2008 and poses a constant threat to a wide range of organizations and governments around the globe. It targets the aerospace, defense, energy, government, and media sectors, as well as dissidents, using sophisticated cross-platform implants.[xiv] FANCY BEAR’s widespread attacks against NATO allies consistently reflect the strategic interests of the Russian government and may indicate an association with the GRU, Russia’s primary military intelligence service.[xv]
Trolls, bots, and honeypots all refer to fake social media accounts used for malicious purposes. Troll and honeypot accounts are operated by a user or group of users, while bot accounts are programmed to create content automatically.[xvi] Trolls and bots are normally used to push specific narratives, while honeypots are used to solicit information and compromise accounts.[xvii] Hackers deface websites, execute denial-of-service (DoS) attacks, and extract confidential information.[xviii] They then use gray media’s data-dumping websites to share the stolen material and fuel controversy and confusion.
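As a concrete illustration of what separates a bot from an ordinary account, the sketch below shows one simple heuristic an analyst might use to score how bot-like an account’s behavior is, based on posting volume, account age, and repetitive content. The field names, thresholds, and example account are hypothetical assumptions, not any platform’s actual detection logic.

```python
# A minimal, hypothetical bot-likeness heuristic, for illustration only.
# The field names, thresholds, and example account are assumptions,
# not any platform's actual detection logic.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    age_days: int           # days since the account was created
    posts_per_day: float    # average posting rate
    duplicate_ratio: float  # share of posts that are near-identical re-shares

def bot_likeness(acct: Account) -> float:
    """Return a rough 0-1 score; higher means more bot-like behavior."""
    score = 0.0
    if acct.age_days < 30:           # very new accounts are more suspect
        score += 0.3
    if acct.posts_per_day > 50:      # sustained high-volume posting
        score += 0.4
    if acct.duplicate_ratio > 0.8:   # mostly copy-pasted content
        score += 0.3
    return min(score, 1.0)

if __name__ == "__main__":
    suspect = Account("newswatcher_83519", age_days=12,
                      posts_per_day=140, duplicate_ratio=0.92)
    print(f"{suspect.handle}: bot-likeness {bot_likeness(suspect):.2f}")  # prints 1.00
```

Real detection systems combine many more signals, but the underlying idea is the same: automated accounts betray themselves through patterns of behavior rather than the content of any single post.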
How can NATO help counter Russian misinformation in the US elections?
White Media
Countering each form of media will require a different approach. The most basic and important way to prevent the spread of white media is nationwide media literacy programs. Such programs have been appearing across the US and other NATO countries for the past few years, mostly run by non-profit efforts like Admongo, the Common Sense Campaign, and the Not So Fast Campaign. These programs target young internet users, ages 8 and up, teaching them to distinguish reputable from misleading information online.
NATO has publicly stated that it believes fact-based, credible media is the best way to counter disinformation, a position grounded in the Alliance’s core values of democracy, freedom of speech, and the rule of law.[xix] NATO is publishing media literacy documents in multiple languages, including Russian, in the hope of combating the problem on the Russian front as well. It does so in tandem with independent publishers, educating them on how Russia may try to infiltrate reputable news agencies.
Gray Media
Gray media will be the most complicated to address, as it often hides behind the justification of “free speech”. Free speech is a core value of all NATO countries, which makes it difficult to combat conspiracy theories and commentary pieces that target at-risk populations. NATO currently relies on social media platforms to distinguish between targeted misinformation and genuine exercises of free speech; the platforms address the issue through fact-checking and increased content monitoring.
Facebook, Twitter, YouTube, and Instagram each take a different approach to disinformation on their platforms, and all are regularly criticized for how they manage it. Each must strike a delicate balance between protecting free speech and protecting users from targeted misinformation. The platforms rely on algorithms designed to catch fake media, but these are not overwhelmingly successful, often flagging unrelated content or reputable news reporting.
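To see why these algorithmic filters misfire, consider a toy text classifier of the kind platforms build at much larger scale. The sketch below is a hypothetical illustration, assuming a tiny hand-labeled training set; it is not any platform’s real system, but it shows how surface vocabulary drives the score and why legitimate reporting can be flagged alongside fabricated stories.

```python
# A toy illustration of automated misinformation flagging, not any platform's real system.
# The training examples, labels, and headline are hypothetical assumptions; real classifiers
# are trained on far larger human-labeled corpora and still misfire in similar ways.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "SHOCKING leak PROVES the election was rigged, share before it gets deleted!",
    "Secret documents reveal the truth THEY do not want you to see",
    "Officials certify election results after routine audit",
    "Local hospital expands intensive care capacity ahead of winter",
]
train_labels = [1, 1, 0, 0]  # 1 = flag as likely misinformation, 0 = do not flag

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(train_texts, train_labels)

# A legitimate headline is scored on word overlap with the training examples
# ("documents" and "reveal" push it toward the flagged class), not on meaning --
# which is how reputable reporting can end up marked as suspect.
headline = "Leaked documents reveal audit of election results"
probability = classifier.predict_proba([headline])[0][1]
print(f"Probability of being flagged: {probability:.2f}")
```

Production systems add far richer features and human review, but the core weakness the sketch illustrates, scoring on surface features rather than intent, is why unrelated or reputable content keeps getting caught in the net.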
Another method used by social media companies is labeling potentially misleading information. Facebook, Instagram, Twitter, and YouTube have attached warnings to content they believe to be misinformation or to contain false claims. The label lets users judge for themselves whether the information is misleading, while still giving them the option to view it, and it points them to resources for comparing the disputed content against trusted news sources. While this approach has seen more success, it has still drawn criticism from First Amendment groups who argue it stifles free speech.
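To make the labeling approach concrete, the sketch below models a hypothetical warning attached to a disputed post: the content stays visible, and the label carries links the user can follow to compare the claim against trusted sources. The data model, warning text, and fact-check URL are illustrative assumptions, not any platform’s actual schema.

```python
# A minimal sketch of attaching a warning label to disputed content.
# The data model, warning text, and fact-check link are hypothetical assumptions,
# not any platform's actual schema or policy.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Post:
    post_id: str
    text: str
    warning: Optional[str] = None               # overlay shown before the content
    fact_check_links: List[str] = field(default_factory=list)

def apply_warning(post: Post, verdict: str, links: List[str]) -> Post:
    """Attach a label and context links without removing the underlying content."""
    post.warning = f"Independent fact-checkers rate this claim: {verdict}"
    post.fact_check_links = links
    return post

flagged = Post("post-001", "Leaked files prove the vote totals were altered")
apply_warning(flagged, "False", ["https://example.org/fact-check/vote-totals"])
print(flagged.warning)  # the original post stays viewable behind the label
```

The design choice worth noting is that the label supplements rather than removes the content, which is what distinguishes this approach from outright takedowns and why it draws less, though still some, free speech criticism.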
Black Media
NATO and the US have already established departments to handle hacking and other illicit cyber activity; the challenge will always be staying ahead of the problem. More funding and time will need to be dedicated to cyber defense capabilities, and to furthering STEM education across NATO states. There cannot be a weak link, because hackers will target the weakest government system as a way into the rest of NATO. It is NATO’s job to make sure that even its poorest members are educated and protected in the cyber realm.
Without a unified effort from all NATO member states, countering Russian disinformation will be impossible. Social media targeting campaigns are clever and damaging tools aimed at the foundations of democracy: they target vulnerable populations to deepen divides on issues distorted through a pro-Russia lens. With the US elections only days away, it will be critically important to continue these efforts and to learn from this election in order to prevent further manipulation in future elections across NATO states.
About the Author
Shannon Welch provides governmental and international affairs research support impacting areas of national security. Mrs. Welch has a graduate degree in National Security and Cyber Intelligence from Daniel Morgan Graduate School and an undergraduate degree in Global Conflict, with a minor in Coastal Biology.
Furthermore, Shannon Welch’s experience includes working with NGOs and private consulting firms on engagements involving the North Atlantic Treaty Organization (NATO), the Department of Justice, the Department of Defense, and The Heritage Foundation. She brings a wealth of knowledge to the digital targeting space, where she has supported multiple political campaigns, including US Senate, gubernatorial, and state and local races, focusing on counterterrorism, cyber, elections, and foreign policy.
Notes
[i] United States Department of Justice, 2020, Six Russian GRU Officers Charged in Connection with Worldwide Deployment of Destructive Malware and Other Disruptive Actions in Cyberspace, News Brief, Washington DC: United States Department Of Justice.
[ii] United States Department of Justice, 2020, Six Russian GRU Officers Charged in Connection with Worldwide Deployment of Destructive Malware and Other Disruptive Actions in Cyberspace.
[iii] FBI National Press Office, 2020, FBI Deputy Director David Bowdich’s Remarks at Press Conference Announcing Cyber-Related Indictment of Six Russian Intelligence Officers, Press Release, Washington DC: FBI National Press Office.
[iv] Todd Helmus, Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe (Research, Santa Monica: RAND, 2018).
[v] Ibid.
[vi] Toby Hopp, “Why Do People Share Ideologically Extreme, False, and Misleading Content on Social Media? A Self-Report and Trace Data–Based Analysis of Countermedia Content Dissemination on Facebook and Twitter,” Human Communication Research (Oxford University Press) 46, no. 4 (2020): 357–384.
[vii] Ibid.
[viii] Helmus, Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.
[ix] Casey Michel, “Putin’s Magnificent Messaging Machine,” Politico, 25 August 2015, accessed 26 October 2020, https://www.politico.com/magazine/story/2015/08/25/putin-rt-soviet-propa....
[x] Committee to Protect Journalists, 2020.
[xi] Committee to Protect Journalists, 2020.
[xii] Tommy Beer, “Trump Tweets Out Fake Story Criticizing Biden From Satirical News Site,” Forbes, 16 October 2020, accessed 26 October 2020. https://www.forbes.com/sites/tommybeer/2020/10/16/trump-tweets-out-fake-....
[xiii] FBI National Press Office, NSA and FBI Expose Russian Previously Undisclosed Malware Drovorub in Cybersecurity Advisory, 13 August 2020, accessed 26 October 2020.
[xiv] Brandon Valeriano, “Fancy bears and digital trolls: Cyber strategy with a Russian twist,” Journal of Strategic Studies, 2019.
[xv] Ibid.
[xvi] Andrew Weisburd, “Trolling for Trump: How Russia Is Trying to Destroy Our Democracy,” War on the Rocks, November 2016, accessed 26 October 2020, https://warontherocks.com/2016/11/trolling-for-trump-how-russia-is-trying-to.
[xvii] Ibid.
[xviii] Ibid.
[xix] NATO, 2020, NATO’s approach to countering disinformation: a focus on COVID-19.