Mis- and Disinformation Wildfires: Stopping the Spread of Election Influence Campaigns


The 2020 U.S. presidential election was unprecedented in many ways, most of which are too political in nature for me to address. That said, while elections are inherently political, the tactics used by threat actors to influence election outcomes are nonpartisan: a risk to all government entities that support the election process, on both sides of the aisle. 

Adversaries may prefer that specific candidates win U.S. elections, but they are unlikely to hold much allegiance. The party that is helped in one election could be harmed in the next, because the real goal of election interference is not always to choose a leader for another nation. The true objective is just as likely to convince the populace that the election process itself is corrupted and untrustworthy. The target is not Democratic or Republican. The target is democracy.

Following the 2016 U.S. presidential election, ZeroFox anticipated the following for 2020:

  1. We expect Russia may modify its election interference tactics based on the attention toward social media mis-, dis-, and mal-information (MDM), focusing less on bots and fake personas and more on a traditional information operations (IO) campaign using a myriad of state-owned media properties.
  2. We believe other actors, foreign and domestic, will copy aspects of the Russian playbook in 2020 with a combination of disinformation and social-media amplification (bots and other fake personas). The addition of other actors will make it more difficult to attribute disinformation campaigns writ large.
  3. Russia and other disinformation actors will likely continue to capitalize on underground sites to spawn disinformation campaigns and conspiracy theories, making it difficult to find origins via social media.
  4. Governments may have difficulty fighting against disinformation, especially with the involvement of local actors, which can raise legal restrictions and accusations of free-speech suppression. This means private organizations, research organizations, and social networks must be involved and coordinated to help ensure voters have the accurate information necessary to make informed decisions.

Unfortunately, most of our assessments proved to be accurate, as validated in January 2021 by a declassified Intelligence Community Assessment (ICA) provided to the President, senior government officials, and members of Congress.

Unlike Other Tactics

Notably, the ICA concluded that there were “no indications that any foreign actor attempted to alter any technical aspect of the [2020] voting process…including voter registration, casting ballots, vote tabulation, or reporting results,” and that attempts to gain access to election infrastructure were not identified. What the assessment did conclude was that Russia and other foreign actors used influence campaigns to push out narratives meant to sway voters. This was their primary tactic for impacting the outcome of the 2020 election. 

The most important infrastructure there is, is our cognitive infrastructure.

CISA Director Jen Easterly, 2021 Tortoise Cyber Summit discussion

What makes influence campaigns such an effective and attractive tactic for foreign and domestic adversaries wanting to sway American voters? 

  • Influence campaigns are cheap, scalable, and don’t require access to secure systems.
  • Social media enables fast, wide dissemination that often obfuscates the original source.
  • Successful influence campaigns attack truth, and a large segment of America is unaccustomed to (or uninterested in) the principles of fact-checking and source validation.

Steps to Prepare for Malicious Influence Campaigns

In April 2022, a consortium of more than 50 countries formalized a declaration to combat disinformation and censorship in response to rising threats during the Russia-Ukraine conflict. According to one article, the declaration “… will advance a positive vision for digital technologies anchored by democratic values.” 

In the meantime, what steps can agencies take to prepare for the onslaught of domestic and foreign influence campaigns aiming to impact future elections? As the ICA concluded, infrastructure and access are not the primary targets. Instead, agency security teams need to place a higher priority on monitoring digital assets to identify what adversaries are planning and to shut down misinformation as soon as possible. As with a wildfire, stopping misinformation before it spreads can prevent exponential harm.
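The wildfire analogy suggests what early detection can look like in practice: flag a narrative whose posting rate suddenly outpaces its own baseline, the social-media equivalent of smoke before flames. The sketch below is purely illustrative and not part of any ZeroFox product; the function name and thresholds are invented, and it assumes only a list of timestamps for posts mentioning a tracked narrative.

```python
from datetime import datetime, timedelta

def narrative_spike(post_times, window_hours=1, baseline_hours=24, factor=5.0):
    """Toy early-warning check: return True if posts in the most recent
    window arrive at more than `factor` times the average hourly rate
    over the preceding baseline period (thresholds are illustrative)."""
    if not post_times:
        return False
    latest = max(post_times)
    window_start = latest - timedelta(hours=window_hours)
    baseline_start = latest - timedelta(hours=baseline_hours)
    # Count posts in the recent window vs. the older baseline period.
    recent = sum(1 for t in post_times if t >= window_start)
    baseline = sum(1 for t in post_times if baseline_start <= t < window_start)
    baseline_rate = baseline / max(baseline_hours - window_hours, 1)
    # Floor the baseline rate at 1/hour so a brand-new narrative
    # with near-zero history doesn't trip the alarm on a single post.
    return recent > factor * max(baseline_rate, 1.0)

# A steady trickle (one post per hour) should not alert; the same
# trickle plus a 20-post burst in the final hour should.
base = datetime(2024, 1, 1)
steady = [base + timedelta(hours=h) for h in range(24)]
burst = steady + [base + timedelta(hours=23)] * 20
print(narrative_spike(steady), narrative_spike(burst))  # False True
```

Real monitoring platforms layer on source attribution, bot detection, and cross-platform correlation, but the core idea is the same: measure spread velocity against a baseline and intervene while the "fire" is still small.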

Another critical component of combating influence campaigns is broader education and awareness. Returning to the wildfire analogy, a well-saturated field won’t easily ignite. The more that agencies promote awareness of misinformation – and educate citizens on the tactics being used to influence them, as a means of “saturating the field” of human targets – the greater the odds of minimizing the spread of dangerous falsehoods. 

Misinformation and Disinformation Examples

CISA’s Rumor Control Site is a great example of how the government built awareness around election misinformation/disinformation in 2020. As part of CISA’s efforts to help secure election infrastructure, the organization started the Rumor Control Site, “to debunk common misinformation and disinformation narratives and themes that relate broadly to the security of election infrastructure and related processes.” 

A “Rumor vs. Reality” entry from CISA’s Rumor Control Site addresses some of the most common misinformation narratives about elections.

Not only was the site a success in getting the facts out, but according to Jen Easterly, it was also helpful for state and local officials attempting to cut through the noise surrounding the election. 

Another good example of government efforts to raise citizen awareness is the Census Bureau asking citizens to report misinformation in advance of the 2020 Census. According to NextGov, “The bureau launched a website for dispelling common Census rumors and created a special email address where people can report misinformation and other malicious activities.” Educating citizens on the types of misleading narratives and then asking for help in reporting suspicious activity was one way that the Bureau “saturated the field.” 

Focusing on misinformation and influence campaigns as the most critical threat to elections does not mean other threats do not exist. Email phishing campaigns and compromised voter websites are other avenues adversaries use to influence the election process; they are just not as prevalent, easy, or (to date) effective as misinformation campaigns have proven to be. As we look ahead to elections in 2022, 2024, and beyond, focusing on combating misinformation campaigns will likely yield the greatest results in terms of protecting the integrity of our elections and the reputation of America as a model for a healthy representative democracy.

For more insights on how agencies can defend against damaging influence campaigns and the misinformation threat, watch our on-demand webinar Ripped from the Headlines.
