
Illustrations by iStock; Security Management

How Malicious Actors Carry Out Foreign Influence Attacks

The 2024 U.S. presidential election won’t be the first election where a foreign government attempts to exert overt or covert influence upon the policies and decisions of another nation. And it’s unlikely that it will be the last.

Within the current election landscape, these malign actions are aimed at deepening division and discord among U.S. voters, with the goal of undermining democratic political systems.

“Foreign malign influence operations are not new; however, technology developments have enabled actors to conduct operations while more effectively hiding their identities,” the U.S. Cybersecurity and Infrastructure Security Agency (CISA) reported in a recent guide on election security.

Who Is On the Attack

The U.S. Office of the Director of National Intelligence has identified three nation-states that continue to conduct influence operations to undermine confidence in U.S. democratic institutions:

  • Islamic Republic of Iran
  • People’s Republic of China (PRC)
  • Russian Federation

Their Goals

Nation-state actors use various tactics in their influence operations, but the aims are similar.

“Since at least 2016, we have seen foreign malign influence campaigns specifically promote messaging that undermines public confidence in the security and integrity of the American elections process and exacerbate partisan tensions,” CISA reported. The goals in these campaigns include:

  • Aggravating existing social divides
  • Increasing polarization
  • Pushing narratives that fit the nation-state’s objectives
  • Experimenting with generative AI to enable these efforts

How to Be an Influencer

Foreign influence campaigns may involve different methods, some of which are part of a coordinated effort.

“Each foreign actor uses influence operations in unique ways,” the CISA guide said. “Their efforts can use a mixture of overt and covert methods to spread information, engage with key groups, or sow division.”

While the most recent of these efforts focus on the 2024 U.S. elections, nation-state actors use the same tactics against democratic systems all over the world.

Disguising proxy media. Actors impersonate legitimate media outlets and local news sources to spread propaganda or fictitious content.

Example: In late 2022, a pro-Chinese influence operation was discovered promoting AI-generated video footage.

Voice cloning or deepfakes of public figures. The general public or a targeted individual is misled with a fabricated recording of a public official.

Example: WIRED reported that two days before Slovakia’s parliamentary elections in late 2023, a fabricated audio file of one of the political candidates was posted on Facebook. On the recording, the candidate, Michal Šimečka, appeared to be discussing how to rig the election. Although Šimečka and others denounced the audio as fake and fact-checking indicated it had been manipulated with AI, he lost the election to a candidate who had campaigned to withdraw the nation’s military support for neighboring Ukraine.

Cyber-enabled information operations. Foreign actors compromise an organization’s IT systems to detect and leak damaging private information.

Example: Nation-state actors working for the Russian Federation leaked documents concerning UK-U.S. trade before the UK’s 2019 general election, and the opposition party then used some of the information during the campaign.

Manufacturing false evidence of a security incident. In such instances, a malicious actor creates and spreads a fake report of a physical security or cybersecurity incident.

Example: The FBI and CISA issued an announcement on 12 September about the ongoing attempts to use voter registration information as evidence of a nonexistent cyberattack that compromised election infrastructure.

Paid influence. Nation-state actors pay online influencers, PR firms, or journalists to launder their messages for them.

Example: Pro-PRC actors paid YouTube accounts and influencers to counter criticism of China’s human rights record, according to the Australian Strategic Policy Institute (ASPI).

Leveraging social media platforms. Foreign nation-state actors use social media platforms to spread selective narratives to specific audiences.

Example: In 2022, Stanford University’s Internet Observatory found that suspected Russian actors used social media platforms to target right-wing U.S. audiences and spread political messages that would generate distrust in democratic systems and undermine support for Ukraine. 

 

Sara Mosqueda is associate editor for Security Management. You can connect with her by sending an email to [email protected] or via LinkedIn.
