
Illustration by iStock; Security Management

In a Vital Election Year, Disinformation Woes Top the Chart of Global Risks

Risk specialists surveyed by the World Economic Forum (WEF) are feeling rather pessimistic about the near future, with 30 percent of global experts foreseeing an increased likelihood of global catastrophes in the next two years. In addition, nearly two-thirds of the specialists surveyed said they expect catastrophes within the next decade.

The Global Risks Report 2024—the latest in a series of annual reports from the WEF—noted a dramatic rise in concern about polarization and misinformation from its 2023 report.

“Emerging as the most severe global risk anticipated over the next two years, foreign and domestic actors alike will leverage misinformation and disinformation to further widen societal and political divides,” the report said. “As close to three billion people are expected to head to the electoral polls across several economies—including Bangladesh, India, Indonesia, Mexico, Pakistan, the United Kingdom, and the United States—over the next two years, the widespread use of misinformation and disinformation, and tools to disseminate it, may undermine the legitimacy of newly elected governments. Resulting unrest could range from violent protests and hate crimes to civil confrontation and terrorism.”

Globally, 2024 holds at least 83 elections—the largest concentration for at least the next 24 years. By some estimates, more than 4 billion people could potentially cast ballots this year, The New York Times reported.

Taiwan is one of the first nations to head to the polls, and it is already facing widespread disinformation and influence campaigns from China, including the alleged use of deepfake videos and online conspiracy theories. According to The Guardian, “Election disinformation has amplified furores over egg shortages, Taiwan’s submarine production, political and sex scandals, and Taiwan’s readiness for war, fueling fears over conscription and young people being forced to fight, as well as casting doubt over the U.S.’s support. Authorities have also warned the public about ‘electoral misinformation’ including claims of video surveillance inside voting booths, and hidden ballot boxes.”

Meanwhile, the United States is gearing up for a contentious presidential election this year. The FBI released a summary of how its divisions and partners are preparing to protect voting rights and prosecute election crimes, including foreign intervention, malign influence efforts, covert disinformation campaigns, and international or domestic terrorism that could affect election security.

Beyond elections, though, the WEF warns that perceptions of reality are likely to be more polarized by mis- and disinformation, affecting the public discourse on public health and social justice, tempting governments to censor information, and increasing the risk of domestic propaganda.

“In response to mis- and disinformation, governments could be increasingly empowered to control information based on what they determine to be ‘true,’” the report added. “Freedoms relating to the Internet, press, and access to wider sources of information that are already in decline risk descending into broader repression of information flows across a wider set of countries.”

Misinformation and disinformation were linked closely to a number of other high-profile, near-term risks in the Global Risks Report, including societal polarization, which can lead to interstate violence, terrorist attacks, interstate armed conflict, and the erosion of human rights.

Technology also plays a major role in the spread of mis- and disinformation worldwide, including during election cycles. While politically focused chatbots could be used to inform voters about key issues, artificial intelligence (AI) is also being used to spread disinformation and lend credence to conspiracy theories, the Times reported.

In an October letter, Michigan Secretary of State Jocelyn Benson wrote that “AI-generated content may supercharge the believability of highly localized misinformation,” such as using AI tools to mislead voters about wait times, closures, or violence at polling locations.

Because AI tools are becoming easier to use, they are more accessible to a wide variety of people who can quickly create professional-looking images and communications. These AI models “have already enabled an explosion in falsified information and so-called ‘synthetic’ content, from sophisticated voice cloning to counterfeit websites,” the WEF report explained. AI systems’ hallucinated content may also create unintentional misinformation.

Although some governments are taking action—including China, where AI-generated content must be watermarked—“the speed and effectiveness of regulation is unlikely to match the pace of development,” the WEF said.

“The difference between AI- and human-generated content is becoming more difficult to discern, not only for digitally literate individuals, but also for detection mechanisms,” the report noted.

Synthetic content and falsified information will manipulate individuals, damage economies, and fracture societies in the near future, the WEF predicts. Criminal groups will also likely exploit false information and synthetic content to manipulate stock markets, create deepfake pornography, and spread disinformation to their benefit.

Additionally, the availability and use of deepfake videos and AI-generated content weakens individuals’ sense that they can recognize the truth, undermining trust in information sources overall. The information doesn’t even need to be AI-generated to be treated as suspect today—given current levels of distrust in government and media as sources of false information, merely raising the question of whether something was fabricated may be enough to achieve a bad actor’s objectives, the WEF said.

In the face of this uncertainty and division, domestic actors could leverage related societal unease to seize more control over information sources, place stricter limits on Internet freedoms, and suppress dissenting voices, including journalists and political opponents.

“The export of authoritarian digital norms to a wider set of countries could create a vicious cycle: the risk of misinformation quickly descends into the widespread control of information which, in turn, leaves citizens vulnerable to political repression and domestic disinformation,” the report said.

So, what else did the Global Risks Report 2024 call out? See below for the top short-term and long-term risks cited in the report.

Top Short-Term Risks (2 Years)

  1. Misinformation and disinformation
  2. Extreme weather events
  3. Societal polarization
  4. Cyber insecurity
  5. Interstate armed conflict
  6. Lack of economic opportunity
  7. Inflation
  8. Involuntary migration
  9. Economic downturn
  10. Pollution

Top Long-Term Risks (10 Years)

  1. Extreme weather events
  2. Critical change to Earth systems
  3. Biodiversity loss and ecosystem collapse
  4. Natural resource shortages
  5. Misinformation and disinformation
  6. Adverse outcomes of AI technologies
  7. Involuntary migration
  8. Cyber insecurity
  9. Societal polarization
  10. Pollution
