Foreign Interference in Elections 2022 and 2024: What Should We Prepare For?
March 15, 2021
Post by Clint Watts and Rachel Chernaskey
The 2020 election played out far differently than its 2016 predecessor in the information space. Research into the spread of misinformation and disinformation online flourished following Russia’s 2016 interference attempts in cyberspace, on social media, and through overt propaganda. Simultaneously, the U.S. government, Silicon Valley, and the American mainstream media reassessed how to handle foreign interference in the information space targeting the 2020 presidential election, and much of this new research into the misinformation/disinformation problem informed recalibrated methods for combating such attacks.
But ultimately the largest disinformation peddlers in 2020 were Americans themselves, and research efforts, while insightful, focused largely on English-language, text-based content available through easily accessible APIs and missed key nodes of influence targeting Americans. The methods, mediums, and messengers pushing propaganda, misinformation, and disinformation have changed dramatically since 2016. Research examining the problem of malign influence, online manipulation, and the spread of misleading or false information must evolve, too. Having monitored and documented malign influence efforts in the last two presidential elections and the 2018 midterm elections, we offer a few focus areas for future research to further combat malign influence and prepare for a divided information world.
1. Detecting and assessing malign audio and video content
There has been much discussion of Russia’s use of Twitter bots and Facebook personas during the 2016 election. Yes, bots still exist and coordinated inauthentic campaigns still occur, but such efforts are propagated these days by a wide-ranging set of actors, and text-based content no longer carries the same weight in malign influence operations. Video- and audio-focused apps (TikTok, YouTube, Clubhouse, Snapchat, BitChute) have exploded in popularity, so much so that platforms traditionally focused on text, like Twitter and Facebook, have built video into their functionality and regularly see content from video-based platforms uploaded and embedded into posts. Both of these rising mediums, audio and video, are far more challenging to police and moderate than their text-based counterparts.
Moving forward, solutions for detecting and mitigating audio- and video-based manipulation are crucial, as both mediums have been increasingly used by malign influence actors. For example, Andrei Derkach, the Russian agent sanctioned for attempting to interfere in the 2020 election and denigrate former Vice President Biden, employed YouTube (the channel was eventually removed) as a central vehicle for posting the leaked audio files upon which his “NABU Leaks” information campaign rested. Clubhouse’s rapid rise as an audio-centered social media platform, for its part, has seen a slew of Kremlin-aligned influencers, some of them sanctioned by the United States for election interference efforts, join the platform. Live-streaming, a capability that allows damage to be done in real time, presents a separate, more challenging issue for platform moderation, as in the case of the Christchurch massacre in New Zealand. Combating malign influence requires that researchers and social media defenders move to where the most prolific manipulators reside; in 2021, that is audio and video.
2. China’s influence efforts: more finance than information
Amid the 2020 election, American officials frequently pointed to China as the biggest election interference threat. It was Russia, however, that remained the most prolific threat for election interference in the information space, with Iran taking the most antagonistic posture. For the Chinese Communist Party (CCP), there is little chance of winning over American voters to a presidential candidate in the way Russia interfered in Election 2016 or tried in Election 2020. While CCP initiatives have led to an expansion of state media targeting audiences abroad, China’s financial leverage remains a more viable method of influence than the information space. Chinese state-backed outlets have purchased substantial advertising space in U.S. newspapers, and last year the FCC ordered the end of Chinese programming broadcast into the U.S. via a Mexican radio station. Inauthentic Twitter bots and YouTube channels posting CCP-aligned content garner little interest from English-language audiences, and even those posted only a small amount of election-related content. China has other methods and means to meddle in U.S. elections.
Rather than evaluating China’s electoral interference through the lens of Russian active measures, research into China’s political influence in America should focus on the financial sector rather than the information space, and on Congressional races more than presidential contests. China’s political objectives in America center on policy, and Congress pulls the nation’s purse strings and authors the regulation that impacts trade, technology, and human rights, three issues of critical interest to the CCP. Political donors with ties to the CCP have made large contributions to U.S. campaigns and gained access to high-level U.S. officials in political leadership. Look here, at contributions to political campaigns and key leader meetings, to understand potential malign influence headed into the 2022 midterms.
3. Internet siloing and increased authoritarian control
Authoritarian regimes have long thought of the Internet as a threat to the control of their populations and the survival of their regimes. China’s CCP responded early to the rise of the Internet, building its own internal, controllable Internet environment and cutting off the free Internet via its Great Firewall. Other authoritarian governments, most prominently Russia but also others like Saudi Arabia and Pakistan, operate in a balancing act. They shut down or slow web or social media apps when necessary, but retain many Western platforms for now. In the case of Russia, the Kremlin seeks to use action against Western tech and social media platforms as leverage to shape strategic narratives—largely turning the problem on its head and alleging Western censorship, control or manipulation.
Separately, at home, Western social media companies continue facing pressure to moderate content related to extremism and electoral interference. These regulations and the resulting terms-of-service modifications will also change the methods by which manipulators seek to undermine the integrity of democracy. Future research into electoral interference should examine how the dynamics between American tech companies and foreign governments, along with restrictions and barriers to content delivery, shape perceptions, and how these modifications to the free flow of the Internet will produce innovative work-arounds by the most advanced actors determined to interfere in elections.
4. Languages other than English and platforms other than Facebook, Twitter & YouTube
Most efforts to thwart disinformation headed into last November’s election focused on English-language content. Meanwhile, Spanish-language misinformation and disinformation circulating online went largely unnoticed and unchecked, representing the biggest miss in the 2020 election. Separately, newly emergent social media applications, most notably Parler, offered alternative vectors for the spread of falsehoods. Smaller, less-policed platforms have since served as pathways for seeding conspiracies and hosting content that can then be shared on the larger social media platforms or in closed chats. Future Western electoral contests should expand their vulnerability assessments to anticipate how advanced manipulators will adapt to reach voters on less-moderated platforms or in closed-group discussions.
These recommendations for future research to stop malign influence in Western elections bring the FIE 2020 project to a close. Thanks to the support of the Foreign Policy Research Institute and the Democracy Fund, we have been able to study this problem in great depth and provide more than 65 analyses on the 2020 election and interference efforts stemming from Russia, Iran, and China. While state-backed media outlets, like those analyzed here in the FIE 2020 project, still provide a valuable avenue for examining narratives and disinformation stemming from foreign governments, we hope to see research on election interference expand and deepen on this timely and pressing issue.