
The 2020 United States presidential election was unprecedented in the implications it posed for the legitimacy of elections and for the future of democracy itself. Former president Donald Trump’s refusal to accept the election outcome formed the basis of the Big Lie conspiracy, which is rooted in the claim that the election was stolen from Trump and that Republicans who conceded that Joe Biden had won were involved in a plot to undermine their own party’s presidential candidate (Tindall, 2023). Consequently, a third of American citizens currently believe that the election outcome was fraudulent (Kamisar, 2023).
As the 2024 presidential election approaches, there are widespread concerns that the actions of Trump and his allies represented a ‘dress rehearsal’ for how future democratic processes could be subverted (Hasen, 2022: 283). Recent developments in the capabilities of artificial intelligence, including the refinement of deepfake technology, have opened up the potential for bad actors to sway electoral outcomes through the strategic publication of manipulated footage (de Ruiter, 2021: 1317). In addition, the rise of disinformation has had a considerable impact on the foundations of truth and democracy, notably through coordinated networks of targeted falsehoods on social media platforms (Graham and FitzGerald, 2023: 2). Significantly, state secrecy could have a particularly insidious effect on the democratic legitimacy of the 2024 election, not only at the national level through the covert erosion of democratic rights, but also in the form of foreign cyberattacks and influence campaigns (Tisler and Norden, 2023: 3).
To prevent further damage to democratic institutions, it is essential not only to reflect on the threats present in previous elections, but also to analyse the technological and ideological changes that have occurred since the 2020 presidential election and the resulting implications for each threat. This essay will therefore argue that the threefold threat of artificial intelligence, disinformation, and state secrecy is highly likely, to varying extents, to have a profound impact on the legitimacy of the 2024 US presidential election. As asserted by Bokat-Lindell (2021), 2024 could be the year that American democracy dies. While solutions exist to prevent this outcome, it is evident that urgent action needs to be taken to combat deceptive and secretive interference in electoral processes.
In an age of declining trust in mainstream news outlets, there has been wide debate over the detrimental impact that large-scale deception in the form of manipulated media could have on the legitimacy of democratic elections (Chesney and Citron, 2019: 1764). Deepfake technology, a form of artificial intelligence (AI) generated media, is quickly becoming more sophisticated. Machine-learning algorithms can now superimpose faces and voices onto video and audio recordings to the extent that they are essentially indistinguishable from authentic footage (Chesney and Citron, 2019: 1758). The subsequent erosion of trust in the electorate could further promote conspiracy theories, damage national security and impact the outcome of elections.
Notably, the 2024 presidential election will be the first in which deepfakes have the potential to significantly influence electoral results. By the 2018 and 2020 elections, and even in the most recent midterms, deepfakes had not yet reached the level of sophistication that would allow them to be utilised for political disinformation (Wong, 2022). However, as of 2024, AI-generated content is already playing a role in politics. For example, the Republican party recently used AI to create a campaign advert which imagined the chaos that would ensue in the next four years should Biden be re-elected (Benson, 2023). Although the footage shown in the advert was clearly fake, it raised ethical considerations about how AI should be regulated in election campaigns (Weiner and Norden, 2023). Moreover, in January 2024, voters in New Hampshire received a robocall engineered to sound like Biden which encouraged them to avoid casting their ballots in the state’s presidential primary (Han, 2024).
As this technology advances, its uses could become extremely subtle, allowing it to evade detection. For example, a technique already employed in the marketing and campaigning sectors to fabricate the appearance or absence of popular support, known as ‘crowd-turfing’, could become more sophisticated (de Ruiter, 2021: 1317). It could soon be possible to alter the expressions of people listening to a candidate making a speech in real-time, so that audience members could seem to have a more negative or positive reaction to what is being said. This could influence the perceptions of viewers on a subconscious level, impacting the subtle cues that voters use to cultivate an overall impression of a presidential candidate (de Ruiter, 2021: 1317).
According to Mitchell and Thompson (1986: 325), ‘the distortion of information, as in deceit, and the suppression of information, as in secrecy, are counterparts in any system’. It is therefore evident that the damaging impact of deepfake technology on democracy can only be combated through lifting this veil of secrecy and implementing fast-acting systems that can inform viewers about the origins and veracity of deepfake footage. However, initiatives to this effect could prove challenging to implement. Due to US electoral legislation, the origins of dark money poured into election campaigns are near-impossible to identify, meaning that should a portion of this be used to finance deepfakes aimed at undermining election candidates, the sources and development of this material would likely remain unknown (Painter, 2023: 1-2).
It is also probable that foreign governments as well as non-state actors engaged in cyberterrorism will produce their own deepfakes in order to influence the outcome of the 2024 election (de Ruiter, 2021: 1317). This could have implications in the realm of international relations by straining relationships between the US and other states, for instance in the event that a deepfake is released which incites armed conflict between China and an Indo-Pacific ally of the US such as Taiwan. In domestic politics, this technology could reduce the electorate’s trust in the veracity of real campaign footage, further undermining confidence in the legitimacy of democratic institutions. Further, in the case that a fabricated video is released showing a leading candidate engaging in immoral or criminal behaviour just before the polls open, deepfakes could directly affect electoral processes. By the time that the inauthenticity of the footage is verified and this is communicated to the nation, citizens may have already cast their votes (de Ruiter, 2021: 1317).
However, the impact of deepfakes on electoral outcomes could be limited by new regulations. States as ideologically disparate as California, Minnesota, Texas and Washington have enacted new laws in recent years prohibiting or imposing restrictions on deepfakes and other misleading media in election advertisements and political messages (Weiner and Norden, 2023). In February 2024, large tech companies including Microsoft, Meta, Google, Amazon, X, OpenAI and TikTok announced an agreement to combat the interference of AI in the 2024 elections, which, while light on details such as how the measures would be enforced, signalled the formation of a unified front against this threat (Bond and Parks, 2024). Moreover, studies have shown that deepfake technology does not influence the public any more than other types of disinformation in the form of textual headlines or audio recordings (Barari et al., 2021). Particularly if concerted efforts are launched to improve media literacy and awareness of deepfakes, the harmful effects of this technology could be somewhat limited. However, the underlying reasons for belief in deepfakes, such as conspiracy theories and a lack of state transparency, are deeply rooted in US politics and present a formidable challenge to address and counteract effectively.
Discussions around the impact of disinformation on the legitimacy of elections have been ongoing for several years, peaking in the aftermath of the 2016 presidential election in which a significant number of news stories were discovered to be false (Hameleers et al., 2020: 281). The weakening of public faith in empirical evidence has reduced much democratic discourse concerning national and global problems to first-order questions (Chesney and Citron, 2019: 1778). Significantly, the US electorate is facing increasing difficulties in accessing truthful information in the midst of the rapid propagation of so-called fake news. As Zimmermann and Kohring (2020: 216) highlight, this phenomenon is better defined as disnews, given the popularisation of the former term’s use as a label to discredit traditional news media. Disnews can be spread directly by politicians, news media and citizens. However, as political tribalism has risen in the US, news organisations have additionally capitalised on the popular demand of their readership by prioritising controversy and sensational news values over accuracy (Hameleers et al., 2020: 284).
The absence of journalistic gatekeepers on social media has notably emboldened individuals and foreign actors to propagate disinformation through media channels. The Mueller Report details the ease with which Russian intelligence agents were able to set up social media accounts for the purpose of spreading disinformation and interfering in the 2016 election (Painter, 2023: 6). Similarly, a social media influencer was convicted of conspiracy against rights following his attempt to limit the turnout of Hillary Clinton voters (Office of Public Affairs, 2023). After the influencer posted a graphic encouraging Twitter users to cast their votes through text, at least 4,900 messages were sent to a fake telephone number on and before Election Day.
Moreover, since Elon Musk acquired X (formerly known as Twitter) in 2022, disnews and conspiracy theories have flourished on one of the most popular online sources of political information. This is exemplified by the analysis of the platform carried out by Graham and FitzGerald (2023: 2) during the first Republican primary debate and the simultaneous airing of Tucker Carlson’s interview with Trump on 23 August 2023, which revealed that a coordinated network of over 1,200 accounts promoting false and misleading narratives about election fraud received over three million impressions. The loaded questions that Carlson posed to Trump were amplified by this network, revolving around the central narrative that the ongoing legal actions and indictments against Trump are part of a broader agenda orchestrated by the Democrats and the ‘deep state’. The strengthening of this narrative is highly likely to undermine the legitimacy of the 2024 election, particularly considering that 78% of Republican voters believe that the 2020 election was stolen (Bokat-Lindell, 2021). This conviction has ramifications not only for voting behaviour but also for the safe execution of electoral procedures; one in five election officials have cited threats to their lives as a job-related concern.
Bok (1982: 5-6) highlights that secrecy concerns the intentional concealment of information which ‘pre-supposes separation, a setting apart of […] keepers of a secret from those excluded.’ This demonstrates that secrecy and conspiracy are inextricably linked, as conspiracists who feel that they are ‘in on the secret’ form tight-knit communities that align themselves against what they view as a culture of deception among elites and adopt alternative ways of viewing the world in a post-truth era (Fenster, 1999: 54). This, in turn, bolsters tribalism, creating significant challenges in conveying legitimate information to conspiracists. Further, the responsibility for the rise of conspiracy theories lies just as much with the actions of elected officials as with the rapid spread of disinformation. Public trust in the US government has been severely eroded by decades of clandestine activity, with covert operations such as MK-ULTRA serving to remind the electorate that conspiracy theories can prove to be legitimate (Wilson, 2009: 2).
The strengthening of powerful conspiracy theories has directly led to democratic backsliding through the erosion of trust in governmental institutions. One such conspiracy is known as QAnon, a multifaceted theory which is based on the involvement of ‘deep state’ Democrats and other public figures in a ‘worldwide satanic domestic minor sex trafficking ring’ (Erickson et al., 2022). Studies have shown that the prevalence of social media, the abundance of disinformation sources and the accessibility of the internet collectively boosted the expansion of the QAnon campaign to an unprecedented threat level, allowing it to enter the mainstream. Almost half of Americans had come across the conspiracy by the end of 2020, with nearly 14% identifying themselves as ‘QAnon believers’, a figure that rose to 17% in October 2021 (Erickson et al., 2022; Smith, 2022). This data reflects that the QAnon movement has only strengthened since Trump left the White House, raising concerns about the impact this could have on the upcoming election.
Seven in ten followers of the conspiracy also believe that the 2020 election was stolen from Trump, linking to another major conspiracy known as the Big Lie (Smith, 2022). The proliferation of the Big Lie theory, most notably by Trump himself, directly resulted in the 6 January 2021 attack on the US Capitol, in which hundreds of Trump supporters infiltrated the building in order to prevent Congress from formalising the election result (Tisler and Norden, 2023: 3). This event is symbolic of the power that the systematic disruption of democratic processes and institutions by bad actors can have on the legitimacy of elections (Zimmermann and Kohring, 2020: 216). When considering the 2024 election, it is clear that combatting disinformation is fundamental to ensuring the integrity of democratic processes.
Following Trump’s refusal to concede to Biden in 2020, threats to key democratic rights have increased, jeopardising the legitimacy of the upcoming election. These threats are made even more significant by the fact that democratic backsliding, in the form of the clandestine implementation of under-publicised laws that directly erode voting and protest rights, has received little coverage. In a conference organised by the Solutions Journalism Network (2023: 8.22), journalist Ari Berman discussed the rise of ‘a toxic combination of voter suppression, election subversion, gerrymandering and outright disregard for the will of the people’ in the US. This can be exemplified by the refusal of the Supreme Court to strike down a racially gerrymandered map in the state of Alabama until after an election had taken place in 2022. The state then simply ignored the Supreme Court’s ruling that it should redraw its congressional map (Levine and Witherspoon, 2023). This case demonstrates that US states, emboldened by the election-rigging rhetoric espoused by government officials, are willing to go to increasingly extreme lengths to circumvent democratic mandates.
Additionally, Republican allegations of widespread voter fraud have a well-established history, gaining momentum following the disputed 2000 election between George W. Bush and Al Gore. These claims served the purpose of providing a legitimate reason to pass voter identification legislation with the intent of rendering it more difficult for those likely to support the Democrats to register and vote, albeit with limited success (Hasen, 2022: 267). However, the post-2020 period has seen an unprecedented escalation in the delegitimisation of electoral processes, to the extent that in 2021 the US was added to the International Institute for Democracy and Electoral Assistance’s annual list of backsliding democracies (Agence France-Presse, 2021). The weakening of civil liberties and checks on government were significant factors behind its inclusion. This can be exemplified by actions taken to restrict freedom of association and assembly following the death of George Floyd in 2020 and the subsequent protests, as well as the laws enacted by legislatures under Republican control which have expanded state authority in overseeing election processes and have facilitated the potential overturning of results (Agence France-Presse, 2021; Bokat-Lindell, 2021).
Furthermore, 2022 saw election deniers running for office in secretary of state contests (Bokat-Lindell, 2021; Tisler and Norden, 2023). Some of them won or were appointed as election officials at a local level. The replacement of election workers with those who deny the legitimacy of democratic elections significantly increases the risk of the subversion of election results from the inside. In addition, AI is not only being used to create disinformation, but also to enhance government surveillance and censorship capabilities (Feldstein, 2019a). The way this technology is used is often shrouded in secrecy, leading to a ‘surveillance first, ask-permission-later system’ and, in the area of facial recognition, is beset by gender and racial biases (Feldstein, 2019b: 18-19). This poses sinister implications for the freedoms afforded to citizens in democratic societies.
According to Jones (2014: 56), transparency is not always a universally desirable quality of institutions, as ‘control over secrecy is a requirement of state legitimacy’. Some scholars even contend that elected officials have the ability to conceal information and actions just as easily in democracies as in non-democracies, as autocrats have smaller coalitions to hold them accountable (Carnegie, 2021: 214). However, others have argued that such behaviour is often illegitimate in truly democratic societies. For instance, Thompson (1999: 192) argues that state secrecy is only justified when the secrecy itself is subject to the consent of citizens and representatives. Mokrosinska (2020: 433) additionally highlights that ‘directives and policies that are meant to regulate citizens’ actions but, due to their secret character, cannot be communicated to them, cannot have authority over them.’ By viewing state actions such as gerrymandering and voter suppression through this lens, it is evident that attempts to manipulate voter behaviour via deceptive means do not have democratic legitimacy.
Systematic clandestine behaviour on the part of states, or ‘parapolitics’, is not a new phenomenon – indeed, the study of parapolitics has its intellectual foundations in the profound conservative-liberal suspicion of government that is particularly dominant in the US (Wilson, 2009: 1). However, the blatant subversion of democratic processes by elected officials in recent years represents an unprecedented departure from established norms. Notably, analysts have debated whether Trump was prevented from completely dismantling US democracy due to the strength of countervailing institutions, or because he did not possess sufficient skills to subvert them (Carothers and Press, 2022). It follows that questions are being raised about whether American democracy will be severely undermined by ‘lawyers in fine suits’ upholding authoritarian electoral restrictions.
Fundamentally, the 2024 presidential election is anticipated to be the most contentious in modern history, not least because Trump, one of the foremost contenders, is embroiled in legal proceedings (Graham and FitzGerald, 2023: 3). The threats outlined in this essay have acted as the driving forces behind the propagation of the Big Lie conspiracy and the bolstering of distrust in democratic institutions through the dissemination of election falsehoods. While artificial intelligence-generated media and disinformation represent highly significant threats, there are possible solutions to combat them, such as enhancing media literacy and implementing robust fact-checking mechanisms. Clandestine actions by democratic states, however, remain underreported, and the relationship between secrecy and democratic accountability is increasingly contentious. In the face of a growing lack of accountability on the part of elected officials, solutions remain less clear. Therefore, enhanced transparency of governmental institutions is urgently needed to uphold the legitimacy of electoral processes in 2024 and in future elections.
Agence France-Presse (2021) US added to list of ‘backsliding’ democracies for first time, The Guardian. Available at: https://www.theguardian.com/us-news/2021/nov/22/us-list-backsliding-democracies-civil-liberties-international.
Ari Berman and Natalia Contreras on covering threats to democracy with a solutions lens (2023) Solutions Journalism Network. 8 September. Available at: https://youtu.be/ORN3q2eVxdc?feature=shared.
Barari, S., Lucas, C. and Munger, K. (2021) Political deepfakes are as credible as other fake media and (sometimes) real media [Preprint]. doi:10.31219/osf.io/cdfh3.
Benson, T. (2023) Brace yourself for the 2024 Deepfake election, Wired. Available at: https://www.wired.com/story/chatgpt-generative-ai-deepfake-2024-us-presidential-election/.
Bellman, B. (1981) ‘The Paradox of Secrecy’, Human Studies, 4(1), pp.1-24. Available at: http://www.jstor.org/stable/20008785.
Bokat-Lindell, S. (2021) Will 2024 be the year American democracy dies?, The New York Times. Available at: https://www.nytimes.com/2021/09/30/opinion/american-democracy-2024.html.
Bok, S. (1982) ‘Approaches to secrecy’, in Secrets: On the Ethics of Concealment and Revelation. New York: Oxford University Press, pp. 3-14.
Bond, S. and Parks, M. (2024) Tech Giants pledge action against deceptive AI in elections, NPR. Available at: https://www.npr.org/2024/02/16/1232001889/ai-deepfakes-election-tech-accord.
Bristow, T. (2023) Keir Starmer suffers UK politics’ first deepfake moment. it won’t be the last, POLITICO. Available at: https://www.politico.eu/article/uk-keir-starmer-labour-party-deepfake-ai-politics-elections/.
Carnegie, A. (2021) ‘Secrecy in international relations and foreign policy’, Annual Review of Political Science, 24(1), pp. 213–233. doi:10.1146/annurev-polisci-041719-102430.
Carothers, T. and Press, B. (2022) Understanding and responding to Global Democratic backsliding, Carnegie Endowment for International Peace. Available at: https://carnegieendowment.org/2022/10/20/understanding-and-responding-to-global-democratic-backsliding-pub-88173.
Chesney, B. and Citron, D. (2019) ‘Deep Fakes’, California Law Review, 107(6), pp.1753-1820.
Costas, J. and Grey, C. (2016) Secrecy at Work: The Hidden Architecture of Organizational Life. Stanford, CA: Stanford Business Books, an imprint of Stanford University Press.
de Ruiter, A. (2021) ‘The distinct wrong of deepfakes’, Philosophy & Technology, 34(4), pp. 1311–1332. doi:10.1007/s13347-021-00459-2.
Erickson, T.B., Prakash, J. and Stoklosa, H. (2022) ‘Human trafficking and the growing malady of disinformation’, Frontiers in Public Health, 10. doi:10.3389/fpubh.2022.987159.
Fenster, M. (1999) Conspiracy Theories: Secrecy and Power in American Culture. Minneapolis: University of Minnesota Press. Available from: ProQuest Ebook Central.
Feldstein, S. (2019a) How artificial intelligence systems could threaten democracy, Carnegie Endowment for International Peace. Available at: https://carnegieendowment.org/2019/04/24/how-artificial-intelligence-systems-could-threaten-democracy-pub-78984.
Feldstein, S. (2019b) ‘Types of AI Surveillance’, The Global Expansion of AI Surveillance, pp.16-21. Available at: http://www.jstor.org/stable/resrep20995.8.
Graham, T. and FitzGerald, K.M. (2023) Bots, fake news and election conspiracies: Disinformation during the Republican primary debate and the Trump interview [Preprint]. doi:10.5204/rep.eprints.242533.
Hameleers, M. et al. (2020) ‘A picture paints a thousand lies? the effects and mechanisms of multimodal disinformation and rebuttals disseminated via social media’, Political Communication, 37(2), pp. 281–301. doi:10.1080/10584609.2019.1674979.
Han, J. (2024) New Hampshire is investigating a robocall that was made to sound like Biden, NPR. Available at: https://www.npr.org/2024/01/22/1226129926/nh-primary-biden-ai-robocall.
Hasen, R.L. (2021) My new draft paper: ‘identifying and minimizing the risk of election subversion and stolen elections in the contemporary United States’ #ELB, Election Law Blog. Available at: https://electionlawblog.org/?p=124686.
Hasen, R.L. (2022) ‘Identifying and Minimizing the Risk of Election Subversion and Stolen Elections in the Contemporary United States’, Harvard Law Review Forum, 135(6), pp. 265-[i].
Jones, G.M. (2014) ‘Secrecy’, Annual Review of Anthropology, 43, pp. 53–69. Available at: http://www.jstor.org/stable/43049562.
Kamisar, B. (2023) Almost a third of Americans still believe the 2020 election result was fraudulent, NBC News. Available at: https://www.nbcnews.com/meet-the-press/meetthepressblog/almost-third-americans-still-believe-2020-election-result-was-fraudule-rcna90145.
Lessons from Pro-Democracy and Pro-Solutions Newsrooms (2023) Solutions Journalism Network. 13 September. Available at: https://youtu.be/yknZHWCH7E4?feature=shared.
Levine, S. and Witherspoon, A. (2023) How Alabama is defying the Supreme Court to discriminate against black voters, The Guardian. Available at: https://www.theguardian.com/us-news/2023/sep/01/alabama-discrimination-black-voters-gerrymandering.
Mitchell, R.W. and Thompson, N.S. (1986) Deception: Perspectives on Human and Nonhuman Deceit. Albany: State University of New York Press.
Mokrosinska, D. (2020) ‘Why states have no right to privacy, but may be entitled to secrecy: A non-consequentialist defense of state secrecy’, Critical Review of International Social and Political Philosophy, 23(4), pp. 415–444. doi:10.1080/13698230.2018.1482097.
Painter, R.W. (2023) ‘Deepfake 2024: Will Citizens United and Artificial Intelligence Together Destroy Representative Democracy?’, SSRN Electronic Journal [Preprint]. doi:10.2139/ssrn.4558216.
Smith, D. (2022) Belief in QAnon has strengthened in US since Trump was voted out, study finds, The Guardian. Available at: https://www.theguardian.com/us-news/2022/feb/23/qanon-believers-increased-america-study-finds.
Social Media Influencer Sentenced for Election Interference in 2016 Presidential Race (2023) Office of Public Affairs. Available at: https://www.justice.gov/opa/pr/social-media-influencer-sentenced-election-interference-2016-presidential-race.
Thompson, D.F. (1999) ‘Democratic secrecy’, Political Science Quarterly, 114(2), pp. 181–193. doi:10.2307/2657736.
Tindall, A. (2023) What is the big lie?, Protect Democracy. Available at: https://protectdemocracy.org/work/what-is-the-big-lie/.
Tisler, D. and Norden, L. (2023) Securing the 2024 election, Brennan Center for Justice. Available at: https://www.brennancenter.org/our-work/policy-solutions/securing-2024-election.
Weiner, D. and Norden, L. (2023) Regulating AI deepfakes and synthetic media in the political arena, Brennan Center for Justice. Available at: https://www.brennancenter.org/our-work/research-reports/regulating-ai-deepfakes-and-synthetic-media-political-arena.
Wilson, E.M. (2009) Government of the shadows: Parapolitics and criminal sovereignty. New York: Pluto Press.
Wong, M. (2022) We haven’t seen the worst of fake news, The Atlantic. Available at: https://www.theatlantic.com/technology/archive/2022/12/deepfake-synthetic-media-technology-rise-disinformation/672519/.
Zimmermann, F. and Kohring, M. (2020) ‘Mistrust, disinforming news, and vote choice: A panel survey on the origins and consequences of believing disinformation in the 2017 German parliamentary election’, Political Communication, 37(2), pp. 215–237. doi:10.1080/10584609.2019.1686095.