According to a new federal report, the threat posed by artificial intelligence systems is genuine and presents a significant issue heading into the 2024 election.
The analysis, compiled by the Department of Homeland Security and obtained by ABC News, outlines how, with less than six months until Election Day, next-generation technologies designed to drive innovation also create opportunities for abuse, potentially jeopardizing elections, the foundation of the democratic system.
“As the 2024 election cycle progresses, generative AI tools likely provide both domestic and foreign threat actors with enhanced opportunities for interference by aggravating emergent events, disrupting election processes, or attacking election infrastructure,” according to the document released on May 17. According to the bulletin, these tools can be used to “influence and sow discord” in approaching US elections by individuals who see them as “attractive” and “priority” targets.
“This isn’t an issue for the future. This is a problem of today,” said John Cohen, a former intelligence chief at the Department of Homeland Security and current ABC News contributor. “Foreign and domestic threat actors have fully embraced the internet, and they are increasingly using advanced computing capabilities like artificial intelligence to conduct their illegal operations.”
Those aiming to target elections have already done so “by conducting cyber-enabled hack-and-leak campaigns, voice spoofing, online disinformation campaigns, and threatening or plotting attacks against symbols of US elections,” according to the advisory.
And now, as the report warns, generative AI’s new capabilities can be exploited to disrupt future elections. These tools can be used “to confuse or overwhelm voters and election staff to disrupt their duties” by creating or sharing “altered” or deepfaked pictures, videos, or audio clips “regarding the details of Election Day, such as claiming that a polling station is closed or that polling times have changed, or to generate or promote other tailored false information online.”
On the eve of the New Hampshire primary in January, a robocall that appeared to impersonate President Joe Biden’s voice circulated, encouraging recipients to “save your vote” for the November general election rather than participating in the state’s primary, according to audio obtained by ABC News at the time.
The “generative AI-created audio message” was notably mentioned in the DHS research, which also stated that “the timing of election-specific AI-generated media can be just as critical as the content itself, as it may take time to counter-message or debunk the false content permeating online.”
“This may be one of the most difficult elections for Americans to navigate finding ground truth in our lifetimes,” said Elizabeth Neumann, a former DHS assistant secretary and current ABC News contributor. “It’s not just whether a politician is telling you the truth; you won’t even be able to trust your own eyes at the images you’re seeing in your social media feeds, in your emails, and possibly even in traditional media if they don’t do a good enough job vetting the material.”
The 2024 election has been defined by increasingly caustic rhetoric and the intermixing of aggressive campaign trail hyperbole and courtroom theatrics, as Trump faces four criminal proceedings in which he claims innocence. Hate speech, misinformation, and disinformation are common on social media and in real life, even as fast-growing technology remains vulnerable, according to experts. Meanwhile, battles in the Middle East and Ukraine continue to polarize Americans’ views on foreign policy, with protests breaking out on major college campuses around the country.
“Threat actors can attempt to exploit deepfake videos, audio content, or other generative AI media to amplify discontent,” according to the DHS report. “A well-timed deepfake or piece of AI-generated media for a targeted audience could spur individuals to take action, which may result in violence or physical disruptions directed toward the elections or candidates.”
The threat landscape has become “more diverse and complex,” and preserving the integrity of US elections is more difficult than ever because of the increasing sophistication of artificial intelligence, senior intelligence officials told senators last Wednesday.
“Using every tool we have is critical as the challenge is expanding,” Director of National Intelligence Avril Haines told a Senate committee hearing on threats to the 2024 elections. “There are an increasing number of foreign actors, including non-state entities, who are looking to engage in election influence activities,” she stated. She further stated that “relevant emerging technologies, particularly generative AI and big data analytics, are increasing the threat by enabling the proliferation of influence actors who can conduct targeted campaigns.”
“Innovations in AI have enabled foreign influence actors to produce seemingly authentic and tailored messaging more efficiently and at greater scale,” Haines stated. “Even as the threat landscape becomes increasingly complicated, I believe that the United States government has never been better prepared to address the challenge,” thanks in part to lessons gained during the 2016 presidential election.
Authorities at all levels must be prepared to defend against AI-generated disinformation at this critical time, according to experts.
“One of the most critical things we need to do right now is educate and prepare the people. Because they will be the ones being targeted with this stuff, the intended target is the general public, and the goal is to affect how people behave,” Cohen explained.
“State and local officials must have a plan in place so that when this content is recognized, they can counteract and correct erroneous information using reputable sources of communication,” Cohen said, noting that once such content is published, it quickly propagates throughout the online media ecosystem and must be addressed immediately. “Law enforcement and the security community have been slow to respond to the continuously changing threat landscape. We are still utilizing yesterday’s techniques to deal with today’s threat. It’s like bringing a knife to a gunfight.”