DHS warns of election threats posed by artificial intelligence

One expert stated, “This is the problem of today.”

A new federal assessment says the threat that artificial-intelligence programs pose to the 2024 elections is real and serious.

The analysis, compiled by the Department of Homeland Security and obtained by ABC News, outlines how, less than six months from Election Day, next-generation technologies designed to drive innovation also present opportunities for misuse that could threaten elections, the foundation of the democratic system.

The May 17 document said that “as the 2024 cycle advances, generative AI will likely give both domestic and international threat actors enhanced opportunities to interfere by aggravating emerging events, disrupting electoral processes, or attacking the election infrastructure.” The bulletin stated that those tools could be used to “influence” and “sow discord” during the upcoming U.S. elections by those who see them as “attractive and priority” targets.

This is not an issue for the future, said John Cohen, former head of intelligence at the Department of Homeland Security and an ABC News contributor; it is a problem now. “Foreign and domestic threat actors have fully embraced the internet and are increasingly using advanced computing capabilities, such as artificial intelligence, to conduct their illegal activities.”

Those seeking to influence elections have already done so by “conducting cyber-enabled hacking and leak campaigns, voice spoofing campaigns, online disinformation campaigns, and threatening, or plotting, attacks against symbols of US elections,” the bulletin stated.

The analysis warns that the capabilities of generative AI could be exploited against future elections to “confuse or overwhelm voters or election staff in order to disrupt their duties.” These tools can be abused to create “altered” images, videos, or audio clips “regarding details of Election Day,” such as claims that a polling place is closed or that voting times have changed.

ABC News obtained audio of a robocall impersonating the voice of Joe Biden that circulated on the eve of the New Hampshire primary in January. The call encouraged recipients to “save” their vote rather than take part in the primary.

The DHS analysis specifically noted the use of “generative AI-created sound messages.” It also stated that “the timing for election-specific AI generated media can be as important as the content, as it could take time to debunk or counter-message the false content spreading online.”

Elizabeth Neumann, an ABC News contributor and a former DHS assistant secretary during Donald Trump’s first year in office, said that voters won’t be able to trust the images in their social media feeds or emails, and may not even believe politicians, if the media does not vet the material.

Trump is facing four criminal cases, in which he maintains his innocence. The 2024 race has been marked by increasingly toxic language and the mingling of campaign-trail hyperbole with courtroom drama. Experts say that hate speech, misinformation, and disinformation are rampant on social media, as well as in the real world, amid rapidly evolving technology. While wars continue in the Middle East, Ukraine and elsewhere, Americans are divided on foreign policy, and the conflicts have reverberated across major U.S. college campuses.

The DHS analysis stated that “threat actors may attempt to use deepfake audio, video, or other AI-generated media to increase discontent.” A well-timed deepfake, or a piece of AI-generated media tailored to a specific audience, could prompt individuals to take actions that result in violence or disruption directed at the election or candidates.

Top intelligence officials warned lawmakers on Wednesday that the threat landscape is “more complex and diverse” and that protecting the integrity and fairness of U.S. elections faces greater challenges now than ever before because of artificial intelligence’s increasing sophistication.

Avril Haines, the Director of National Intelligence, told senators at a hearing on threats to the 2024 elections that “using every tool available is crucial as the challenges are expanding.” She said that “an increasing number of non-state actors are interested in engaging in election influence activities,” adding that “relevant new technologies, such as generative AI and big data analytics, are increasing the risk by enabling influence actors to conduct targeted campaigns.”

Haines continued, “AI innovations have allowed foreign influence agents to produce more tailored and authentic messages at a greater scale.” She added that, even though the threat landscape has become more complex, she believes the U.S. government is better equipped to deal with the challenges, in part because of lessons learned from the 2016 presidential election.

Experts said that authorities at all levels need to be prepared, at this sensitive moment, to combat fake news disseminated with the help of artificial intelligence.

“Educating and preparing the general public is one of the most critical things we can do right now. The public will be the target of this content, and the goal is to influence the behavior of the public,” Cohen explained.

State and local officials should have a plan so that, when they detect this content, they can use trusted channels of communication to correct and counter inaccurate information, Cohen said, because once the content is released it spreads quickly across online media and must be countered immediately. “Our security community and law enforcement have been slow in adapting to this rapidly changing threat environment. We are still using strategies from yesterday to combat a modern threat. It’s like bringing a sword to a gunfight.”