Photos circulated on social media earlier this summer showing former U.S. President Donald Trump hugging and even kissing Dr. Anthony Fauci. The images weren’t real, of course, and they weren’t the work of some prankster, either. The images, generated with the aid of artificial intelligence-powered “Deep Fake” technology, were shared online by Florida Governor Ron DeSantis’ rapid response team.
It was part of a campaign to criticize Trump for not firing Fauci, the former top U.S. infectious disease official who pushed for the Covid-19 restrictions at the height of the pandemic.
The use of Deep Fakes in the 2024 election is already seen as a major concern, and last month the Federal Election Commission began a process to potentially regulate such AI-generated content in political ads ahead of the vote. Advocates say this is necessary to safeguard voters from election disinformation.
The Real AI Threat
For years, there have been warnings about the danger of AI, and most critics have suggested the machines could take over in a scenario similar to science fiction films such as The Terminator or The Matrix, where they literally rise up and enslave humanity.
Yet, the clear and present danger could actually be AI used to deceive voters as we head into the next primary season.
“Deep Fakes are almost certain to influence the 2024 elections,” warned Dr. Craig Albert, professor of political science and graduate director of the Master of Arts in Intelligence and Security Studies at Augusta University.
“In fact, the U.S. Intelligence Community expected these types of social media influence operations to occur during the last major election cycle, 2022, but they did not occur to any substantial effect,” Albert noted.
However, the international community has already witnessed sophisticated Deep Fakes in the Russia-Ukraine War. Although the most sophisticated of these came from Ukraine, the Russian government almost certainly took notice and plans to employ the technology in the near future, Albert suggested.
“Based on their history of social media information warfare and how they have impacted U.S. elections generally over the past near decade, it is almost assured that the U.S. can expect to see this during the 2024 election cycle,” he added.
Too Much Trust in Social Media
The threat from AI-generated content is magnified because so many Americans now rely on social media as a primary news source. Videos from accounts that paid to be “verified” on platforms such as X (formerly Twitter) and Facebook can go viral quickly, and even when other users question the validity of content from otherwise unvetted sources, many will still believe it to be real.
It is made worse because there is so little trust in politicians today.
“The danger for the individuals is this practice can do a lot of damage to the image and trustworthiness of the person attacked and eventually there will be laws put in place that would more effectively penalize the practice,” suggested technology industry analyst Rob Enderle of the Enderle Group.
“Identity theft laws might apply now once attorneys start looking into how to mitigate this behavior,” Enderle continued. “It is one thing to accuse an opponent of doing something they didn’t do, but crafting false evidence to convince others they did it should be illegal, and the laws may have to be revised to more effectively deal with this bad behavior.”
Combating Deep Fakes
The political candidates—at all levels—shouldn’t wait for the FEC to act. To restore election integrity, there should be calls for anyone seeking office not to employ Deep Fakes or other manipulated videos and photos as a campaign tool.
“Beyond a doubt, all U.S. officials should agree to not engage in any social-media or cyber-enabled influence campaigns including Deep Fakes within the domestic sphere or for domestic consumption,” said Albert. “Candidates should not endorse propaganda within the U.S. to impact voting behavior or policy construction at all. Engaging in Deep Fake creation or construction would fit within this category and ought to be severely restricted for candidates and politicians for ethical and national security reasons.”
Yet, even if the candidates make such pledges, there will still be domestic and foreign operators who employ the technology. Political campaigns will likely be watching for such attacks, but voters will need to be vigilant as well. Much of this is actually pretty straightforward.
“One should never trust unverified, non-official sources of videos and sound bites,” added Albert. “These are all easy to fake, manipulate, and distort, and for candidate pages, easy to create cyber-personas that aren’t authentic. If videos, sound bites, or social media posts appear and seem to cause some form of emotional reaction in the public realm, that is a signal to be slow to judge the medium until it has been verified as authentic.”