CNN  — 

President Barack Obama’s 2008 election campaign has often been celebrated as the first to effectively use social media as a mobilization tool to capture the White House. In the 15 years since, the technology has gone from being a novel addition to a political campaign to shaping nearly every aspect of one.

Now, a transformative and largely untested technology looks set to revolutionize political campaigning: artificial intelligence. But the computer-generated content, which blurs the line between fact and fiction, is raising concerns ahead of the 2024 presidential election.

The Republican National Committee threw down the gauntlet last week when it released a 30-second advertisement responding to President Joe Biden’s official announcement that he would seek reelection in 2024.

The ad, uploaded to YouTube, imagined a dystopian United States after the reelection of the 46th president, presenting stark images of migrants flooding across the US border, a city on lockdown with soldiers on the streets, and Chinese jets raining bombs on Taiwan.

But none of the foreboding images in the video were real – they were all created using AI technology.

Last week, CNN showed the ad to potential voters in Washington, DC. While some were able to identify that the images in it were fake, others were not. After watching scenes of heavily armed military personnel patrolling the streets of San Francisco during a lockdown sparked by surging crime and a “fentanyl crisis,” one person CNN spoke to was left wondering if the imagined episode had actually happened.

Therein lies the problem, said Hany Farid, a digital forensic expert and professor at the University of California, Berkeley.

Imagined realities and deceptive ads are nothing new in political campaigns. Lyndon B. Johnson’s 1964 presidential campaign brought forth the so-called “Daisy Girl” attack ad, which imagined a nuclear apocalypse were his opponent Barry Goldwater to win.

But AI muddies the waters much further, said Farid.

“We enter this world where anything can be fake – any image, any audio, any video, any piece of text. Nothing has to be real,” he said. “We have what’s called a liar’s dividend, which is anybody can deny reality.”

Farid pointed to the infamous release of the “Access Hollywood” tape in the final weeks of the 2016 presidential campaign, in which former President Donald Trump bragged in graphic terms about being able to sexually assault women. The footage prompted one of the rare occasions on which Trump has apologized for his actions. But now, Farid said, Trump could more easily claim the audio was faked.

Imran Ahmed, CEO of the Center for Countering Digital Hate, told CNN he didn’t think the RNC’s use of AI to illustrate a dark vision of America’s future was particularly troubling. But he expressed concern it could help open the way for more nefarious uses of the technology, like making it appear as if a politician said or did something they really hadn’t.

“We need a mutual disarmament, a nonproliferation treaty, when it comes to the use of generative AI by political parties because it makes a mockery of our democratic elections,” Ahmed said.

But while some Democrats mocked the RNC for using AI to imagine an apocalyptic world in which Biden is reelected, there’s no indication the party will pledge to forgo the technology itself.

The breakneck pace of AI development has largely allowed its use to remain unregulated, but campaigns that do exploit the technology still must steer clear of some restrictions. Texas has a law on its books that places some limitations on the use of so-called deepfakes in the weeks leading up to an election.

Matthew Ferraro, a Washington-based cybersecurity lawyer who has been tracking how lawmakers are trying to catch up with the burgeoning technology, said time will tell whether these laws lead to any successful enforcement actions. But campaigns, he said, can for the most part avoid running afoul of such legislation by adding a disclaimer to content created with AI.

The RNC ad released last week included the small on-screen disclaimer, “Built entirely with AI imagery.” The label was faint, however, and some of the people CNN showed the video to did not spot it on their first watch.

AI optimists point out that there are positive use cases for political campaigns. During the 2020 elections in India, one candidate’s video message was translated into multiple languages and dialects in an attempt to reach more voters.

Campaigns are also using AI to sort through millions of data points in order to target voters more effectively.

“For an average donor, we know about 500 to 1,000 different things about you,” said Martin Kurucz, CEO of Sterling Data Company, which works with Democratic campaigns. “A lot of that is your political interests, your demographics, your income.”

Kurucz said to imagine a massive spreadsheet with millions of rows of voters and a thousand data points about each of them. “There is no human being that is able to synthesize” that information, he said, but AI can.