AI-generated robocall impersonates Biden in an apparent attempt to suppress votes in New Hampshire

By Ali Swenson and Will Weissert, The Associated Press
(AP Photo/Evan Vucci)

The New Hampshire attorney general’s office on Monday said it was investigating reports of an apparent robocall that used artificial intelligence to mimic President Joe Biden’s voice and discourage voters in the state from coming to the polls during Tuesday’s primary election.

Attorney General John Formella said the recorded message, which was sent to multiple voters on Sunday, appears to be an illegal attempt to disrupt and suppress voting. He said voters “should disregard the contents of this message entirely.”

A recording of the call reviewed by The Associated Press features a voice that sounds like Biden’s and employs his often-used phrase, “What a bunch of malarkey.” It then tells the listener to “save your vote for the November election.”

“Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again,” the voice mimicking Biden says. “Your vote makes a difference in November, not this Tuesday.”

It is not true that voting in Tuesday’s primary precludes voters from casting a ballot in November’s general election. Biden is not campaigning in New Hampshire, and his name will not appear on Tuesday’s primary ballot because he elevated South Carolina to the lead-off position for the Democratic primaries. His allies, however, are running a write-in campaign for him in the state.

It’s not known who is behind the calls, though the caller ID falsely showed them as coming from the personal cellphone number of Kathy Sullivan, a former state Democratic Party chair who helps run Granite for America, a super PAC supporting the Biden write-in campaign.

Sullivan said she alerted law enforcement and filed a complaint with the attorney general after multiple voters in the state reported receiving the call Sunday night.

“This call links back to my personal cell phone number without my permission,” she said in a statement. “It is outright election interference, and clearly an attempt to harass me and other New Hampshire voters who are planning to write in Joe Biden on Tuesday.”

It was unclear how many people received the call, but a spokesperson for Sullivan said she had heard from at least a dozen recipients. The attorney general’s office encouraged anyone who received the call to email the state Justice Department’s election law unit.

Gail Huntley, a 73-year-old Democrat in Hancock, New Hampshire, who plans to write in Biden’s name on Tuesday, said she received the call at about 6:25 p.m. on Sunday.

She instantly recognized the voice as belonging to Biden but quickly realized it was a scam because what he was saying didn’t make sense. Initially, she figured his words were taken out of context.

“I didn’t think about it at the time that it wasn’t his real voice. That’s how convincing it was,” she said, adding that she is appalled but not surprised that AI-generated fakes like this are spreading in her state.

White House press secretary Karine Jean-Pierre confirmed Monday that the call “was indeed fake and not recorded by the president.” Biden’s campaign manager, Julie Chavez Rodriguez, said in a statement that the campaign is “actively discussing additional actions to take immediately.”

“Spreading disinformation to suppress voting and deliberately undermine free and fair elections will not stand, and fighting back against any attempt to undermine our democracy will continue to be a top priority for this campaign,” she said.

The apparent attempt at voter suppression using rapidly advancing generative AI technology is one example of what experts warn will make 2024 a year of unprecedented election disinformation around the world.

Generative AI deepfakes already have appeared in campaign ads in the 2024 presidential race, and the technology has been misused to spread misinformation in multiple elections across the globe over the past year, from Slovakia to Indonesia and Taiwan.

“We have been concerned that generative AI would be weaponized in the upcoming election and we are seeing what is surely a sign of things to come,” said Hany Farid, an expert in digital forensics at the University of California, Berkeley, who reviewed the call recording and confirmed it is a relatively low-quality AI fake.

As AI technology improves, the federal government is still scrambling to address it. Congress has yet to pass legislation seeking to regulate the industry’s role in politics despite some bipartisan support. The Federal Election Commission is weighing public comments on a petition for it to regulate AI deepfakes in campaign ads.

Though the use of generative AI to influence elections is relatively new, “robocalls and dirty tricks go back a long ways,” said David Becker, a former U.S. Department of Justice attorney and election law expert who now leads the Center for Election Innovation and Research.

He said it’s hard to determine whether the main intent of the New Hampshire calls was to suppress voting or simply to “continue the process of getting Americans to untether themselves from fact and truth regarding our democracy.”

“They don’t need to convince us that what they’re saying, the lies they’re telling, are true,” he said. “They just need to convince us that there is no truth, that you can’t believe anything you’re told.”

Katie Dolan, a spokeswoman for the campaign of Rep. Dean Phillips of Minnesota, who is challenging Biden in the Democratic primary, said Phillips’ team was not involved and only found out about the deepfake attempt when a reporter called seeking comment.

“Any effort to discourage voters is disgraceful and an unacceptable affront to democracy,” Dolan said in a statement. “The potential use of AI to manipulate voters is deeply disturbing.”

The Trump campaign said it had nothing to do with the recording but declined further comment.
