Fake Biden robocall telling Democrats not to vote is likely an AI-generated deepfake
A recent robocall from a fake President Joe Biden telling New Hampshire residents not to vote was almost certainly created with artificial intelligence, according to disinformation experts and people who study the technology.
The call, which the New Hampshire attorney general’s office has described as an apparent “unlawful attempt” to discourage voters from writing in Biden’s name in the state’s Democratic presidential primary Tuesday, is of unknown origin. Experts say it appears to be a deepfake — fake audio or video created with AI and designed to mimic real people, usually without their knowledge or consent.
“All signs point to it being a deepfake,” said Ben Colman, the CEO of Reality Defender, a company that creates software to test media files to see whether they appear artificially generated.
“We never say anything’s 100% certain, because we do not have the ground truth, but it’s highly likely manipulated,” Colman said.
The voice on the robocall, first obtained by NBC News, sounds like Biden’s, though the cadence is clipped. It’s nearly impossible to pin down which AI program would have created the audio; programs that can create a moderately convincing replication of someone’s voice are widely available as phone apps and online services, usually free or for a small fee.