The specter of a wave of disinformation generated with artificial intelligence (AI) was already hanging over the race for the White House, but recent fake calls imitating President Joe Biden's voice have deepened experts' fears.
“What a bunch of malarkey!” proclaims the robocall message that impersonates the president’s voice, digitally manipulated (a deepfake) to urge New Hampshire residents not to vote in the Democratic primary.
The calls prompted local authorities to quickly open an investigation in late January into a possible “illegal attempt to disrupt” the vote.
Researchers who specialize in disinformation fear a surge in such misuse during this crucial election year, given the many AI voice cloning tools now available.
Mainly because, in their view, these tools are cheap, easy to use and hard to trace.
“This is definitely the tip of the iceberg,” Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.
“We can assume that there will be many more deepfakes this election period,” he warns.
Voice cloning software developed by the startup ElevenLabs was used to create the fake Biden calls, according to an analysis by Pindrop.
“Political chaos”
The case comes as campaign teams in both camps hone their AI-powered tools to promote their messages, and as investors pour millions of dollars into voice cloning startups.
ElevenLabs did not respond to repeated requests from AFP. Its website offers a free version of its voice cloning tool “to instantly create natural voices in any language using AI.”
The usage recommendations state that it is permitted to clone the voice of political figures in a “humorous or parodic” manner that is “clearly recognizable as such” to the listener.
The manipulated Biden calls have exposed the potential for malicious actors to use artificial intelligence to discourage voters from going to the polls.
“The moment for political deepfakes has arrived,” notes Robert Weissman, president of the advocacy group Public Citizen.
“Lawmakers must act quickly to put protections in place, or we are heading for political chaos,” he told AFP. “The New Hampshire deepfake illustrates the many ways digital manipulation can sow confusion.”
November's presidential election is widely considered the first AI-era election in the United States. Experts worry about the technology's impact on trust in the democratic process, with voters struggling to distinguish fact from fiction.
“Election Integrity”
According to Tim Harper, an analyst at the Center for Democracy and Technology, audio manipulation generates the most fear because it represents the most “vulnerable” point.
“It's easy to clone a voice using AI, but difficult to detect,” he told AFP.
That is because AI voice software is spreading faster than the tools designed to detect fakes.
AI arrives in an already tense and deeply polarized political landscape, allowing anyone, for example, to claim that genuine information is based on “made up” facts, adds Wasim Khaled, chief executive of Blackbird.AI, an online disinformation detection platform.
The Chinese company ByteDance, owner of TikTok, recently unveiled StreamVoice, an AI tool that can convert a user's voice into any other voice in real time.
“ElevenLabs was used this time, but a different generative tool will more than likely be used in future attacks,” says Vijay Balasubramaniyan, who advocates for the creation of “defenses.”
Like other experts, he recommends building protections such as audio watermarks or digital signatures into this software, along with regulation restricting its use to verified users.
Even then, such controls risk becoming “really difficult and very expensive,” says analyst Tim Harper, who calls for greater “investment” in “election integrity” to counter the threat.