AI Startup ElevenLabs Suspends Account Responsible for Biden Audio Deepfake
Artificial intelligence (AI) startup ElevenLabs has suspended the user account responsible for creating an audio deepfake of US President Joe Biden. The deepfake featured Biden urging people not to vote in the New Hampshire primary and was made using ElevenLabs’ technology, according to Pindrop Security Inc., a voice-fraud detection company that analyzed the audio.
After being made aware of Pindrop’s findings, ElevenLabs launched an investigation and ultimately traced the deepfake to its creator. The user’s account was subsequently suspended. ElevenLabs, which specializes in replicating voices in over two dozen languages using AI software, has declined to comment on the matter.
Earlier this week, the startup announced an $80 million financing round from investors including Andreessen Horowitz and Sequoia Capital, valuing the company at $1.1 billion. ElevenLabs CEO Mati Staniszewski emphasized in an interview last week that the company removes audio that impersonates voices without permission. However, the company allows voice clones of public figures such as politicians if the clips are clearly intended as parody or mockery.
The deepfake robocall, in which Biden urges people to save their votes for the US elections in November, has raised concerns among disinformation experts and election officials. It highlights not only the ease of creating audio deepfakes but also the potential for bad actors to use the technology to discourage voter participation.
The New Hampshire Attorney General’s office has launched an investigation into the unlawful attempt to disrupt the New Hampshire Presidential Primary Election and suppress voters. It is unclear whether ElevenLabs, which requires a credit card to access its voice-cloning features, has shared information about the user’s account with the authorities.
Bloomberg News received a copy of the deepfake audio and attempted to determine the technology used to create it. While ElevenLabs’ own speech classifier tool indicated only a 2% likelihood that the clip was synthetic or made with its technology, other deepfake-detection tools confirmed it was a deepfake but could not identify the specific technology behind it. Pindrop’s researchers, after cleaning and analyzing the audio, concluded that it was likely created using ElevenLabs’ technology.
Voice-cloning technology can be deployed at scale and personalized, making it possible to fool people into believing they are hearing local politicians or high-ranking officials. Tech investors are pouring money into AI startups developing synthetic voices, videos, and images to transform the media and gaming industries.
In the wake of the Biden deepfake, experts predict that similar incidents will occur during the upcoming general election. The issue underscores the need for awareness and action to combat the spread of disinformation.
By suspending the account responsible, ElevenLabs has signaled a commitment to addressing misuse and preventing its technology from being employed in harmful ways. As the risks associated with deepfakes become increasingly apparent, the need for proactive safeguards continues to grow.