New Hampshire residents received a strange call telling them to skip the primary election, and although it sounded like Joe Biden on the other end, it was an AI clone of his voice.
An anonymous scammer used the Eleven Labs app to replicate Biden’s voice in the attack last month. I tested the app to see how believable an AI cloned voice is.
The AI-generated voice tricked a friend into believing the message was actually from me.
‘Why did you send me a voice note?’ my friend responded to the message. ‘You usually just send emails, but it’s nice to hear from you!’
My father also admitted that the fake voice would have fooled him, and when my wife heard a short message she said, ‘Oh my God, I want to throw him off a bridge.’
I’ve heard of these types of apps before, but perhaps I had naively assumed that the clones would always have telltale signs and clues. With this voice, though, I’m 100 percent sure I could scam everyone from close family to friends and colleagues.
Using the Eleven Labs app requires a 10-minute audio recording of your voice, but the more you feed the AI, the more accurate it becomes.
The results captured everything about my tone and word choice, from the way I tend to say ‘umm’ and ‘aah’ between words to the way I raise my voice when I ask a question.
In the New Hampshire attack, the same app was used to tell residents: ‘Voting this Tuesday only empowers Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday,’ the victims heard over the phone.
The scary thing is that the recordings can be generated in real time, so you could easily have a conversation or carry out a fake message campaign like what happened last month.
For example, I could call my father and ask him to transfer money to me in an emergency.
What’s more, anyone can use the app against me to clone my voice and commit fraud under my identity.
For anyone with a large amount of public voice recordings, such as actors and politicians like President Biden, there is already enough voice data “in the wild” to create an eerily convincing clone.
Eleven Labs is just one of several apps that can do this (and it’s worth noting that it has a clever security feature before you can create one of the ‘Pro’ voices: you have to read a few words shown on the screen, like a Captcha, in your own voice).
But scams in which cloned voices are used to trick people are becoming ‘more and more frequent,’ said Adrianus Warmenhoven, a cybersecurity expert at NordVPN.
Research by cybersecurity firm McAfee found that nearly a quarter of respondents have experienced some type of AI voice scam, or know someone who has been attacked, and 78 percent lost money as a result.
Last year, elderly couple Ruth and Greg Card received a call from what sounded like their grandson, saying he was in jail and needed money, but the voice was an AI fake.
Microsoft also demonstrated a text-to-speech AI model that same year, which can synthesize anyone’s voice from a three-second audio sample.
Warmenhoven said the technology behind “cloned” voices is improving rapidly and is also falling in price, making it accessible to more scammers.
To access Eleven Labs’ ‘Professional’ voices, you must pay a monthly subscription of $10.
Other AI applications may have fewer protections, making it easier for criminals to commit fraud.
“A user’s vulnerability to this type of scam really depends on the number of voice samples that criminals can use for voice cloning,” Warmenhoven said.
‘The more they have, the more convincing voice clones they can make. Politicians, public figures and celebrities are therefore very vulnerable, as criminals can use recordings of events, media interviews, speeches, etc.’
He also warned that people who upload videos of themselves to social networks such as Instagram and TikTok could be at risk.
‘There is also a huge amount of video content that users voluntarily upload to social media. Therefore, the more publicly available videos users have on social media, the more vulnerable they are,’ Warmenhoven continued.
‘Be careful what you post on social media. Social media is the largest publicly available resource of voice samples for cybercriminals.
‘You should be concerned about what you post on social media and how it could affect your safety.’
He also said that scammers could try to collect voice data for cloning simply by having phone conversations with you.
‘Scammers aren’t always looking to extort money and data on the first call. Collecting enough voice samples for voice cloning could also be the goal of the call,’ Warmenhoven explained.
‘Once you know you are on a call with a scammer, hang up and don’t give them the opportunity to record your voice. The more you talk during the call, the more samples of your voice the criminals will have and the better the clones they will produce.’