A Georgia mother became the latest victim of a shocking AI phone scam that used a cloned voice of her 22-year-old daughter to claim she had been kidnapped and to demand a $50,000 ransom for her safe return.
So-called imposter scams, in which a scammer pretends to be someone else to steal money, are the most common type of scam in the US, according to the Federal Trade Commission.
Debbie Shelton Moore received a six-minute phone call from what she thought was her 22-year-old daughter Lauren, who lives apart from her.
'It just sounded so much like her. It was 100 percent believable,' Shelton Moore said. 'Enough to almost give me a heart attack from sheer panic.'
The scammers demanded money for the daughter's return, but Lauren was safe the entire time and had never been kidnapped.

Debbie Shelton Moore (pictured right) received what ended up being a six-minute phone call from what she thought was her 22-year-old daughter Lauren (pictured left), who lives apart from her.
DailyMail.com previously reported that scammers can impersonate a victim’s voice using just a three-second snippet of audio, often stolen from social media profiles.
It is then used to call a friend or family member to convince them that they are in trouble and urgently need money.
Shelton Moore initially thought Lauren had been in a car accident and was calling for help, until she heard three male voices.
'The man had said, "Your daughter has been kidnapped and we want $50,000." Then they had her crying, like, "Mom, Mom" in the background. It was her voice and that's what was driving me crazy,' she told 11Alive.
Shelton Moore grew more panicked when she checked the location of Lauren's phone and saw that it was on a highway.
'I [was] thinking she was in the back because he said, "We've got her in the back of the truck,"' she said.
Fortunately, her husband, who works in cyber security, overheard the conversation and sensed something was up. He FaceTimed Lauren, who confirmed that she was not in danger, revealing that Shelton Moore was being scammed.
'Everything was a bit of a blur because all I was thinking was, "How am I going to get my daughter? How in the world are we supposed to get the money?"'
They eventually called the county sheriff’s office and confirmed Lauren’s safety.


'My heart was pounding and I was shaking,' she said, recalling the moment she received the call. 'I'm shaking thinking about it right now.'
The scam is something that has affected a surprising number of Americans. One in four respondents to an April McAfee survey said they had some experience with an AI voice scam, and one in ten said they had been personally targeted.
'I'm very aware of scammers and scams and IRS scams and bogus jury duty,' Shelton Moore said. 'But of course, when you hear your child's voice, you're not going to think straight and you're going to panic.'
Police recommend that you and your family agree on a 'safe phrase' that can be used to verify a caller is really who they claim to be and not an AI-generated voice.
The rise of accessible and sophisticated artificial intelligence is making scams faster and easier to pull off, said Steve Grobman, McAfee’s chief technology officer.
“One of the things that’s most important to recognize with the advances in AI this year is that it’s very much about making these technologies available to a lot more people, including really enabling scale within the cyber-actor community,” Grobman said.
“Cybercriminals can use generative AI to fake voices and deepfakes in ways that used to require much more sophistication.”

Keep an eye out: AI technology is fueling an explosion in voice cloning scams, experts warn (file image)
Vice President Kamala Harris also told CEOs of major tech companies in May that they have an increasing moral responsibility to limit the harm to society from their AI products.
Vonny Gamot, Head of EMEA at McAfee, said: 'Advanced AI tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person's voice and trick a close contact into sending money.
'Artificial intelligence brings incredible opportunities, but with any technology there is always the possibility that it could be used maliciously in the wrong hands,' Gamot added.
“This is what we are seeing today with the access and ease of use of artificial intelligence tools that help cybercriminals scale their efforts in increasingly compelling ways.”