
How to Protect Yourself (and Your Loved Ones) From AI Scam Calls

by Elijah

You answer a random phone call from a family member, and they breathlessly explain that they've been in a terrible car accident. They need you to send money right now or they'll go to jail. You can hear the desperation in their voice as they plead for an immediate cash transfer. While it certainly sounds like them, and the call is coming from their number, you get the feeling that something isn't right. So you hang up and call them back right away. When your relative answers, they say there was no car accident and they have no idea what you're talking about.

Congratulations, you’ve just successfully avoided an artificial intelligence scam call.

As generative AI tools become more capable, it is getting easier and cheaper for scammers to create fake but convincing audio of people's voices. These AI voice clones are trained on existing audio samples of human speech and can be adapted to imitate almost anyone. The latest models can even speak in numerous languages. OpenAI, the maker of ChatGPT, recently announced a new text-to-speech model that could further improve voice cloning and make it more widely accessible.

Bad actors are already using these AI cloning tools to trick victims into thinking they're talking to a loved one over the phone, when they're actually talking to a computer. While the threat of AI-powered scams can be frightening, you can stay safe by keeping these expert tips in mind the next time you receive an urgent, unexpected call.

Remember that AI audio is difficult to detect

It’s not just OpenAI; many tech startups are working to replicate near-perfect-sounding human speech, and recent progress has been rapid. “A few months ago, we would have given you tips on what to look out for, like pregnant pauses or some form of latency,” said Ben Colman, co-founder and CEO of Reality Defender. Like many aspects of generative AI over the past year, AI audio is now a far more convincing imitation of the real thing. Any safety strategy that relies on you audibly detecting oddities over the phone is outdated.

Hang up and call back

Security experts warn that it is quite easy for scammers to make a call appear to come from a legitimate phone number. “Often, scammers will spoof the number they are calling you from, making it appear as if the call is coming from that government agency or bank,” said Michael Jabbara, global head of fraud services at Visa. “You have to be proactive.” Whether it’s from your bank or a loved one, any time you get a call asking for money or personal information, ask to call back. Look up the number online or in your contacts and initiate the follow-up call yourself. You can also try messaging them through another verified line of communication, such as video chat or email.

Create a secret safe word

A popular security tip that multiple sources suggested was to come up with a safe word that only family members know about and that you can ask for over the phone. “You can even pre-negotiate with your loved ones a word or phrase they can use to prove who they really are if they find themselves in a coercive situation,” says Steve Grobman, chief technology officer at McAfee. While calling back or verifying through another means of communication is best, a safe word can be especially helpful for younger people or older family members who may otherwise be difficult to reach.

Or just ask what they had for dinner

What if you haven’t chosen a safe word and are trying to figure out whether a distressing phone call is real? Pause for a moment and ask a personal question. “It can even be as simple as asking a question that only a loved one knows the answer to,” says Grobman. “It could be, ‘Hey, I want to make sure this is really you. Can you remind me what we had for dinner last night?’” Make sure the question is specific enough that a scammer can’t guess the right answer.

Understand that any voice can be imitated

Deepfake audio clones aren’t reserved for celebrities and politicians, like the robocalls in New Hampshire that used AI tools to sound like Joe Biden and discourage people from going to the polls. “One misconception is, ‘It can’t happen to me. No one can clone my voice,’” said Rahul Sood, chief product officer at Pindrop, a security firm that identified the likely origin of the AI Biden audio. “What people don’t realize is that with just 5 to 10 seconds of your voice, from a TikTok you may have made or a YouTube video from your professional life, that content can easily be used to create your clone.” With the help of AI tools, even the outgoing voicemail message on your smartphone can be enough to replicate your voice.

Don’t give in to emotional appeals

Whether it’s a pig butchering scam or an AI phone call, experienced scammers can build your trust, create a sense of urgency, and prey on your weaknesses. “Be wary of any interaction where you feel a heightened sense of emotion, because the best scammers are not necessarily the most skilled technical hackers,” says Jabbara. “But they have a very good understanding of human behavior.” Taking a moment to think about the situation and refraining from acting impulsively could be the moment you avoid being scammed.
