How does YOUR bank protect you from the rising wave of AI voice clone scams?

Scammers are using artificial intelligence technology to replicate the voices of victims’ friends or family and use them to extort money.

More than a quarter of people have already been targeted by these AI voice cloning scams at least once in the last year.

And now millions more could be at risk of falling victim, data shows.

Scammers use AI voice cloning techniques to trick their victims into sending money

It comes as the Payment Systems Regulator confirmed it would reduce the proposed fraud refund limit from £415,000 to £85,000, with the rules coming into force on October 7.

This is Money asked banks how they are combating the rising wave of AI voice cloning scams and what they are doing to protect customers.

Why are AI voice cloning scams a cause for concern?

An AI voice cloning scam is a sophisticated type of scam in which scammers use voice cloning technology to replicate a person’s voice from a short snippet of audio.

Scammers can easily and cheaply capture and create deepfake audio online in just a few minutes.

The audio clips used in AI voice cloning scams can be easily captured from a video that someone has uploaded to the Internet or social media, or even from a voicemail message.

Scammers are reported to need only three seconds of audio to clone a voice.

They can then identify the victim’s relatives and use the cloned voice in a phone call, voice message or voicemail, asking for money that is urgently needed, for example after an accident or to pay the rent.

What are banks doing to protect customers?

Rob Woods, fraud and identity specialist at LexisNexis Risk Solutions, said: “AI-powered deepfake scams, such as voice cloning, are a growing concern for UK banks, as they are an effective way to convince victims that they urgently need to transfer money to a friend or family member in need.”

The problem with these scams is that, by getting the victim to authorize the payment themselves, the scammers effectively bypass all of the strict authentication steps the bank puts in place to stop criminals stealing their money.

“The challenge for banks is understanding how to detect fraudulent transfer requests, versus those that are genuine,” Woods continues.

“There are a number of risk signals available that can help, such as AI-based behavioral biometrics that analyze how a phone is used, and live call detection. Banks use these types of signals to create risk models that help detect when fraud might be underway.”

While criminals are increasingly finding ways to use AI to scam people, banks are also using it to fight fraud and have been doing so for the past 20 years.

Woods said: “Three large banks that introduced AI models to tackle scams saw an average 260 per cent increase in detected fraud.”

LexisNexis Risk Solutions could not reveal which three banks have introduced artificial intelligence models to tackle scams.

Santander

Santander uses machine learning models, powered by a company called Lynx Tech, to fight card and payment fraud.

Lynx Tech’s platform uses AI to understand customer transactional behavior and detect fraud.

It says its system examines 66 billion transactions and protects 300 million customers from fraud each year.

A Santander spokesperson said: “We have been using AI for payment detection, behavior detection and a variety of other use cases for several years.

“Reported AI scams are quite difficult to identify, as customers are often unaware that AI has been involved in the scam.

“We have a comprehensive set of checks and balances in place to detect and prevent the use of AI voice cloning.”

Research from Santander found that more than half of Britons have not heard of the term deepfake or do not fully understand what it means.

Meanwhile, only 17 per cent of people are confident that they could easily identify a deepfake video.

Nationwide

One of the ways Nationwide Building Society protects its customers from AI voice cloning scams is by not allowing payments via phone banking. If a customer wants to make a payment to a friend or family member, they can do so at a branch instead.

Nationwide also uses AI to analyze transaction data.

A Nationwide spokesperson said: ‘Nationwide does not allow payments to be made through telephone banking, but we still monitor suspicious voice activity to keep our customers safe.

‘AI and advanced analytics form an important part of our multi-layered fraud prevention framework.

‘We are concerned about the growth of these scams and are working hard to ensure our customers remain protected. This includes the use of advanced analytics and specific voice controls. It is important for consumers to be aware of attacks that may directly affect them, such as people calling and claiming to be family or friends.’

Barclays

Barclays invests in multi-layered security systems that help protect customers. It says these typically prevent several thousand fraudulent transaction attempts each day.

This includes a sophisticated transaction profiling system that is unique to each customer.

A Barclays spokesperson said: “For each of the more than 50 million payments our UK customers make each month, our fraud detection systems and machine learning models determine in less than a second whether it is likely to be a fraudster rather than the customer, or whether our customer appears to be at risk of being scammed.

“If the transaction appears risky, additional checks are presented to the customer before payment is released.”

“In addition to our technical prevention, we work tirelessly to help provide the public with information and tools to detect and stop fraud and scams, including warning messages throughout the payment process, scam education through in-app notifications, social media channels, the press and a dedicated website.”

Starling Bank

Starling launched its Safe Phrases campaign, in support of the government’s anti-fraud campaign, to raise awareness of AI voice cloning scams among customers in response to their rise.

People are encouraged to agree a phrase or password with close friends and family that no one else knows, so that when someone calls asking for money they can verify it is really the person they claim to be.

Then, if they are contacted by someone claiming to be a friend or family member who does not know the phrase, they are immediately alerted to the fact that it is probably a scam.

Santander uses AI powered by Lynx Tech to fight card and payment fraud

Monzo

Monzo introduced new fraud protections for customers earlier this year.

One of them was a trusted contacts feature where customers can choose a friend or family member to verify any bank transfers and savings withdrawals that exceed their set daily allowance.

It involves customers giving consent for selected friends and family to see some details of the transactions they are making. Monzo will then ask the trusted contact to confirm that it is really the customer making the payment and to check that it looks safe.

The idea is that, as people who know the customer, friends and family will be able to alert them if something seems suspicious, for example if they know the customer is not planning any large purchases.

NatWest

A NatWest spokesperson said: “AI voice cloning scams are a threat which we recognize and monitor, using internal and external technical experts to ensure we have robust authentication and detection capabilities to prevent this type of abuse, and that evidence of this protection exists.

“We recognize the opportunity for scammers and the rapid evolution of technologies that are creating more accurate clones of individual voices, and that these synthetic voices can be used to manipulate a customer through social engineering, or in an effort to impersonate our customers to the bank in order to gain access to banking services or customer funds.

“Improving privacy controls on social media is important, but people can also help protect themselves by considering what information they share publicly on social media.”

Woods said: ‘Banks and other financial services firms are now under even more pressure to stop APP [authorised push payment] fraud as a result of the PSR’s new scam refund rules.’

We also contacted HSBC and Lloyds for comment, but both declined.
