
Nearly 4,000 celebrities found to be victims of deepfake pornography


More than 250 British celebrities are among thousands of famous people falling victim to deepfake pornography, an investigation has found.

A Channel 4 News analysis of the five most visited deepfake sites found almost 4,000 famous people were listed, including 255 from Britain.

These include actresses, TV stars, musicians and YouTubers, whose faces have been superimposed on pornographic material using artificial intelligence.

The analysis also found that the five sites attracted 100 million views over a three-month period.

Channel 4 News presenter Cathy Newman, who was among the victims, said: “It feels like a violation. It’s really sinister that whoever put this together, I can’t see them, but they can see this imaginary version of me, this fake version of me.”

Since January 31, sharing such images without consent has been illegal in the UK under the Online Safety Act, but creating them is not. The legislation was passed in response to the proliferation of deepfake pornography made with AI tools and apps.


In 2016, researchers identified a single deepfake pornographic video online. In the first three quarters of 2023, 143,733 new deepfake porn videos were uploaded to the 40 most visited deepfake porn sites – more than in all previous years combined.

Sophie Parrish, 31, from Merseyside, discovered that fabricated nude images of her had been posted online before the legislation was introduced.

She told Channel 4 News: “It’s just very violent, very degrading. It’s like women don’t mean anything, we’re worthless, we’re just a piece of meat. Men can do whatever they want. Before that, I trusted everyone.”

A consultation is underway on how the Online Safety Act, which has suffered numerous delays, will be implemented and enforced by broadcasting watchdog Ofcom.

An Ofcom spokesperson said: “Illegal deepfakes are deeply worrying and damaging. Under the Online Safety Act, businesses will have to assess the risk of such content circulating on their services, take steps to prevent it from appearing and act quickly to remove it when they become aware of it.

“Even though the rules are not yet in effect, we encourage businesses to implement these measures and protect their users now.”

A Google spokesperson said: “We understand how distressing this content can be and are committed to strengthening our existing protections to help those affected.

“Under our policies, users can have pages featuring this content that include their likeness removed from Search. And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search, including tools to help people protect themselves at scale, as well as ranking improvements to address this content more broadly.”

Ryan Daniels of Meta, which owns Facebook and Instagram, said: “Meta strictly prohibits child nudity, content that sexualizes children, and services offering non-consensual AI-generated nude images. Although this app (which creates deepfakes) remains widely available on various app stores, we have removed these ads and the accounts behind them.”
