I was sickened to discover I had been made a victim of deepfake pornography, created by AI from a single photo, writes highly respected Channel 4 broadcaster CATHY NEWMAN.

Her colleagues discovered a deepfake pornographic video of Channel 4 broadcaster Cathy Newman while investigating the rise of the technology.

Sitting in front of my laptop, I watched a naked woman with my face having graphic penetrative sex in a variety of positions with a naked man. The pornographic video lasted three minutes and 32 seconds and, grotesque as it seemed to me, I forced myself to watch the whole thing. I needed to understand exactly how realistic these images are, and how easy it is for people to access them online.

Because, as convincing as the footage seemed, it wasn’t me at all: my face had been superimposed on another woman’s body using artificial intelligence (AI) to create what is known as “deepfake” pornography.

The video was discovered by my colleagues at Channel 4 while investigating the alarming, exponential rise of deepfake porn for a special report that aired last month.

Of the 4,000 celebrities they found appearing in deepfake porn videos online, 250 were British, and one of them was me.

None of the celebrities we approached would speak publicly about it. Although I was disappointed, I understood: they didn’t want to perpetuate the abuse they had suffered by drawing more attention to it.

But for our research to have maximum impact, I knew I needed to speak up.

In my 18 years as a Channel 4 journalist, I have sadly seen many distressing images of sexual violence. So while I was nervous about becoming part of the story, I assumed I would be inured to the content of the video itself.

But in reality it left me disturbed and haunted. I had been violated by a perpetrator whom, as far as I know, I have never met, and I was the victim of a very modern crime that risks having a corrosive effect on future generations of women.

I also felt vindicated in my decision to go public because, earlier this month, the Government announced that creating these sexually explicit deepfakes will become a criminal offence in England and Wales.

I understand that Laura Farris, Minister for Victims and Safeguarding, was partly motivated to take action after seeing our research. This follows the ban on sharing this type of content introduced under the Online Safety Act last year.

My colleagues were already investigating deepfake pornography when, in January, fake and explicit images of singer Taylor Swift went viral on X/Twitter, with one image being viewed 47 million times before being removed.

The alarming magnitude of the problem suddenly became clear. We found that the four most popular deepfake porn sites hosting doctored images and videos of celebrities had nearly 100 million views in just three months, and that more deepfake porn videos were created in 2023 than in every other year since 2017 combined.

The videos have been viewed in total more than 4.2 billion times.

You might think that a certain degree of technical expertise is required to make them, but it’s incredibly easy and is primarily done using smartphone “nudification” apps – there are over 200 available. Users submit an image (a single photograph of someone’s face taken from social media is all it takes) and this is used to create a horrifyingly realistic explicit image.

Because there are so many celebrity photographs online, we hear most often about high-profile personalities becoming victims. They include US Congresswoman Alexandria Ocasio-Cortez, who this month described the trauma of discovering in February, while in a meeting with aides, that she had been targeted, and Italian Prime Minister Giorgia Meloni, who is seeking damages after fake videos of her were uploaded online.

But arguably the biggest victims are the hundreds of thousands of women without a public platform from which to denounce the images as deepfakes: the women who might be in a meeting or job interview, not knowing whether the people in front of them have seen, and been deceived by, the fake footage.

A deepfake recreation of the broadcaster. Of the 4,000 celebrities they found appearing in deepfake porn videos online, 250 were British, and one of them was me, Cathy writes.

I spoke to one such victim, Sophie Parrish, 31, a florist and mother of two from Merseyside, whose deepfake porn video was uploaded to a website by someone close to her family; men then photographed themselves masturbating over it. She was physically sick when she found out, and the impact on her since then has been profound.

A beautiful woman, she has lost her confidence and no longer wants to wear make-up for fear of attracting attention. She almost blames herself, although obviously she is not to blame. And yet she had the courage to go public last February, petitioning the Ministry of Justice to make it illegal to create and share explicit images without consent.

In truth, I wasn’t entirely surprised when my colleagues told me about my video’s existence, given that, as a woman in the public eye, I’ve been relentlessly trolled for years.

After my interview with Jordan Peterson, the Canadian psychologist famous for his divisive views on political correctness, free speech, gender identity and racial privilege, went viral in 2018, I received death threats. I was called a ‘c***’, ‘b****’ and ‘p****’, and my eldest daughter, who was 13 at the time, was distressed when she came across a meme on Instagram in which my head had been imposed on a pornographic image.

So it’s understandable that my colleagues didn’t want me to feel under any pressure to watch the video they had found of me, and that my editor was worried about its emotional impact. But I felt I owed it to every victim of this crime – especially Sophie Parrish, whom I had interviewed the day before – to understand for myself what it felt like to be targeted, and to speak out.

Of course, I have access to professionals who can help me process the material, but many women (and 98 percent of deepfake porn victims are women) do not. I was worried about how my daughters, who are now 19 and 15, would react, but like all teenagers they are aware of the type of AI content proliferating online, and were interested in how we can navigate it.

After seeing the report, they told me they were proud. My husband was too, although he understandably didn’t want to see my unedited video, and I didn’t want him to either.

While the pornographic meme my daughter saw in 2018 was crude, I discovered that, six years later, the digital terrain has changed and the lines between what is real and what is not have been blurred.

The only saving grace of my surprisingly sophisticated deepfake video was that the AI can’t (yet) replicate my curly hair, and the bleached blonde bob was clearly not mine. However, I found the images of me having sex with a man who, presumably, had also not given consent for his image to be used, to be incredibly invasive.

But I also wanted to be filmed watching it, to show in our reporting the extent of the impact it had on me.

Even though it had obviously been done remotely, by a perpetrator whose motives I can only guess at, I felt violated.

Anyone who knows me would realize that I would never take part in making a porn video, and one advantage of getting older is that you’re less bothered by puerile abuse. But its existence undermines and dehumanizes women. It is a deliberate attempt to belittle and degrade. Even when they know they are watching deepfake porn, men don’t seem to care.

Seventy percent of viewers arrive at deepfake porn sites via search engines. When we contacted Google, a spokesperson said that it understands how distressing the images can be, that it is developing additional safeguards to help people protect themselves, and that victims can have pages featuring this content removed from search results.

Since our investigation, two of the largest deepfake sites, including the one hosting my video, have blocked UK users from accessing their content. But the video is still available through a virtual private network (a VPN) that hides the user’s location.

The Government’s legislation banning the creation of these videos – which will carry a criminal record, a fine and possible jail time, and which will be introduced as an amendment to the Criminal Justice Bill – is groundbreaking, but experts I have spoken to have already warned of possible legal loopholes.

Victims will have to prove that the video was made with the intention of causing distress, which can be difficult, and there is a question mark over whether you are beyond the reach of the law if you simply ask an app to create the explicit content.

Another drawback is that many of these videos are made outside the UK where our legislation does not apply, so global action is also needed.

Then there’s the matter of timing: Ofcom, the broadcasting watchdog, is still consulting on the rules underpinning the legislation that made it illegal to share these videos, which will not come into force until the end of the year – by which time hundreds of thousands more women will have become victims.

Regulation also lags far behind the technology that enables this crime, so ultimately it comes down to the big tech companies that spread this explicit content, attracting viewers and advertisers to their platforms for profit.

They are far more powerful than any individual jurisdiction, and I see no evidence that they are addressing the issue with the urgency it requires.

I believe it is within their power to immediately stop the circulation of these videos, but it is not in their best interest to do so.

I was worried about possible backlash for being a part of this story, but the overwhelming reaction has been supportive, whether on social media, in my email inbox, or on the street.

And a month later, as depressed as I am about the corrosive effect of AI on future generations of women, I’m glad I went public.
