
My nude images were shared online when I was 14 and inspired me to fight for other survivors – here’s why I think Big Tech is to blame

Leah Juliett, now 27, has become an advocate against the abuse enabled by technology.

Aged just 14, the nude images Leah Juliett sent to a boy on Facebook were shared online, first with other children at school and then on anonymous internet message boards.

Juliett, who identifies as non-binary and uses “they” pronouns, told DailyMail.com: “It all started on an iPhone. Then it circulated on Facebook.

“The abuse images he had made then found their permanent home on an anonymous image forum called Anon IB. That website still exists today.”

The advocate, now 27, says the experience was devastating and inspired them to become an activist fighting what they see as the inaction of big tech companies to prevent image-based child sexual abuse.


They continued: “When this happened to me, when I was 14, I wanted to die. I tried to die. I can’t say that enough. My mission is to fight for accountability for Big Tech and for justice for survivors.”

In 2017, they launched the March Against Revenge Porn across the Brooklyn Bridge, beginning a journey as an advocate against technology-enabled abuse that ultimately led Juliett to the White House.

Juliett is now campaigning for the Heat Initiative, which aims to hold Apple accountable for the dissemination of abusive images on the company’s iCloud.

They said: “I really used my shame as a force for social good. But I’m only 27. I didn’t want or expect this to be my life. When I was little, I wanted to be a singer.

“But because this is my life, and because it is sadly still this way for so many vulnerable teens and children across our country and around the world, I still carry the trauma with me.

“It’s a deep-rooted part of who I am and a core reason I do the work I do. But I’m stronger now. I’ve created a toolbox to reclaim the shame I experienced and use it for good.”

Juliett told this website that since 2017, the language around the issue has changed enormously.

Juliett said: ‘The whole landscape of the issue (of revenge porn) has changed since… when I first marched across the Brooklyn Bridge.

‘We don’t use that term anymore because I didn’t do anything to justify revenge against my body. And non-consensual nudity is not pornography.

“We’re talking about sexual abuse based on images and child sexual abuse material. These are more accurate terms to describe the real crimes that happen to children every day across the country.”

They added that “millions” of internet users around the world are victims of similar abuse and that “the phone is the transmission mechanism.”

The key to defeating image-based abuse, they told DailyMail.com, is bipartisan legislation and education.

But Big Tech is also part of the problem.

Juliett said: “It’s an important moment for us to look up and recognize that we can’t fix the problem at the waterhole. We have to fix it at the source. And in my work, and in my experience over the last decade as a survivor and an expert in this field, I’ve recognized that that source is the iPhone.

‘What people don’t realize is that these tech companies, including Apple, especially Apple, are not simply innovation labs, as Apple likes to refer to itself, they are companies that deliver products and services.’

Juliett believes there is little legislation constraining Big Tech, unlike grocery stores, for example, which are not allowed to sell products that poison people.

They added: ‘These are companies that provide services to people and people are suffering severe harm at the hands of their products.

“I personally believe there are many things they can do to prevent this kind of harm. And there is a very clear reason why they don’t do it: because they continually prioritize profit over people.”

Data from the National Center for Missing and Exploited Children (NCMEC) suggested that Apple reported 267 cases of child sexual abuse material (CSAM) worldwide between April 2022 and March 2023.

By comparison, the number of iPhone users worldwide is estimated to be over one billion.

When Juliett was 14, nude images they sent to a boy were shared online.

Juliett told this website: ‘They could offer a more robust reporting mechanism on their platforms. For example, we know that Meta has a strong track record of reporting to the National Center for Missing and Exploited Children.

‘Apple, on the other hand, doesn’t have nearly as strong a reporting record, but we know abuse is happening on iCloud.’

Apple said in 2021 that it would implement ‘NeuralHash’, an algorithm designed to detect known CSAM in iCloud Photos.

But several months later, the program was suspended due to privacy concerns.

Juliett said: ‘The most basic thing they could do today is to start basic hash-match detection in iCloud, which essentially turns a known piece of CSAM into a unique string of numbers through an algorithm. It turns the image into a kind of fingerprint and then compares it to a list of other fingerprints.

“You could do it. Start it today and save children’s lives today by detecting known images of child sexual abuse.”
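Hash matching, the technique Juliett describes, works by reducing each image to a short digital fingerprint and checking that fingerprint against a database of fingerprints of already-identified abuse material. The sketch below is a minimal illustration of the idea in Python, not Apple’s actual system: the `KNOWN_HASHES` database is a hypothetical stand-in for the vetted hash lists maintained by clearinghouses such as NCMEC, and it uses an ordinary cryptographic hash, whereas real tools such as Microsoft’s PhotoDNA or Apple’s shelved NeuralHash rely on perceptual hashes that still match after an image has been resized or recompressed.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of fingerprints of known abuse
# images. In a real deployment this would be loaded from a vetted hash
# list supplied by a clearinghouse such as NCMEC.
KNOWN_HASHES: set[str] = set()


def fingerprint(image_path: Path) -> str:
    """Reduce an image file to a fixed-length fingerprint string.

    SHA-256 is used purely for illustration: it only matches
    byte-identical files. Perceptual hashes (PhotoDNA, NeuralHash)
    also match copies that have been resized or recompressed.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def is_known_image(image_path: Path) -> bool:
    """Compare the image's fingerprint against the known-image list."""
    return fingerprint(image_path) in KNOWN_HASHES
```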

In a company response to the Heat Initiative regarding its reversal, Apple’s director of user privacy and child safety, Erik Neuenschwander, said: “Child sexual abuse material is abhorrent, and we are committed to breaking the chain of coercion and influence that makes children susceptible to it.”

However, he said, after working with privacy and security experts, digital rights groups and child safety advocates, Apple determined it could not continue with its CSAM scanning mechanism, even one specifically built to protect privacy.

Neuenschwander wrote: ‘Scanning every user’s private data stored in iCloud would create new threat vectors that data thieves could find and exploit. It would also create the potential for a slippery slope of unintended consequences.

‘Scanning for one type of content, for instance, opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems across content types.’

Juliett, now 27, said the experience was devastating.

DailyMail.com reached out to Apple for comment and was directed to an earlier statement from Apple to the Heat Initiative.

The statement said: ‘Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it.

‘We are proud of the contributions we have made so far and intend to continue working collaboratively with child safety organizations, technologists and governments on lasting solutions that help protect the most vulnerable members of our society.

‘With regard to helping children stay safe, we have made significant contributions toward this goal by developing a number of innovative technologies.

‘As you point out, we decided not to pursue the proposal of a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago, for several good reasons.

‘After consulting extensively with child safety advocates, human rights organizations, privacy and security technologists, and academics, and considering scanning technology from virtually every angle, we concluded that it was not practically possible to implement it without ultimately compromising the security and privacy of our users.

‘Companies often use cloud scanning of personal data to monetize their users’ information. While some companies have justified these practices, we have chosen a very different path: one that prioritizes the security and privacy of our users. In our view, scanning every user’s private content stored in iCloud could have serious unintended consequences for our users.’

The full statement can be read here.

Juliett campaigns against image-based abuse in a variety of ways, including through poetry.

But Juliett said they will keep fighting.

They told DailyMail.com: ‘I tell a lot of stories through poetry. And I will continue to use my voice to tell my story and shout my poems… wherever the wind takes me until I see large-scale reform in the tech sector.

‘When I started the March Against Revenge Porn in 2016, it felt like a very lonely fight. But 10 years later, I realized that I didn’t have to be alone. I don’t have to be alone.

“I am now part of an incredible group of survivors and allies. And if I were to lead the same march today, I know I would have hundreds of survivors, friends, by my side. Going public with my story has been incredibly difficult. But I know that this is what I was born to do.”
