A predator used a 12-year-old girl’s face to make porn. She helped pass a law to make it a crime.

Kaylin Hayman in Ventura, California, on October 3. Photograph: Leafy Yun Ye/The Guardian

Last year, Kaylin Hayman traveled to a Pittsburgh courtroom to testify against a man she had never met, who had used artificial intelligence to create pornographic images of her face.

Kaylin, 16, is a child actor who starred in the Disney show Just Roll With It from 2019 to 2021. The perpetrator, a 57-year-old man named James Smelko, targeted her because of her public profile. She is one of roughly 40 of his victims, all of them child actors. In one of the images of Kaylin presented as evidence at trial, Smelko had taken her face from a photo posted to Instagram when she was 12, working on set, and superimposed it on another person’s naked body.

“I’ve had my fair share of uncontrollable crying because I don’t understand how some people are so evil,” she tells The Guardian in an interview. “I can never understand that.”

Kaylin lives in Ventura, California; Smelko was living in Pennsylvania when he committed these crimes against her. She was shocked to learn that his case could be brought to trial only because it was an interstate crime. Possession of depictions of child sexual abuse is a crime under US federal law, but at the time it was not illegal under California state law.

Kaylin turned her horror into action. This year, she became a vocal public advocate for AB 1831, a California bill that expands the scope of the state’s existing child sexual abuse material (CSAM) laws to include images and videos digitally altered or generated by artificial intelligence. In June, she testified in support of the bill at the state Capitol in Sacramento.

“I talked about how I felt violated and that I was absolutely shocked that this wasn’t already a crime in California,” Kaylin says. “California is a very important part of the acting industry and there are many children who were not protected from this crime.”

In late September, California Governor Gavin Newsom signed the measure into law. Child predators who create such material can face prison sentences and fines of up to $100,000 in the state.

While the new law focuses on AI in the hands of child predators, Kaylin and her parents, Mark and Shalene Hayman, say other factors in her life exposed her to Smelko and people like him.

The Hayman family in Ventura, California, on October 3. Photograph: Leafy Yun Ye/The Guardian

Kaylin was 10 years old when she first opened her Instagram account. The platform requires users to be at least 13 to register, with the exception of parent-managed accounts. Smelko downloaded photos from her profile to create sexual images that combined her face with the naked bodies of other girls and women.

“Disney created their Instagram account specifically to promote the show and themselves,” says Mark. “But when these companies employ these kids and force them to post there and don’t support them, that’s where the biggest problem lies.”

That support should include training on how to handle harassment and block accounts, as well as counseling, he says. Kaylin also blames Disney.

“Disney’s PR team had all the Disney kids and me sign up for an app. They used to send us clips to post on Instagram every week when an episode came out,” says Kaylin. “It all started with my work and them planting that seed. I would like them to take some responsibility, but that hasn’t happened yet.”

In recent years, men have harassed Kaylin through her Instagram and TikTok accounts by sending her nude photos. She reported the spam messages to both social media companies, but says no action has been taken.

“She’s certainly had her fair share of creepy stalkers who continue to make fun of her,” Shalene says.

The California State Capitol in Sacramento. Photograph: Backyard Productions/Alamy

Mark believes that SAG-AFTRA, the Hollywood actors’ union, also needs to be more proactive in educating its members about the risks of predators using AI and social media to victimize public figures. Both parents periodically review Kaylin’s accounts, which she still uses and to which they have access.

“We read a lot of comments and think, ‘What’s wrong with people?’, but I don’t know if we can avoid it. It’s hard to be in this industry and not be on social media,” Shalene says. “I would like to see social media companies do some responsible censorship and protections.”

In recent years, Instagram has announced several initiatives to increase protections for users under 16, including parental controls and restrictions on who can send them messages. In September, the company announced it would make all accounts of users under 18 private by default, a move praised by child safety advocates. The same restrictions apply to verified minor accounts, per Meta guidelines.

“There are so many inappropriate images circulating on Instagram. I just don’t understand why they can be sent to children,” says Kaylin, who turns 17 this month. “Instagram should say, ‘No, that’s not allowed’ and delete them. But that doesn’t happen and I don’t understand it.”

Meta said in a statement: “We have detailed and robust policies against nudity and child exploitation, including real images and those created with GenAI.”

Kaylin Hayman testifies in support of AB 1831 at a public safety committee hearing in Sacramento, California, in April. Photograph: Courtesy of the Ventura County District Attorney’s Office

“SAG-AFTRA has been educating, negotiating and legislating about the dangers of deepfake technology since at least 2018,” said Jeffrey Bennett, general counsel for SAG-AFTRA. Bennett highlighted the guild’s publication of a magazine on deepfakes and participation in panels and articles published on the topic.

Disney had no comment.

The circulation of CSAM online is increasing. Predators have long used photo-editing software, but recent advances in artificial intelligence models make it easy to mass-produce more realistic child abuse images. In 2023, the National Center for Missing and Exploited Children (NCMEC), a US-based clearinghouse for global CSAM reporting, received 36.2 million reports of online child abuse, 12% more than the previous year. The majority came from Meta.

While most of those reports related to real photographs and videos of children being sexually abused, NCMEC also received 4,700 reports of images or videos of child sexual exploitation created with generative AI. The organization has criticized AI companies for not actively trying to prevent or detect the production of CSAM.

Kaylin says discovering that her face had been used to create CSAM marked the end of her childhood innocence. She is now more nervous about her own safety and that of the other children and teenagers she knows.

“If I see a man or someone looking at me a little funny or weird, I’m always nervous,” she says. “I’m always thinking about the worst that can happen in certain situations. I think it’s something that young women have had to get used to. It’s unfortunate that I had to wake up to it at 16. I guess it’s part of life,” she adds.

“I’m always thinking about the worst that can happen in certain situations,” says Kaylin Hayman. Photograph: Leafy Yun Ye/The Guardian

Testifying at Smelko’s trial a year ago helped her regain some control over the situation, she says. In court, as she focused on answering the prosecutor’s questions and looking toward the jury, she stole a quick glance at the stranger on trial for sexually exploiting her.

“When I got to see him, it seemed like he had a very sad life and had probably been inside for a long time, because he wasn’t a first-time offender,” she says. After she testified, Smelko was found guilty of two counts of possession of child pornography.

Kaylin is determined to continue acting and hopes to appear in films one day. For now, she is focused on finishing her final year of high school and on her advocacy against online child exploitation. The ordeal has also awakened a new ambition in her: she wants to go to law school so she can one day become a children’s rights lawyer.

“I am very lucky that my case was not worse. I know a lot of people have it worse than me,” she says. “I’m trying to add a little good to something so bad.”

In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact the Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
