A Deepfake Nude Generator Reveals a Chilling Look at Its Victims

Another image on the site showed a group of young teens who appear to be in high school: a boy taking a selfie in what looks like a gym, with two girls smiling and posing beside him. The boy's face was covered by a Snapchat lens that enlarged his eyes so much they hid his features.

Captions on images apparently uploaded by users indicated that they included photos of friends, classmates, and romantic partners. "My girlfriend," reads the caption on one image, which shows a young woman taking a selfie in a mirror.

Many of the photos featured influencers popular on TikTok, Instagram and other social media platforms. Other photos appeared to be Instagram screenshots of people sharing images from their daily lives. One image showed a young woman smiling with a dessert topped with a festive candle.

Several images appeared to show people who were complete strangers to whoever took the photo. One, shot from behind, shows a woman or girl who is not posing for the camera but simply standing near what appears to be a tourist attraction.

Some images in the feeds reviewed by WIRED were cropped to remove the faces of women and girls, with only their chests or crotches visible.

Huge audience

During an eight-day period of monitoring the site, WIRED saw five new images of women appear on the Home feed and three on the Explore page. Statistics on the site show that most of these images have amassed hundreds of views. It’s unclear whether all images submitted to the site make it to the Home or Explore feed, or how views are tabulated. Each post on the Home feed has at least a few dozen views.

Photos of celebrities and people with large followings on Instagram top the list of "most viewed" images on the site. The most-viewed people of all time on the site are actor Jenna Ortega with more than 66,000 views, singer-songwriter Taylor Swift with more than 27,000 views, and an influencer and DJ from Malaysia with more than 26,000 views.

Swift and Ortega have been the targets of deepfake nudes before. The spread of fake nudes of Swift on X in January sparked a moment of renewed discussion about the impact of deepfakes and the need for more legal protections for victims. This month, NBC reported that Meta had hosted ads for a deepnude app for seven months. The app boasted of its ability to "undress" people, using a photo of Jenna Ortega taken when she was 16 years old.

In the US, there is no federal law addressing the spread of false, non-consensual nude images. A handful of states have passed their own laws. But AI-generated nude images of minors fall into the same category as other child sexual abuse material, or CSAM, said Jennifer Newman, executive director of the NCMEC's Exploited Children's Division.

“If it’s indistinguishable from an image of a living victim, of a real child, then that’s child sexual abuse material to us,” Newman said. “And we will treat it as such as we process our reports, as we turn these reports over to law enforcement.”
