Major tech companies including Google, Apple, and Discord have allowed people to quickly sign up for harmful “nudify” websites, which use artificial intelligence to remove clothing from real photos and make victims appear “naked” without their consent. More than a dozen of these deepfake websites have been using the tech companies’ login buttons for months.
A WIRED analysis found that 16 of the largest so-called nudify websites use login infrastructure from Google, Apple, Discord, Twitter, Patreon, and Line. This approach allows people to easily create accounts on the deepfake websites (and gives the sites a veneer of credibility) before they pay for credits and generate images.
While bots and websites that create non-consensual intimate images of women and girls have existed for years, their number has surged with the arrival of generative AI. This kind of “nudify” abuse is alarmingly widespread, with teenagers reportedly creating such images of their peers. Critics say tech companies have been slow to address the scale of the problem: the websites appear at the top of search results, paid ads promote them on social media, and apps offering similar features appear in app stores.
“This is a continuation of a trend that normalizes sexual violence against women and girls by big tech,” says Adam Dodge, an attorney and founder of EndTAB (Ending Technology-Enabled Abuse). “Login APIs are tools of convenience. We should never turn sexual violence into an act of convenience,” he says. “We should be putting walls around access to these apps, and instead we’re giving people a drawbridge.”
The login tools analyzed by WIRED, which are implemented through common APIs and authentication methods, allow people to use existing accounts to join deepfake websites. Google’s login system appeared on 16 websites, Discord’s on 13, and Apple’s on six. The X button was on three websites, and Patreon’s and messaging service Line’s appeared on the same two websites.
WIRED is not naming the websites, as they enable abuse. Several of them are part of larger networks operated by the same people or companies. The login systems have been used even though the tech companies generally have rules stating that developers cannot use their services in ways that cause harm or harassment, or that invade people’s privacy.
When contacted by WIRED, spokespeople for Discord and Apple said they had removed the developer accounts connected to the websites. Google said it will take action against developers when it finds its terms have been violated. Patreon said it bans accounts that allow the creation of explicit images, and Line confirmed it is investigating but said it could not comment on specific websites. X did not respond to a request for comment on how its systems are being used.
In the hours after Jud Hoffman, Discord’s vice president of trust and safety, told WIRED that the company had terminated the websites’ access to its APIs for violating its developer policy, one of the nudify websites posted on a Telegram channel that authorization through Discord was “temporarily unavailable” and said it was attempting to restore access. That nudify service did not respond to WIRED’s request for comment on its operations.
Rapid expansion
Since deepfake technology first emerged at the end of 2017, the volume of non-consensual intimate videos and images being created has grown exponentially. While videos remain more difficult to produce, the creation of images through “nudify” or “strip” websites and apps has become commonplace.
“We need to be clear that this is not innovation, this is sexual abuse,” says David Chiu, the San Francisco city attorney, who recently filed a lawsuit against nudify websites and their creators. Chiu says the 16 websites at the center of his office’s lawsuit have had roughly 200 million visits in the first six months of this year alone. “These websites engage in the horrific exploitation of women and girls around the world. These images are used to intimidate, humiliate, and threaten women and girls,” Chiu alleges.