A new AI-powered software tool makes it easy for anyone to generate realistic nude images of women simply by feeding the program a photo of the target wearing clothes.
The app is called DeepNude, and it is the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard's Samantha Cole. It is available as a free download for Windows, with a premium version that offers higher-resolution output images available for $99.
Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as "fake." But in the photos produced by Motherboard, this watermark is easy to remove. (We could not test the app ourselves because its servers are apparently overloaded.)
As we have seen with earlier examples of deepfake pornography, the quality of the output varies. It is certainly not photo-realistic, and on close inspection the images can easily be recognized as fake. The AI-generated flesh is blurry and grainy, and the process works best with high-resolution images in which the target is already wearing revealing clothing, such as a bathing suit.
But at lower resolutions – or if they are only briefly viewed – the fake images are easily confused with the real thing and can cause untold damage to the lives of individuals.
While much of the discussion about the potential harm of deepfakes has focused on political disinformation and propaganda, the use of this technology to target women has been constant since its inception. Indeed, that is how the tech first spread, with users on Reddit adapting AI research published by academics to create fake pornography of celebrities.
A recent report from the Huffington Post highlighted how being the target of deepfake pornography and fake nudes can upend one's life. As with revenge porn, these images can be used to embarrass, harass, intimidate, and silence women. There are forums where men pay experts to create fake nudes of colleagues, friends, or family members, but with tools such as DeepNude it is easy to create such images privately, at the touch of a button.
Notably, the app is unable to generate nude photos of men. As Motherboard reported, if you give it a picture of a man, it simply adds a vulva.
The creator of the DeepNude app, who identified himself as "Alberto," told Motherboard that he was inspired by memories of old comic-book ads for "X-ray specs," which promised that they could be used to see through people's clothing. "Like everyone else, I was fascinated by the idea that they could really exist and this memory remained," Alberto said.
He says he is a "technology enthusiast" rather than a voyeur, and is motivated by curiosity and enthusiasm for AI, as well as a desire to see if he can get "economic returns" from his experiments.
Alberto says he considered the potential for harm from this software, but ultimately decided it was not a deterrent. "I also said to myself: the technology is ready (within everyone's reach)," Alberto told Motherboard. "So if someone has bad intentions, DeepNude doesn't change much … If I don't do it, someone else will do it in a year."
We contacted Alberto to ask further questions, and he replied briefly that the app was made for fun and that he did not expect it to become so popular. He again compared the software to Photoshop, saying the latter can be used to achieve the same results as DeepNude "after a half-hour tutorial on YouTube." He added that if people start using the software for malicious purposes, "we will surely abandon it." (We followed up to ask what he would consider a malicious use case, and how he would know about it, but have not yet received a response.)
One downside Alberto does seem concerned about is the potential legal implications: the DeepNude license agreement claims that "any photo edited by this software is considered a false parody," and that the app is an "entertainment service" that "does not promote sexually explicit images." This is an absurd claim to make, given the name of the app, how it is marketed, and, well, its entire functionality.
Deepfakes, however, exist in a legal gray area, with lawyers saying that AI-created images could constitute defamation, but that removing them from the internet could violate the First Amendment. An exception could be when the technology is used to generate nude images of minors, something DeepNude appears to be capable of.
Politicians around the world, including in the US, are now starting to grapple with the potential harm caused by deepfakes. But such legislation will be slow to arrive, and lawmakers' main priority is stopping the spread of political disinformation. In the meantime, apps like DeepNude will almost certainly proliferate, offering faster and more realistic deepfakes in the years to come.