Less than a day after receiving widespread attention, the deepfake app that used AI to create fake nude photos of women has been shut down. In a tweet, the team behind DeepNude said they had greatly underestimated interest in the 'project' and that 'the probability that people will misuse it is too high'.
DeepNude is no longer offered for sale, and no further versions will be released. The team also warned against sharing the software online, saying that doing so would violate the app's terms of service. They acknowledged, however, that "certainly some copies" will get out.
Motherboard first drew attention to DeepNude yesterday afternoon. The app, available for Windows and Linux, used AI to alter photos so that a person appears naked, and was designed to work only on women. It had been on sale for a few months, and the DeepNude team says that "frankly, the app isn't that great" at what it does.
But it still worked well enough to raise broad concerns about its use. Although people have long been able to manipulate photos digitally, DeepNude made that capability instantly and widely available. Such photos can then be used to harass women: deepfake software is already used to edit women into porn videos without their consent, leaving them with little recourse to protect themselves once those videos are distributed.
The creator of the app, who goes only by Alberto, told The Verge earlier today that he believed someone else would soon make an app like DeepNude if he didn't do it first. "The technology is ready (within everyone's reach)," he said. Alberto said the DeepNude team will know "for sure" when they see the app being abused.
DeepNude's team ends the message announcing the shutdown by saying "the world is not ready for DeepNude," as if there will come a time when the software can be used responsibly. But deepfakes will only become easier to make and harder to detect, and ultimately the problem is not knowing whether something is real or fake: with these apps, images of people can be abused quickly and easily. For now, there are few or no safeguards to prevent that from happening.