
GitHub’s deepfake porn crackdown still not working


“When we look at intimate image abuse, the vast majority of the tools and weaponization come from the open source space,” Ajder says. But they often start with well-intentioned developers, he says. “Someone creates something they think is interesting or cool, and someone with bad intentions recognizes its malicious potential and uses it as a weapon.”

Some, like the repository deactivated in August, have communities built around them expressly for explicit uses. The model was positioned as a tool for deepfake porn, Ajder says, becoming a “funnel” for abuse that primarily targets women.

Other videos uploaded to the porn streaming site by an account crediting AI models downloaded from GitHub featured the faces of popular deepfake targets, including the celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as other less famous but very real women, superimposed onto sexual situations.

The creators freely described the tools they used, including two that GitHub removed but whose code survives in other repositories.

Perpetrators seeking deepfakes congregate in many places online, including in covert community forums on Discord and in plain sight on Reddit, compounding deepfake prevention efforts. One Redditor offered his services using software from the archived repository on September 29. “Could someone do it with my cousin?” asked another.

Torrents of the main repository banned by GitHub in August are also available in other corners of the web, showing how difficult it is to police open source deepfake software across the board. Other deepfake porn tools, such as the DeepNude app, have similarly been taken down before new versions appeared.

“There are so many models, so many different forks of the models, so many different versions, that it can be difficult to track them all,” says Elizabeth Seger, director of digital policy at the British cross-party think tank Demos. “Once a model is made publicly available for download, there is no way to publicly reverse it,” she adds.

A deepfake porn creator with 13 manipulated explicit videos of female celebrities credited a prominent GitHub repository marketed as an “NSFW” version of another project that encourages responsible use and explicitly asks users not to use it for nudity. “Learn all the Face Swap AI available on GitHUB, without using online services,” his profile on the tube site cheekily says.

GitHub had already disabled this NSFW version by the time WIRED identified the deepfake videos. But other repositories billed as “unlocked” versions of the model were available on the platform as of January 10, including one with 2,500 “stars.”

“It is technically true that once [a model] exists, it cannot be reversed. But we can still make it difficult for people to access,” says Seger.

If left unchecked, she adds, the potential for harm from deepfake “porn” is not just psychological. Knock-on effects include the intimidation and manipulation of women, minorities, and politicians, as seen with political deepfakes affecting female politicians worldwide.

But it’s not too late to get the problem under control, and platforms like GitHub have options, Seger says, including intervention at the upload point. “If you put a model on GitHub and GitHub says no, and all the hosting platforms say no, it becomes harder for a normal person to get that model.”

Controlling deepfake porn created with open source models also depends on policymakers, technology companies, developers and, of course, the creators of abusive content themselves.

At least 30 US states also have some legislation addressing deepfake porn, including bans, according to the nonprofit Public Citizen’s legislation tracker, although the definitions and policies are disparate and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law, after the government announced on January 7 that it will criminalize the creation of sexually explicit deepfakes, as well as their sharing.
