Last year, WIRED reported that deepfake pornography is only increasing, with researchers estimating that 90 percent of deepfake videos are pornography, the vast majority of which is nonconsensual pornography of women. But as widespread as the problem is, Kaylee Williams, a researcher at Columbia University who has been tracking legislation on nonconsensual deepfakes, says the lawmakers she has seen are most focused on political deepfakes.
“There are more states interested in protecting election integrity in that way than in addressing the issue of intimate images,” she says.
Matthew Bierlein, a Republican state representative from Michigan who co-sponsored the state’s package of nonconsensual deepfake bills, says he initially came to the issue through political deepfake legislation. “Our plan was to make [political deepfakes] a campaign finance violation if you didn’t put a warning on them to notify the public.” Through his work on political deepfakes, Bierlein says, he began working with Democratic Rep. Penelope Tsernoglou, who helped push the nonconsensual deepfake bills.
By January, nonconsensual Taylor Swift deepfakes had gone viral and the issue had received widespread news coverage. “We thought it was the right time to be able to do something,” Bierlein says. He felt Michigan was poised to be a regional leader in the Midwest because, unlike some of its neighbors, it has a full-time, well-paid legislature (most states don’t). “We understand that it’s a bigger problem than a Michigan problem, but a lot of things can start at the state level,” he says. “If we can get this to happen, then maybe Ohio will adopt this in their legislative session, maybe Indiana will adopt something similar, or Illinois, and that can make enforcement easier.”
But the penalties for creating and sharing nonconsensual deepfakes — and who is protected — can vary widely from state to state. “The U.S. landscape is wildly inconsistent on this issue,” Williams says. “I think there’s been a misconception lately that all these laws are being passed across the country. I think what people are seeing is that there’s been a lot of laws proposed.”
Some states allow both civil and criminal cases to be brought against perpetrators, while others only allow for one or the other. Some laws, such as the one that recently came into force in Mississippi, focus on minors. Over the last year or so, there has been a series of cases of middle and high school students using generative artificial intelligence to create explicit images and videos of classmates, particularly girls. Other laws focus on adults, with lawmakers essentially updating existing laws banning revenge porn.
Unlike laws focused on nonconsensual deepfakes of minors, which Williams says there is broad consensus are an “inherent moral evil,” the consensus around what is “ethical” when it comes to nonconsensual deepfakes of adults is “softer,” she says. In many cases, laws and bills require proving intent: that the goal of the person making and sharing the nonconsensual deepfake was to harm its subject.