
Celebrity deepfake porn cases to be investigated by Meta Oversight Board

by Alexander

As AI tools have become increasingly sophisticated and accessible, so has one of their worst applications: non-consensual deepfake porn. While much of this content is hosted on dedicated sites, more and more is finding its way to social platforms. Today, the Meta Oversight Board announced that it was taking up cases that could force the company to consider how it handles deepfake porn.

The board, an independent body that can issue binding decisions and recommendations to Meta, will focus on two deepfake pornography cases, both involving celebrities whose images were altered to create explicit content. In one case, involving an anonymous American celebrity, deepfake porn depicting the celebrity was removed from Facebook after it had already been flagged elsewhere on the platform. The post was also added to Meta’s Media Matching Service Bank, an automated system that finds and removes images already flagged as violating Meta’s policies, to keep them off the platform.

In the other case, a deepfake image of an anonymous Indian celebrity remained on Instagram even after users reported it for violating Meta’s policies on pornography. The Indian celebrity’s deepfake was removed only once the board took up the case.

In both cases, the images were ultimately removed for violating Meta’s policies on bullying and harassment, not its policies on pornography. Meta, however, prohibits “content that depicts, threatens or promotes sexual violence, sexual assault or sexual exploitation” and does not allow pornography or sexually explicit ads on its platforms. In a blog post published alongside the announcement of the cases, Meta said it removed the posts for violating the “derogatory sexualized photoshops or drawings” portion of its bullying and harassment policy, and that it also “determined that they violated [Meta’s] policy on adult nudity and sexual activity.”

The board hopes to use these cases to examine Meta’s policies and systems for detecting and removing non-consensual deepfake pornography, according to Oversight Board member Julie Owono. “I can already provisionally say that the main problem is probably detection,” she says. “Detection is not as perfect or at least not as efficient as we would like.”

Meta has also long faced criticism for its approach to content moderation outside the US and Western Europe. In these cases, the board has already expressed concern that the American celebrity and the Indian celebrity received different treatment in response to the appearance of their deepfakes on the platform.

“We know that Meta is faster and more efficient at moderating content in some markets and languages than others. By taking one case from the United States and one from India, we want to see if Meta is protecting all women globally in a fair way,” says Oversight Board Co-Chair Helle Thorning-Schmidt. “It is critical that this matter be addressed, and the board looks forward to exploring whether Meta’s policies and enforcement practices are effective at addressing this problem.”
