Millions of people are using abusive AI ‘Nudify’ bots on Telegram

Kate Ruane, director of the free expression project at the Center for Democracy and Technology, says most major tech platforms now have policies prohibiting the non-consensual distribution of intimate images, and many of the largest have agreed to principles for addressing deepfakes. “I would say that it is actually unclear whether the creation or distribution of non-consensual intimate images is prohibited on the platform,” Ruane says of Telegram’s terms of service, which are less detailed than those of other major technology platforms.

Telegram’s approach to removing harmful content has long been criticized by civil society groups, and the platform has historically hosted scammers, far-right groups, and terrorism-related content. Since Telegram CEO and founder Pavel Durov was arrested and charged in France in August in connection with a range of potential offenses, Telegram has begun making some changes to its terms of service and providing data to law enforcement. The company did not respond to WIRED’s questions about whether it specifically prohibits explicit deepfakes.

Executing the Harm

Ajder, the researcher who discovered deepfake Telegram bots four years ago, says the app is almost uniquely positioned for deepfake abuse. “Telegram gives you the search function, so it allows you to identify communities, chats and bots,” says Ajder. “It provides bot hosting functionality, so it’s a place that provides the tools. Then it’s also the place where you can share it and really execute the damage in terms of the end result.”

In late September, several deepfake channels began posting that Telegram had removed their bots. It is unclear what prompted the removals. On September 30, a channel with 295,000 subscribers posted that Telegram had “banned” its bots, but posted a new bot link for users to use. (The channel was removed after WIRED sent questions to Telegram.)

“One of the really worrying things about apps like Telegram is that it’s very difficult to track and monitor, particularly from the perspective of survivors,” says Elena Michael, co-founder and director of #NotYourPorn, a campaign group working to protect survivors of image-based sexual abuse.

Michael says it has been “notoriously difficult” to discuss safety issues with Telegram, but notes that the company has made some progress in recent years. However, she says the company should be more proactive in moderating and filtering content.

“Imagine if you were a survivor who had to do that yourself, surely the burden shouldn’t fall on one individual,” Michael says. “Certainly the burden should be on the business to implement something that is proactive rather than reactive.”
