Pedophiles create images of naked children with AI to extort them, charity says

by Elijah

Pedophiles are being urged to use artificial intelligence to create nude images of children in order to blackmail them into sending more extreme material, according to a child abuse charity.

The Internet Watch Foundation (IWF) said a manual found on the dark web contained a section encouraging criminals to use “nudifying” tools to remove clothing from underwear photos sent by a child. The doctored image could then be used against the child to blackmail them into sending more graphic content, the IWF said.

“This is the first evidence we have seen that perpetrators advise and encourage each other to use AI technology for these purposes,” the IWF said.


The charity, which finds and removes child sexual abuse material online, warned last year of a rise in cases of sextortion, where victims are manipulated into sending graphic images of themselves and are then threatened with the release of those images unless they pay money. It also noted the first examples of AI being used to create “shockingly realistic” abuse content.

The anonymous author of the online manual, which is almost 200 pages long, boasts of having “successfully blackmailed” 13-year-old girls into sending nude images online. The IWF said the document had been handed over to the UK’s National Crime Agency.

Last month, The Guardian revealed that the Labour Party was considering banning nudification tools that allow users to create images of people without clothes.

The IWF has also said that 2023 was “the most extreme year on record.” Its annual report says the organization found more than 275,000 web pages containing child sexual abuse last year, the highest number recorded by the IWF, with a record amount of “category A” material, which can include the most severe imagery, including rape, sadism and bestiality. The IWF said more than 62,000 pages contained category A content, compared with 51,000 the previous year.

The IWF found 2,401 images of self-generated child sexual abuse material (where victims are manipulated or threatened into recording the abuse themselves) taken by children aged between three and six. Analysts said they had seen abuse in domestic settings, including bedrooms and kitchens.

Susie Hargreaves, chief executive of the IWF, said opportunistic criminals trying to manipulate children were “not a distant threat”. She said: “If children under six are targeted in this way, we need to have age-appropriate conversations now to make sure they know how to spot the dangers.”

Hargreaves added that the Online Safety Act, which became law last year and imposes a duty of care on social media companies to protect children, “must work.”

Security Minister Tom Tugendhat said parents should talk to their children about their use of social media. “Platforms you may consider safe can pose a risk,” he said, adding that technology companies should introduce stronger safeguards to prevent abuse.

According to research published last week by communications regulator Ofcom, a quarter of children aged three to four own a mobile phone and half of those under 13 are on social media. The government is preparing to launch a consultation in the coming weeks that will include proposals to ban the sale of smartphones to under-16s and raise the minimum age for social media sites from 13 to 16.
