Child sexual abuse content grows online with AI-created images, report says

Child sexual exploitation is increasing online and taking new forms, such as images and videos generated by artificial intelligence, according to an annual assessment released Tuesday by the National Center for Missing and Exploited Children (NCMEC), a US-based clearinghouse for reporting child sexual abuse material.

Reports to NCMEC about online child abuse increased more than 12% in 2023 compared with the previous year, surpassing 36.2 million reports, the organization said in its annual CyberTipline report. The majority of reports concerned the circulation of child sexual abuse material (CSAM), such as photos and videos, but there was also a rise in reports of financial sextortion, in which an online predator lures a child into sending nude images or videos and then demands money.

According to NCMEC, some children and families were extorted for financial gain with the use of AI-generated CSAM.

The center received 4,700 reports of images or videos of sexual exploitation of children made by generative AI, a category it only began tracking in 2023, a spokesperson said.

“NCMEC is deeply concerned about this rapidly growing trend, as bad actors may use artificial intelligence to create falsified sexually explicit images or videos based on any photograph of a real child, or generate CSAM depicting computer-generated children engaging in graphic sexual acts,” the NCMEC report states.

“For the children who appear in deepfakes and their families, it is devastating.”

According to the organization, AI-generated child abuse content also hinders the identification of real child victims.

Creating such material is illegal in the United States, as making visual depictions of minors engaging in sexually explicit conduct is a federal crime, according to a Massachusetts-based Justice Department prosecutor, who spoke on condition of anonymity.

In total, in 2023, the CyberTipline received more than 35.9 million reports of suspected CSAM incidents, more than 90% of them uploaded outside the US. According to Tuesday’s report, approximately 1.1 million reports were referred to police in the United States, and 63,892 reports were urgent or involved a child in imminent danger.

There were 186,000 reports of online enticement, up 300% from 2022. Enticement is a form of exploitation in which an individual communicates online with someone believed to be a child with the intent to commit a sexual offense or abduction.

The platform that submitted the most CyberTipline reports was Facebook, with 17,838,422. Meta’s Instagram made 11,430,007 reports and its WhatsApp messaging service generated 1,389,618. Google sent NCMEC 1,470,958 tips, Snapchat sent 713,055, TikTok sent 590,376, and Twitter reported 597,087.


In total, 245 companies submitted CyberTipline reports to NCMEC, out of 1,600 companies around the world registered to participate in the CyberTipline reporting program. US-based internet service providers, including social media platforms, are legally required to report CSAM to the CyberTipline when they become aware of it.

According to NCMEC, there is a disconnect between reporting volumes and the quality of the reports submitted. The center and authorities cannot take legal action on some reports, including those generated by content moderation algorithms without human involvement, a technicality that can prevent police from ever seeing reports of possible child abuse.

“The relatively low number of reporting companies and the poor quality of many reports mark the continued need for action by Congress and the global technology community,” the NCMEC report states.

In the US, call or text the Childhelp abuse hotline at 800-422-4453 or visit its website for more resources and to report child abuse. For adult survivors of child abuse, help is available at ascasupport.org. In the United Kingdom, the NSPCC offers support to children on 0800 1111 and to adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support to adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800 or Bravehearts on 1800 272 831, and adult survivors can contact the Blue Knot Foundation on 1300 657 380. Other sources of help can be found at International Child Helplines.
