Apple is introducing a new feature to iMessage in Australia that will allow children to report nude images and videos sent to them directly to the company, which could then report the messages to the police.
The change arrives in Thursday’s beta releases of Apple’s operating systems for Australian users. It extends the communication safety measures that have been enabled by default since iOS 17 for users under 13, but are available to all users. Under those existing safety features, an iPhone automatically detects images and videos containing nudity that children may receive or attempt to send in iMessage, AirDrop, FaceTime and Photos. Detection occurs on-device to protect privacy.
If a sensitive image is detected, the young user is shown two intervention screens before they can continue and is offered resources or a way to contact a parent or guardian.
With the new feature, when the warning appears, users will also have the option to report the images and videos to Apple.
The device will prepare a report containing the images or videos, as well as the messages sent immediately before and after them. It will include contact information for both accounts, and users will be able to fill out a form describing what happened.
The report will be reviewed by Apple, which may take action on an account, such as disabling that user’s ability to send messages via iMessage, and may also report the issue to law enforcement.
Apple said it plans to roll out the feature initially in Australia in the latest beta update, with a global rollout to follow.
The timing of the announcement, and the choice of Australia as the first region to receive the feature, coincides with new industry codes coming into effect: by the end of 2024, technology companies will be required to monitor terrorism and child abuse content on cloud and messaging services operating in Australia.
Apple had warned that the draft code would not protect end-to-end encryption and would leave the communications of everyone who uses the services vulnerable to mass surveillance. Australia’s eSafety commissioner ultimately watered down the codes, allowing companies that believe compliance would break end-to-end encryption to demonstrate the alternative actions they would take instead to tackle child abuse and terrorist content.
Apple has faced heavy criticism from regulators and law enforcement authorities around the world for its reluctance to compromise end-to-end encryption in iMessage for law enforcement purposes. Apple abandoned plans to scan photos and videos stored in its iCloud product for child sexual abuse material (CSAM) in late 2022, prompting more rebukes. Apple, WhatsApp and other encryption advocates say any backdoor to encryption endangers the privacy of users globally.
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of grossly underestimating how often CSAM appears in its products, The Guardian revealed in July.
In 2023, Apple submitted only 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing and Exploited Children (NCMEC), a far lower number than other tech giants in the industry: Google reported more than 1.47 million and Meta more than 30.6 million, according to NCMEC’s annual report.