Terrorism watchdog condemns WhatsApp for reducing minimum age for UK users to 13

The UK’s terrorism watchdog has criticized Mark Zuckerberg’s Meta for lowering the minimum age for WhatsApp users from 16 to 13, warning the “extraordinary” move could expose more teenagers to extreme content.

Jonathan Hall KC said more children would now be able to access material that Meta cannot regulate, including content related to terrorism or sexual exploitation.

Hall, the independent reviewer of counter-terrorism legislation, told LBC radio that WhatsApp’s use of end-to-end encryption – meaning only the sender and receiver can see messages on the app – left Meta unable to remove hazardous material.

“So by reducing the age of the WhatsApp user from 16 to 13, they are effectively exposing three more years within that age group… to content that they cannot regulate,” he said. “So for me to do that is something extraordinary.”

Hall added that children had become increasingly susceptible to terrorist content, citing a record number of arrests last year.

“Last year we arrested 42 children. It is a huge number, the largest ever seen,” he said. “It is now clear that children who are particularly susceptible to terrorist content, children who are particularly unhappy… are a round peg in a square hole. They are looking for meaning in their lives and they find it. And it could be an extremist identity.”

WhatsApp announced the age change for the UK and EU in February and it came into effect on Wednesday. The platform said the change brought the UK and EU age limit in line with other countries, and that protections were in place.

However, child safety advocates also criticized the decision. The group Smartphone Free Childhood said the move “runs counter to the growing national demand for Big Tech to do more to protect our children.”

Concerns about illegal content on WhatsApp and other messaging platforms made end-to-end encryption a battleground in the Online Safety Act, which empowers communications regulator Ofcom to order a messaging service to use “accredited technology” to find and remove child sexual abuse material.

The government has attempted to downplay the provision, saying Ofcom could only intervene if scanning content was “technically feasible” and if the process met minimum standards of privacy and accuracy.

In December, Meta announced that it was rolling out end-to-end encryption to its Messenger app, and Instagram was expected to follow suit.
