Activists have reacted with anger after social media company Meta lowered the age restriction for WhatsApp users to 13 in the UK and EU.
The change to the messaging app, which lowers the age limit from 16 to 13, was announced in February and came into effect in the UK and EU on Wednesday.
Campaign group Smartphone Free Childhood said the move by Meta, which also owns Facebook and Instagram, ran counter to demands for tech companies to do more to protect children.
The group said: “This flies in the face of growing national demand for Big Tech to do more to protect our children.
“Officially allowing anyone over the age of 12 to use your platform (the minimum age was 16 before today) sends a message that it is safe for children.
“But teachers, parents and experts tell a very different story.

“As a community, we are tired of tech giants putting the profits of their shareholders before the protection of our children.”
WhatsApp said the change was bringing the age limit in line with most countries and that protections were in place.
The director of online safety strategy at Ofcom, the UK communications regulator, said it “will not hesitate” to fine social media companies that do not follow its instructions, once it has the power to do so.
Mark Bunting told BBC Radio 4’s Today programme that the watchdog was drafting codes of practice to enforce online safety. “So when our powers come into force next year, we will be able to hold them to account for the effectiveness of what they are doing,” he said.
“If they are not taking those measures at that time and they cannot show us that they are taking alternative measures that are effective in keeping children safe, then we will be able to investigate.
“We have powers to order them to make changes, if we think they need to be made.
“If they do not comply with those instructions, we have powers to impose fines – and we will not hesitate to use those powers – if there is no other way to drive the change we believe is necessary.”
Meta this week unveiled a range of security features designed to protect users, particularly young people, from “sextortion” and the abuse of intimate images.
It confirmed that it will begin testing a filter in direct messages (DMs) on Instagram, called Nudity Protection, which will be activated by default for under-18s and will automatically blur images detected to contain nudity.
Upon receiving nude images, users will also see a message urging them not to feel pressured to respond, and an option to block the sender and report the chat.