Tech companies must ‘tame’ algorithms under Ofcom child safety standards

Social media companies have been told to “tame aggressive algorithms” that recommend content harmful to children, as part of Ofcom’s new safety codes of practice.

The children's safety codes, introduced under the Online Safety Act, allow Ofcom to set tough new rules for how internet companies interact with children. Services will be expected either to make their platforms safe for children by default, or to implement robust age checks to identify children and provide them with a safer version of the experience.

For those sites with age controls, Ofcom will require algorithmic curation to be modified to limit risks to younger users. That would require sites like Instagram and TikTok to ensure that suggested posts and “for you” pages explicitly take children’s ages into account.

They will also have to make extra efforts to crack down on the spread of harmful content, such as “violent, hateful or abusive material, online harassment and content that promotes dangerous challenges.”

The most seriously harmful content, including that related to suicide, self-harm and eating disorders, will need to be kept completely out of children’s feeds, as will pornography.

Enforcing the new requirements will pose a challenge. Algorithmic curation is often described as a “black box,” with some companies unsure how their own systems decide what content to promote and suppress. But Ofcom is confident its enforcement will be effective, says Gill Whitehead, the regulator’s online safety lead.

“We’ve spoken to 15,000 children over the last two years, and they tell us the types of harmful content they see, how it appears and how often they see it. And we also have very strong data collection powers to request that data and require technology companies to provide it to us.

“The big change is that the most harmful content (that children) see must be filtered out so that they do not see it. And then harmful content, such as violent content, harmful substances, or dangerous challenges and stunts, needs to be ranked lower, so they see it much less frequently. Those powerful combinations of volume and intensity will not be as prolific and harmful to children as they are today.”

The draft code is open for consultation until July 17, before it is finalized and presented to Parliament. Services will then have three months to carry out their own children's risk assessments, which must be completed before implementation begins.

Ofcom chief executive Dame Melanie Dawes said: “We want children to enjoy life online. But for too long, their experiences have been ruined by seriously harmful content that they cannot avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.

“In line with new online safety laws, our proposed codes firmly place the onus on tech companies to keep children safer. They will have to tame aggressive algorithms that send harmful content to children in their personalized feeds and introduce age controls so that children get an age-appropriate experience.

“Our measures, which go far beyond current industry standards, will be a step-change in online safety for children in the UK. Once they come into force, we will not hesitate to use our full range of enforcement powers to hold platforms to account. That is a promise we make to children and parents today.”

UK technology secretary Michelle Donelan said: “The government has tasked Ofcom with enforcing the law, and today the regulator has been clear: platforms must introduce the kinds of age checks young people experience in the real world, and address algorithms that too readily expose them to harmful material online.

“Once implemented, these measures will bring a fundamental change to the way UK children experience the online world.”

Child online safety campaigner Ian Russell, father of 14-year-old Molly Russell, who took her own life in November 2017 after viewing harmful material on social media, said more still needs to be done to protect young people from harm online.

In his role as chair of the online safety charity the Molly Rose Foundation, Russell said: “Ofcom’s task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm.

“The regulator has proposed some important and welcome measures, but its overall set of proposals must be more ambitious to prevent children encountering the kind of harmful content that cost Molly her life.”