YouTube to restrict teens' exposure to weight and fitness videos

YouTube will stop recommending videos to teenagers that idealise specific fitness levels, body weights or physical features, after experts warned that such content could be harmful if viewed repeatedly.

The platform will still allow 13- to 17-year-olds to watch the videos, but its algorithms won’t push young users down rabbit holes of related content afterward.

YouTube said the content did not violate its guidelines, but that repeated viewing could affect some users’ well-being.

YouTube’s global head of health, Dr. Garth Graham, said: “As a teen develops ideas about who they are and their own standards for themselves, repeated consumption of content that presents idealized standards that begin to shape an unrealistic internal standard could lead some to form negative beliefs about themselves.”

YouTube said experts on its youth and family advisory committee had warned that certain categories that may be “harmless” as a single video could be “problematic” if viewed repeatedly.

The new guidelines, now introduced in the UK and globally, apply to content that:

- idealises some physical features over others, such as beauty routines to make the nose look slimmer;
- idealises physical condition or body weight, such as exercise routines that encourage the pursuit of a certain look;
- encourages social aggression, such as physical bullying.

YouTube will no longer repeatedly recommend these topics to signed-in teens who have registered their age on the platform. The safety framework has already been implemented in the US.

“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages, and those messages can affect how some teens view themselves,” said Allison Briscoe-Smith, a clinician and YouTube adviser. “Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and evaluate how they want to show up in the world.”


In the UK, the newly enacted Online Safety Act requires tech companies to protect children from harmful content and to take into account the potential for their algorithms to expose under-18s to harmful material. The act addresses the ability of algorithms to cause harm by delivering large amounts of content to a child in a short space of time, and requires tech companies to assess any risks such algorithms may pose to children.
