Starting Nov. 2, YouTube will impose restrictions on how often teens receive repeated recommendations for videos related to sensitive topics like body image, the company announced Thursday.
YouTube says the new safeguards are the result of its partnership with the Youth and Families Advisory Committee, which is made up of psychologists, researchers and other experts in child development, children’s media and digital learning. For years, the committee has advised YouTube about the potentially harmful mental health effects that repeated exposure to certain online content can have on teenagers.
“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages, and those messages can affect the way some teens view themselves,” said Allison Briscoe-Smith, a clinician, researcher and member of the Youth and Families Advisory Committee, in a press release. “Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and evaluate how they want to appear in the world.”
YouTube worked with the advisory committee to identify categories of videos that could pose a problem if viewed repeatedly. Now, teen viewers will no longer receive repeated video recommendations for content that “compares physical characteristics and idealizes some types over others, idealizes specific fitness levels or body weights, or displays social aggression in the form of non-contact fighting and bullying.”
YouTube also announced other product updates related to teen wellness, including more frequent and more prominent “take a break” and bedtime reminders. YouTube is also turning its crisis resource panel, which connects users searching for queries like “eating disorders” with live support from crisis service partners, into a full-page experience. The panels will now feature more visually prominent resources for third-party crisis hotlines, and will attempt to redirect such searches with topic suggestions like “self-compassion” or “grounding exercises.”
Additionally, YouTube says it is working with the World Health Organization (WHO) and Common Sense Networks to develop educational resources for parents and teens. The resources will include guidance on how to create online videos safely and empathetically, as well as how to respond to comments, among other topics.
The new safeguards also arrive as YouTube and other social networks face mounting legal trouble this year. In June, a Maryland school district sued Meta, Google, Snap and TikTok owner ByteDance for allegedly contributing to a “mental health crisis” among students.
“Over the past decade, defendants have relentlessly pursued a growth-at-all-costs strategy, recklessly ignoring the impact of their products on the physical and mental health of children,” the lawsuit states. “In a race to corner the ‘valuable but untapped’ market of preteen and teen users, each defendant designed product features to promote repetitive and uncontrollable use by children.”
YouTube will begin limiting repeated video recommendations for teens in the U.S. on November 2 before expanding to other countries in 2024.