Evey Morton started her first Instagram account when she was 10 years old. She used it to keep up with her friends, but also to follow pop culture trends. Now 16, the San Diego high school student says all the airbrushed perfection and carefully edited selfies from celebrities and influencers made her focus too much on her appearance, leading to anxiety and body image issues.
“Being exposed to that at a very young age influenced the way I grew up,” Morton said. “There’s a big part of me that wishes social media didn’t exist.”
Morton’s struggles inspired her filmmaker mother, Laura, to direct Anxious Nation, a documentary about the so-called anxiety epidemic among American teens. When Morton learned last week that Meta had set new rules for teen accounts, she thought it was a good start, but not a solution.
Instagram owner Meta has implemented changes that give parents the ability to set daily time limits on the app and block teens from using Instagram at night. Parents can also see the accounts their kids are messaging, along with the categories of content they view. Teens’ accounts are now private by default, and Meta said “sensitive content” — which could range from violence to influencers promoting plastic surgery — will be “limited.”
These rules will take effect for existing teen accounts within 60 days. Teens under 16 who want to remove or modify the settings need a parent's permission; 16- and 17-year-olds can change them without adult help. (One easy loophole for teens: lying about their age. Meta says it is working on better age-verification measures to keep teenagers from circumventing the restrictions.)
“I think these changes are very positive in many ways, especially in restricting sensitive content, but I don’t think they’re a solution,” Morton said. “Especially for teenage girls, if you ask them what their biggest problem with Instagram is, they’ll tell you it’s body image.”
The issue of teen safety has dogged Meta since its beginnings as Facebook, and these new rules come amid renewed backlash from parents and watchdog groups. Instagram has come under fire for failing to protect kids from child predators and for feeding them self-harming content. While testifying at a Senate hearing on online child safety in January, Meta CEO Mark Zuckerberg apologized to parents in the audience who held signs with photos of children who had died by suicide or been exploited on the app.
And according to a 2021 Wall Street Journal investigation, Instagram researchers have spent years studying how the app harms young users, especially young girls. An internal slide from a 2019 company meeting read: “We make body image issues worse for one in three teenage girls.” Until recently, executives such as Zuckerberg and Adam Mosseri, the head of Instagram, downplayed these concerns.
The Kids Online Safety Act, a bill that passed the Senate this summer, would establish guidelines aimed at protecting minors from harmful content on social media, including disabling “addictive” features on the platforms. A House panel approved the bill last week.
Jim Steyer, founder and CEO of Common Sense Media, an organization that promotes safe technology for children, called the timing of Meta’s announcement “transparent.”
“This is basically another attempt to make a flashy announcement at a time when the company is under political pressure, period,” Steyer said. “Meta has always had these capabilities and the ability to develop new features, and they could have done so to protect young people over the last 10 years. Now that we’re in the midst of a mental health crisis among young people that’s largely been driven by social media platforms like Instagram, they’re acting under pressure from lawmakers and advocates.”
This summer, Vivek H. Murthy, the U.S. surgeon general, called on Congress to require a warning label on social media, similar to those on cigarettes and alcohol. Describing the mental health crisis among young people as an “emergency,” Murthy cited the fact that teens who spend more than three hours a day on social media face twice the risk of anxiety and depression symptoms, and that nearly half of all teens say these apps make them feel worse about their bodies.
For Jon-Patrick Allem, an associate professor at the Rutgers School of Public Health who researches the effects of social media on teens, Instagram’s new rules don’t seem radical. “The New York Times described these rules as a ‘sweeping overhaul,’” he said. “I can’t think of a worse way to describe it. I think instead these are small tweaks to an app that will probably do some good, but not enough.”
Stephen Balkam, founder of the Family Online Safety Institute, is concerned that regulators and researchers won’t see the internal data from Instagram’s new rules for teens. “Without a requirement that [Meta] share its data about child safety, I don’t know if this will change things or not,” he said.
A recent Harris Poll survey of 1,006 Gen Z adults (ages 18 to 27), published in the New York Times, found that 34% of respondents wished Instagram had never been invented. Even more said the same of TikTok and X: 47% and 50%, respectively.
Morton says that among her classmates, TikTok and Snapchat are the most popular social media apps, but she still checks Instagram several times a day. “I tend to open the app, refresh my likes feed, close it, and open it again, like, five minutes later,” she said.
Morton added that she would “love” to have a phone with no social media — just her contacts, the iMessage app and the camera. “That would be a dream,” she said. “People ask me, ‘Why can’t you just delete social media?’ But it’s not that easy. It’s where all my friends are. I would miss parties and hangouts. If I deleted it, I guarantee I’d get it back within 24 hours.”