Instagram, owned by Facebook, is introducing changes designed to make the app safer for young people. Going forward, anyone who signs up for the service and is under 16 years old (or under 18 in certain countries) will have their account set to private by default, although the option to switch to public will still be available. Anyone under these ages who now has a public account will receive a notification encouraging them to switch to private.
Instagram has been moving toward private accounts as the standard for young people for a while. In March, it began showing young people who signed up for Instagram a notice extolling the virtues of a private account. Now private is the default.
Facebook is also changing how advertisers can target users under the age of 18. Previously, any user could be targeted based on their interests and activity: information Facebook collects from across the web, not just its own properties, drawn from browsing history, app usage, and the like. Advertisers can now target users under 18 only by age, gender, and location. This applies to users on Instagram, Messenger, and Facebook.
On Instagram, the company also says it is doing more to limit interactions between problematic users and users under the age of 16. The company says it can identify "potentially suspicious behavior" from accounts, meaning, for example, that an account was recently blocked or reported by a younger person. These suspicious users are effectively walled off from users under 16: their accounts will not appear in under-16 users' Explore, Reels, or Suggested Accounts, they will not see comments from under-16 users on other people's posts, and they will not be able to comment on content posted by users under 16.
"We're trying to find out if an adult is exhibiting suspicious behavior," Karina Newton, Instagram's head of public policy, told NBC News. "The adult may not have broken the rules yet, but may be doing things that cause us to look more closely."
Instagram has previously used its ability to identify suspicious accounts to alert teens when they received a direct message from one of these users. It also blocks adults from messaging teens who don't already follow them.
Even as Facebook works to make Instagram safer and more private for teens, it is still developing an app for kids under 13 (the current minimum age to sign up for Instagram). The plans, first reported by BuzzFeed News in March, drew widespread criticism and complaints.
Instagram's Newton told NBC News that the under-13 app is still in the works and that the company is in "deep consultation with child development experts and privacy advocates" to meet the needs of families and young people.
Newton said, “We want to build something that appeals to tweens and works for parents.”