New usernames aren’t the only change to the popular chat app Discord, which is now used by 150 million people every month. The company is also testing a suite of parental controls that will allow greater supervision of Discord’s youngest users, TechCrunch has learned and Discord confirmed. In a live test being conducted on Discord’s iOS app in the US, the company introduced a new “Family Center” feature, where parents can configure tools that allow them to see the names and avatars of their teen’s recently added friends, the servers the teen has joined or participated in, and the names and avatars of users they’ve directly messaged or interacted with in group chats.
However, Discord clarifies in an informational screen that parents cannot view the content of their teen’s messages or calls to respect their privacy.
This approach, which walks a fine line between the need for parental oversight and a minor’s right to privacy, is similar to how Snapchat implemented parental controls in its app last year. Like Discord’s system, Snapchat only gives parents insight into who their teen is talking to and befriending, not what they’ve typed or the media they’ve shared.
Users participating in the Discord test will see the new Family Center hub linked in the User Settings section of the app, between the Privacy & Security and Profiles sections. From here, parents can read an overview of the Family Center’s features and click a button to get started when they’re ready to set things up.
Image Credits: Discord screenshot via Watchful.ai
Discord explains on this screen that it “built Family Center to bring you more content about how your teen is using Discord so you can work together to build positive online behavior.” It then describes the different parental controls, which let parents see who their teen is chatting with and making friends with, as well as which servers they are joining and participating in.
Similar to TikTok, parents can scan a QR code provided by the teen to place the account under their supervision.

Image Credits: Discord screenshot via Watchful.ai
The screenshots were discovered by the app intelligence firm Watchful.ai. In addition, a handful of users who came across the new experience earlier this year posted their own screenshots of the feature on Twitter.
We reached out to Discord for comment on the tests and showed them some screenshots of the test. The company confirmed the development, but offered no firm commitment on when or if the parental controls feature would actually be rolled out.
“We are always working to improve our platform and keep users safe, especially our younger users,” said a Discord spokesperson. “We’ll let you know if and when anything comes of this work,” they added.
Among other things, the company declined to answer our questions about the scope of the tests or its plans to offer the tools outside the US.

Image Credits: Discord screenshot via Watchful.ai
While Discord is regularly used by a younger, Gen Z audience these days, its roots as a home for gamers have often left it out of the larger conversation about the harm social media causes teens. As executives from Facebook, Twitter, Instagram, Snap, YouTube and TikTok have had to testify before Congress on the issue, Discord has been able to sit on the sidelines.
Hoping to get ahead of expected regulations, most major social media companies have since rolled out parental control features for their apps, if they didn’t already offer such tools. YouTube and Instagram announced plans for parental controls in 2021; Instagram eventually launched them in 2022, with other Meta apps to follow. Snapchat also rolled out parental controls in 2022. And TikTok, which already had parental controls before the congressional scrutiny began, has been beefing them up in recent months.
But due to the lack of regulation at the federal level, several US states have started passing their own laws around social media use, including new restrictions on social media apps in states such as Utah and Montana, as well as broader legislation to protect minors, such as the California Age-Appropriate Design Code Act, which takes effect next year.
Discord has so far flown under the radar despite warnings from child safety experts, police and the media about the dangers the app poses to minors, amid reports that groomers and sexual predators have been using the service to target children. The nonprofit National Center on Sexual Exploitation even added Discord to its “Dirty Dozen List” for failing to “adequately address the sexually exploitative content and activities on its platform,” it says.
The organization specifically cites Discord’s lack of meaningful age verification technology, inadequate moderation, and inadequate safety settings.
Today, Discord offers an online safety center that guides users and parents on how to manage a secure Discord account or server, but it doesn’t go so far as to actually provide parents with tools to monitor their child’s use of the service or prevent them from joining servers or communicating with strangers. The new parental controls won’t address those last two concerns, but they’re at least an acknowledgment that some sort of parental controls are needed.
This is a shift from Discord’s previous stance on the matter. As the company told The Wall Street Journal in early 2021, its philosophy was to put users first, not their parents, and it said at the time that it had no plans to add such a feature.