EU investigates Facebook owner Meta over child safety, mental health concerns

The European Commission has opened an investigation into the owner of Facebook and Instagram over concerns that the platforms are creating addictive behavior among children and harming mental health.

The EU executive said Meta may have violated the Digital Services Act (DSA), a landmark law passed by the bloc last summer that holds digital companies large and small accountable for disinformation, shopping scams, child abuse and other online harms.

“Today we opened a formal procedure against Meta,” EU Commissioner for the Internal Market Thierry Breton said in a statement. “We are not convinced that it has done enough to meet the DSA’s obligations to mitigate the risks of negative physical and mental health effects on young Europeans on its Facebook and Instagram platforms.”

The investigation will explore the platforms’ potential addictive effects, known as “rabbit hole” effects, in which an algorithm feeds young people negative content, such as unrealistic body images. It will also examine the effectiveness of Meta’s age verification tools and its privacy protections for minors. “We spare no effort to protect our children,” Breton said.

A Meta spokesperson said: “We want young people to have safe, age-appropriate online experiences and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge facing the entire industry, and we look forward to sharing details of our work with the European Commission.”

Last month, the commission opened an investigation into Meta under the DSA over its handling of political content amid concerns that it was not doing enough to counter Russian disinformation ahead of the EU elections in June.

Under the DSA, platforms are obliged to protect the privacy and safety of children. Following a preliminary investigation, EU officials are concerned that Facebook and Instagram “may exploit the weaknesses and inexperience of minors and lead to addictive behavior.”

They are also skeptical about the platforms’ age verification tools. Users must be at least 13 years old to open an account on Facebook or Instagram.

One official said it was “so obviously easy to circumvent some controls” that the commission wanted to know how Meta had assessed that these measures could be effective and appropriate.

The commission also launched two investigations into TikTok, which led the Chinese-owned video-sharing platform to voluntarily withdraw a service in France and Spain last month. That followed the opening of DSA proceedings against X over alleged hate speech and against the online commerce site AliExpress over its advertising transparency and complaints handling.

The DSA, which came into effect in February for platforms operating in Europe, was intended to force powerful online platforms that were “too big to care” to take responsibility for online safety.

If the commission is not satisfied with Meta’s response, it can impose a fine equivalent to 6% of the company’s global turnover. More immediately, it can conduct on-site inspections and interview company executives; it has not publicly set a deadline for completing the investigation.
