A Los Angeles judge has refused to dismiss a series of blockbuster lawsuits alleging that Meta, TikTok, Snap and Google deliberately designed their platforms to addict minors and fuel mental disorders in teens, making it more likely that the companies will have to litigate or settle the product liability claims for billions of dollars.
In the first ruling to weigh a novel theory of liability raised in hundreds of lawsuits brought by public officials and parents of minors, Los Angeles Superior Court Judge Carolyn Kuhl found on Friday that the companies cannot wield Section 230, Big Tech’s favored legal shield, to escape some claims in the case. She pointed to allegations that “the design features of the platforms – and not the specific content users view” caused the plaintiffs’ injuries.
Thousands of plaintiffs across the country have sued social media companies, arguing that their platforms are essentially flawed products that lead to eating disorders, anxiety and suicide, among other mental health issues. The lawsuits could lead to billions of dollars in payouts; similar public nuisance suits brought by government officials against opioid and tobacco companies ended in massive settlements. By steering clear of claims that target the specific content the companies host, the plaintiffs seek to avoid the potential immunity provided by Section 230, which has historically given tech companies broad legal protection from liability over third-party content.
Kuhl allowed a negligence claim to advance, finding that the defendants cannot rely on the law to dismiss allegations over allegedly defective design features because they do not relate to third-party content. She pointed to a federal appeals court decision last year that narrowed the application of Section 230 by concluding that Snap could face a lawsuit alleging that the company’s design of a speedometer feature encouraged a fatal speeding crash.
“The features themselves would addict and harm minor users of the platforms, regardless of the specific third-party content viewed by the minor user,” reads the ruling, which cited TikTok’s continuous scrolling feature and the inability to disable autoplay.
Other potentially problematic product features include lenses and filters, which allegedly promote body image issues among teens, and the lack of parental controls, which supposedly encourages minors to create secret accounts to mask their use. Kuhl said Section 230 “does not provide immunity” when a “provider manipulates third-party content in a way that harms a user.”
Under the order, a jury will decide whether users’ addiction to the platforms was caused by third-party content or the apps’ design features.
Additionally, Kuhl declined to dismiss allegations that Meta fraudulently concealed internal research showing the negative impact Instagram can have on the mental health of minors, including data showing that “high-time users” are disproportionately young and reports that teenagers pointed to Instagram as a source of increased anxiety and depression. Parents argued that they would not have let their children use the platform had they known.
“Meta is not shielded from tort liability for its own failure to warn because these adverse effects that allegedly should have been disclosed resulted from Meta’s own conduct, and not from any specific content displayed,” Kuhl wrote, noting that Meta may have had a duty to warn of potential harms as the creator of features designed to maximize minors’ engagement.
In a loss for the plaintiffs, product liability claims were dismissed because such claims are typically reserved for “tangible products” that are mass manufactured and marketed.
Under the ruling, claims for strict liability, negligent design and negligent undertaking, among others, were dismissed. Plaintiffs were given the opportunity to amend their allegations.
In a statement, a Google spokesperson said the “allegations in these complaints are simply not true.” He added: “Protecting children on our platforms has always been at the heart of our work. Working with child development specialists, we’ve developed customized experiences for kids and families on YouTube and provide robust controls for parents.”
Snap declined to comment. Meta and TikTok did not respond to requests for comment.
The plaintiffs’ attorney, Brian Panish, said in a statement that “this decision is an important step forward for the thousands of families we represent whose children suffer permanent debilitating mental health issues thanks to these social media giants.” He emphasized that “tech companies like Meta, Snap Inc., ByteDance and Google are extremely powerful and have little industry-specific regulation to keep them in check.”