Lawmakers want to take legal protection out of the Facebook news feed

Democratic lawmakers want social networks to be held legally accountable for recommending harmful content to users. Representatives Anna Eshoo (D-CA), Frank Pallone Jr. (D-NJ), Mike Doyle (D-PA), and Jan Schakowsky (D-IL) introduced the “Justice Against Malicious Algorithms Act,” which would amend Section 230’s protections to exclude “personalized recommendations” for content that contributes to physical or serious emotional harm.

The bill follows a recommendation Facebook whistleblower Frances Haugen made to Congress last week. Haugen, a former employee who leaked extensive internal Facebook research, encouraged lawmakers to crack down on algorithms that promote, rank, or otherwise order content based on user engagement. The bill applies to web services with more than 5 million monthly visitors and excludes certain categories of material, including infrastructure services such as web hosting and systems that return search results.

For covered platforms, the bill targets Section 230 of the Communications Decency Act, which shields web services from lawsuits over third-party content that users post. The new exception would let those suits proceed if a service knowingly or recklessly used a “personalized algorithm” to recommend the third-party content in question, which could include posts, groups, accounts, and other user-provided information.

The bill wouldn’t necessarily let people sue over the kind of material Haugen criticized, including hate speech and anorexia-related content. Much of that material is legal in the United States, so platforms wouldn’t face liability for it even without Section 230’s shield. (A statement from Pallone also criticized sites for promoting “extremism” and “disinformation,” which are likewise not necessarily illegal.) The bill also covers only personalized recommendations, defined as sorting content with an algorithm that “relies on information specific to an individual.” Businesses could apparently still use large-scale analytics to recommend the most popular general content.

In her testimony, Haugen suggested the goal was to add general legal risk until Facebook and similar companies stopped using personalized recommendations altogether. “If we reform [Section] 230 to hold Facebook accountable for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” she said.