Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram and Threads and replacing its army of paid moderators with a Community Notes model that mimics X’s much-maligned volunteer program, which allows users to publicly flag content they believe to be incorrect or misleading.
In a blog post announcing the news, Meta’s newly appointed global affairs chief Joel Kaplan said the decision was made to allow more topics to be openly discussed on the company’s platforms. The changes will first affect the company’s moderation in the United States.
“We will allow more expression by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations,” Kaplan said, although he did not detail which topics these new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content return to people’s feeds, as well as posts about other topics that have inflamed the culture wars in the US in recent years.
“We’re going to simplify our content policies and remove a lot of restrictions on topics like immigration and gender that are simply out of touch with mainstream discourse,” Zuckerberg said.
Meta has significantly rolled back the fact-checking and content moderation policies it had implemented in the wake of revelations in 2016 about influence operations run on its platforms that were designed to sway elections and, in some cases, promote violence and even genocide.
Before last year’s high-profile elections around the world, Meta was criticized for taking a hands-off approach to the moderation of content related to those votes.
Echoing comments Mark Zuckerberg made last year, Kaplan said Meta’s content moderation policies had been implemented not to protect users but “partly in response to social and political pressure to moderate content.”
Kaplan also criticized fact-checking experts for their “biases and perspectives,” which he said led to over-moderation: “Over time we ended up with too much content being fact-checked that people would understand as legitimate political speech and debate,” Kaplan wrote.
However, WIRED reported last year that dangerous content such as medical misinformation has flourished on the platform, while groups such as anti-government militias have used Facebook to recruit new members.
Meanwhile, Zuckerberg blamed “legacy media” for forcing Facebook to implement content moderation policies after the 2016 election. “After Trump was first elected in 2016, legacy media wrote endlessly about how disinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming arbiters of truth, but fact-checkers have simply been too politically biased and have destroyed more trust than they have created.”
In a bid to eliminate bias, Zuckerberg said Meta’s internal trust and safety team would move from California to Texas, which is now also home to X’s headquarters. “As we work to promote free expression, I think that will help us to build trust to do this work in places where there is less concern about our teams’ biases,” Zuckerberg said.