In the three days following the terrorist attacks carried out by Hamas against Israel on October 7, Meta says it removed "seven times more content on a daily basis" for violating its Dangerous Organizations and Individuals policy in Hebrew and Arabic compared to the two months prior. The revelation came as part of a blog post in which the social media company outlined its moderation efforts during the ongoing war in Israel.
Although it does not mention the EU or its Digital Services Act, Meta's blog post was published days after European Commissioner Thierry Breton wrote an open letter to Meta reminding the company of its obligations to limit disinformation and illegal content on its platforms. Breton wrote that the Commission is "seeing an increase in illegal content and disinformation being spread in the EU through certain platforms" and "urgently" called on Meta CEO Mark Zuckerberg to "ensure that its systems are effective." The commissioner has sent similar letters to X, the company formerly known as Twitter, as well as TikTok.
Nearly 800,000 pieces of content "removed or marked as disturbing"
Meta says it "removed or marked as disturbing" more than 795,000 pieces of content in Hebrew and Arabic for violating its policies in the three days after October 7, and notes that Hamas is banned from its platforms. The company also says it is taking additional temporary measures, such as blocking certain hashtags and prioritizing reports related to Facebook and Instagram Live content about the crisis. It adds that it is deleting some content without disabling the associated accounts, because the higher volume of removals means some content may be taken down by mistake.
Even so, Meta's recent moderation record hasn't been perfect. Members of its Trusted Partner program, which is supposed to let expert organizations flag concerning Facebook and Instagram content to the company, have complained about slow response times, and the company has faced criticism over shifting moderation policies around the war between Russia and Ukraine.
X's outline of its moderation efforts around the conflict does not mention the languages spoken by its response team. The European Commission has since formally sent X a request for information under the Digital Services Act, citing the "alleged dissemination of illegal content and disinformation," including "the dissemination of terrorist and violent content and hate speech."