More than 140 Kenyan Facebook moderators diagnosed with severe PTSD

More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content, including murders, suicides, child sexual abuse and terrorism.

The moderators worked eight to 10 hours a day at a facility in Kenya for a company contracted by the social media firm. Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi, found they suffered from post-traumatic stress disorder (PTSD), generalized anxiety disorder (GAD) and major depressive disorder (MDD).

The mass diagnoses were made as part of a lawsuit filed against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

Images and videos that included necrophilia, bestiality and self-harm caused some moderators to faint, vomit, scream and flee their desks, the documents allege.

The case is shedding light on the human cost of the boom in social media use in recent years, which has required ever more moderation, often in some of the poorest parts of the world, to protect users from the worst material that some people post.

At least 40 of the moderators in the case abused alcohol, drugs such as cannabis, cocaine and amphetamines, and medications such as sleeping pills. Some reported marriage breakdown and a collapse in desire for sexual intimacy, and a loss of connection to their families. Some whose job it was to remove videos uploaded by terrorist and rebel groups feared that they would be watched and attacked, and that if they returned home they would be hunted down and killed.

Facebook and other big social media and AI companies rely on armies of content moderators to remove posts that violate their community standards and train AI systems to do the same.

Moderators from Kenya and other African countries were tasked from 2019 to 2023 with checking posts coming from Africa and in their own languages, but were paid eight times less than their counterparts in the US, according to the claim documents.

Medical reports submitted to Nairobi’s employment and industrial relations tribunal and seen by The Guardian paint a horrifying picture of working life inside the facilities contracted by Meta, where workers were fed a constant stream of images to check in a cold, warehouse-like space, under bright lights and with their work activity monitored to the minute.

Nearly 190 moderators are bringing the multi-pronged claim, which includes allegations of intentional infliction of mental harm, unfair labor practices, human trafficking and modern slavery, and unlawful dismissal. The 144 examined by Kanyanya were found to have PTSD, GAD and MDD, with severe or extremely severe PTSD symptoms in 81% of cases, mostly at least a year after they had left.

Meta and Samasource declined to comment on the claims due to the litigation.

Martha Dark, founder and co-chief executive of Foxglove, a UK-based nonprofit that backed the court case, said: “The evidence is indisputable: moderating Facebook is dangerous work that inflicts lifelong PTSD on almost everyone who moderates it.”

“In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… In any other industry, if we discovered that 100% of safety workers were being diagnosed with an illness caused by their work, the people responsible would be forced to resign and face the legal consequences for mass violations of people’s rights. That is why Foxglove is supporting these brave workers to seek justice in the courts.”

According to documents filed in the Nairobi case, Kanyanya concluded that the primary cause of the mental health conditions among the 144 people was their work as Facebook content moderators, as they “encountered extremely graphic content on a daily basis, including videos of gruesome murders, self-harm, suicides, suicide attempts, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions, just to name a few.”

Four of the moderators suffered from trypophobia, an aversion or fear of repetitive patterns of small holes or bumps that can cause intense anxiety. For some, the condition developed from seeing holes in decomposing bodies while working on Facebook content.

Moderation and the related task of tagging content are often hidden parts of the tech boom. Similar, but less traumatic, arrangements are made for contract workers to tag masses of images of mundane things like street furniture, living rooms, and road scenes so that artificial intelligence systems designed in California know what they are looking at.

Meta said it took the support of content reviewers seriously. Its contracts with third-party content moderation firms for Facebook and Instagram detailed expectations around 24-hour on-site counseling, training and support, and access to private healthcare. Meta said pay was above industry standards in the markets where the firms operated, and that it used techniques such as blurring, muting sound and rendering in monochrome to limit exposure to graphic material for the people who review content on the two platforms.
