When James Irungu accepted a new job at the technology outsourcing company Samasource, his manager gave him few details before his training began. But the position was highly sought after and would almost double his salary to £250 a month. It also offered him a way out of Kibera, the vast shantytown on the outskirts of Nairobi where he lived with his young family.
“I thought I was one of the lucky ones,” the 26-year-old said. But then he found himself sifting through reams of violent and sexually explicit material, including gruesome accidents, suicides, beheadings and child abuse.
“I remember one day I went online and saw a child with his stomach open, suffering but not dead,” the Kenyan told The Guardian. It was when he came across child exploitation material “that I really realized that this was something different.”
Samasource had hired him to moderate Facebook content, removing the most toxic posts. Some of the most tormenting images remained etched in his mind, occasionally waking him up with night sweats. Fearing that talking about his work would cause discomfort, concern, or judgment from others, he kept it to himself.
Exasperated by his “secrecy,” his wife distanced herself. Irungu resigned himself to their separation, convinced he was protecting her, and stayed on the job for three years. He now says he regrets staying.
“I don’t think the job is suitable for human beings,” he said. “It really isolated me from the real world because I started to see it as a very dark place.” He became afraid to let his daughter out of his sight.
“When I ask myself if it was really worth sacrificing my mental health for the money, the answer is no.”
Another former moderator said she was alarmed by some of the content and that some coworkers quit. But she found purpose in her managers’ assurances that their work protected users, including young children like her own.
“I felt like I was helping people,” she said. But when she stopped, she realized that the things she had normalized were disturbing.
She remembered once screaming in the middle of the office floor after seeing a horrific scene. Except for a few looks from coworkers and a team leader who took her aside and told her to “go to wellness therapy,” it was as if nothing had happened, she said. The wellness counselors told her to take some time to rest and get the image out of her head.
“How do you forget, when you’re back on the floor after a 15-minute break, moving on to the next thing?” she said. She questioned whether the counselors were qualified psychotherapists and said they never escalated a case for mental health care, no matter what the moderators had seen or how distressed they were.
She went from being the type of person who saw friends at every opportunity to barely leaving her house, crying over the deaths of people she didn’t know, and feeling numb and mentally disturbed, sometimes battling suicidal thoughts.
“The work damaged me, I could never do it again,” said the woman, who hopes the case will have an impact on the content moderation industry in Africa, as global demand for this type of service grows.
“Things have to change,” she said. “I would never want anyone to go through what we went through.”