
Facebook moderators become 'addicted' to extremist and fringe content, report claims, as they work to rid the platform of child pornography, violence and more

  • A new report highlights more dangers that Facebook's content moderators face
  • Some moderators become 'addicted' to extreme content, sources say
  • Others have had their political and social views influenced by content
  • The fight against child exploitation has become a major focus for moderators
  • Long hours, heavy workloads and insufficient guidance have plagued moderators

Fighting the scourge of extreme content on Facebook takes a heavy psychological toll on some of the company's moderators, a new report says.

According to Berlin-based contractors interviewed by The Guardian, constant exposure to the platform's underbelly has left some moderators 'addicted' to graphic content, leading some to amass troublingly offensive media in personal archives.

In some cases, sources say, the work has even influenced moderators' political and social views, mainly due to their frequent consumption of the fake news and hateful language that engulf the platform.

A report from The Guardian further highlights the dangers facing moderators tasked with ridding the platform of toxic content. File photo


Sources in the report say their work also often involved poring over Facebook's private messaging app in an effort to prevent the sexual abuse of children – an algorithm used by Facebook flags conversations it suspects involve sexual exploitation.

'You understand something more of this kind of dystopian society we are building every day,' said one moderator quoted by The Guardian, who asked to remain anonymous after signing a confidentiality agreement.

'We have rich white men from the US who write to children from the Philippines … they are trying to get sexual photos in exchange for $10 or $20.'

Contractors also complained about the huge volume of content they were required to review.

According to The Guardian, moderators were told to meet a benchmark of assessing 1,000 items during an eight-hour shift, which works out to roughly one item every 30 seconds.

That number has since been reduced to a quota of between 400 and 500 items per day, following a blockbuster report from The Verge in February that first described Facebook's third-party moderation centers in detail.


In that earlier report and a subsequent follow-up in June, The Verge's Casey Newton detailed conditions in Tampa and Arizona, where employees of Facebook contractor Cognizant similarly faced tough working conditions and hours, often at the expense of their own mental health.

Companies contracted by Facebook have come under fire for their treatment of moderators in the US following Casey Newton's reports for The Verge.


A particularly distressing incident described in Newton's reporting was the death in the office of one of the company's employees, Keith Utley. Employees interviewed by The Verge say the stress of the job contributed to the heart attack that killed him.

Both reports also highlighted employees' criticism of on-site mental health professionals, whose ability to deal with worker stress and anxiety they said was insufficient.


While Facebook's moderators play a crucial role in ridding the platform of toxic content, recent reports have emphasized the human toll the job can, and does, take on those who perform it.

Sources quoted by The Guardian suggest that Facebook should hire more employees to help reduce the harmful effects on moderators.

HOW DO TECH FIRMS FIGHT HATE SPEECH?

Facebook, Twitter and Google's YouTube have significantly accelerated their removal of online hate speech, according to recent EU figures.

Microsoft, Twitter, Facebook and YouTube signed a code of conduct with the EU in May 2016, committing to review most complaints within 24 hours.

Now the companies assess more than two-thirds of the complaints within 24 hours.

Advertisements

Of the hate speech reported to the companies, almost half was found on Facebook, the figures show, while 24 percent was on YouTube and 26 percent on Twitter.

The most commonly cited ground for hate speech identified by the Commission was ethnic origin, followed by anti-Muslim hatred and xenophobia, including hatred against migrants and refugees.
