
Facebook says it is tightening its policies on self-harm and suicide content in an effort to protect the mental health of its users

  • Facebook says it is going to remove content that depicts self-harm
  • That includes 'graphic cutting images' which it says can trigger vulnerable users
  • Instagram will also remove such content from its Explore tab
  • The company is looking for a mental health expert to join its safety team

Facebook says it is tightening its rules on suicide and self-harm content in an effort to make the platform and its sister site, Instagram, safer.


In a blog post, Facebook announced several policy changes that will affect how content related to self-harm and suicide is treated once it is posted on its platform.

The company says it will 'no longer allow graphic cutting images' in order to avoid unintentionally promoting or triggering self-harm.

That policy applies 'even when someone is seeking support or expressing themselves to aid their recovery,' Facebook said in the blog post.

Facebook said it will begin removing self-harm content in an effort to avoid triggering users who may be struggling with similar issues


The new policy will also cover images of healed self-inflicted cuts, which the company says it will place behind a 'sensitivity screen' that users must click through before viewing the underlying content.

Similarly, Instagram will begin demoting content that depicts self-harm, removing it from the Explore tab and keeping it out of the company's recommendation algorithms.

To help promote healthy dialogue around suicide and self-harm, Facebook says it will also direct users who search for content related to those subjects to guidelines developed by Orygen, the National Centre of Excellence in Youth Mental Health.

The guidelines are intended to provide 'support to those who may be responding to suicidal content posted by others, or for those who may want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors,' Facebook said.

According to Facebook, the changes are the result of input from mental health professionals and suicide prevention experts.


In February, Facebook pledged to crack down on graphic self-harm content, citing experts from ten countries who advised the company to allow 'people to share admissions of self-harm and suicidal thoughts, but not to allow people to share content that promotes it'.

To oversee these efforts, Facebook said it will also hire a health and well-being expert to join its safety team.

That person will be responsible for coordinating with outside experts and organizations to address issues related to mental health, including 'suicide, self-harm, eating disorders, depression, anxiety, addiction, nutrition, healthy habits, vaccinations' and more.

That coordination will apparently go both ways: for the first time, Facebook said, the platform will share data with academics about how its users talk about suicide.

On Instagram, photos of self-harm are no longer promoted on the Explore tab of the platform in an effort to limit their reach. Stock image


Researchers will be given access to a tool called CrowdTangle, which can be used to find and sort specific content on the platform.

Although the tool has primarily been used by publishers and media companies to identify trends on Facebook, the company says it will now give access to two unnamed researchers to aid their suicide prevention efforts.

Facebook and other major platforms such as YouTube and Twitter have faced unprecedented pressure from legislators and concerned users to crack down on toxic content on their platforms.

This month, YouTube said it had banned more than 17,000 accounts for distributing 'hateful content', while Twitter has made a number of new policy changes around what it considers a violation of its user agreement.

WHERE TO FIND HELP


For confidential assistance, call the National Suicide Prevention Lifeline at 1-800-273-8255 or go to http://www.suicidepreventionlifeline.org/

For confidential support on suicide matters, call the Samaritans on 08457 90 90 90, visit a local Samaritans branch, or go to http://www.samaritans.org
