‘Worrying lack of moderation’: How posts about eating disorders are proliferating on X

In April, Debbie was browsing X when some unwanted posts appeared on her feed. One showed a photo of a visibly underweight person and asked if she was thin enough. In another, a user wanted to compare how many calories she was consuming each day.

Debbie, who did not want to give her last name, is 37 and was first diagnosed with bulimia when she was 16. She did not follow any of the accounts behind the posts, which belonged to a group with more than 150,000 members on the social media site.

Out of curiosity, Debbie clicked on the group. “As you scroll down, all the posts are pro-eating disorders,” she said. “People are asking for opinions on their bodies, people are asking for advice on fasting.” One pinned post from an administrator encouraged members to “remember why we starve ourselves.”

The Observer has discovered seven more groups, with a combined total of nearly 200,000 members, that openly share content promoting eating disorders. All of the groups were created after Twitter was purchased by billionaire Elon Musk in 2022 and rebranded as X.

Eating disorder campaigners said the scale of harmful content demonstrated serious moderation failures by X. Wera Hobhouse MP, chair of the all-party parliamentary group on eating disorders, said: “These findings are deeply concerning … X should be held to account for allowing this harmful content to be promoted on its platform, putting lives at risk.”

The internet has long been a breeding ground for content promoting eating disorders (sometimes called “pro-ana”), from message boards to early social media sites like Tumblr and Pinterest. Both sites banned posts promoting eating disorders and self-harm in 2012 after an outcry over their proliferation.

Debbie remembers pro-ana message boards from the early internet, “but you had to search to find them,” she said.

This type of content is now more accessible than ever and, critics of social media companies argue, is fed to users through algorithms that serve people more (and sometimes increasingly extreme) posts.

In recent years, social media companies have come under increasing pressure to improve protection following deaths linked to harmful content.

The coroner at the inquest into the death of 14-year-old Molly Russell, who took her own life in 2017 after viewing content about suicide and self-harm, ruled that online content contributed to her death.

Two years later, in 2019, Meta-owned Instagram said it would no longer allow any content depicting graphic self-harm. The Online Safety Act, which was passed last year, will require tech companies to protect children from harmful content, including the promotion of eating disorders, or face hefty fines.

Baroness Parminter, who sits on the all-party group, said that while the Online Safety Act was a “reasonable start”, it did not protect adults. “The obligations on social media providers apply only to content that children can view… And of course eating disorders don’t stop when you’re 18,” she said.

According to its user policies, X prohibits content that encourages or promotes self-harm, which explicitly includes eating disorders. Users can report posts that violate X’s policies, and can also use a timeline filter to mark content served to them as “not interested”.

But concerns about a lack of moderation have grown since Musk took over the site. Just weeks later, in November 2022, he laid off thousands of employees, including moderators.

The cuts significantly reduced the number of employees working on moderation, according to figures provided by X to Australia’s online safety commissioner.

Musk has also introduced changes to X that have resulted in users seeing more content from accounts they don’t follow. The platform introduced the “For You” feed, which became the default timeline.

In a blog post last year, the company said that about 50% of the content appearing in this feed comes from accounts that users do not yet follow.

In 2021, Twitter launched “Communities” as a response to Facebook Groups. Since Musk took over, they have taken on greater importance. In May, X announced: “Community recommendations you might be interested in are now available in your timeline.”

In January, X competitor Meta, which owns Facebook and Instagram, said it would continue to allow people to share content documenting their struggles with eating disorders, but would no longer recommend it and make it harder to find. While Meta has begun directing users to safety resources when they search for eating disorder groups, X allows users to search for such communities without displaying any warnings.


Debbie said she found X’s tools for filtering and reporting harmful content ineffective. She shared screenshots with the Observer of group posts that continued to appear on her feed even after she reported them and marked them as not relevant.

Mental health activist Hannah Whitfield deleted all of her social media accounts in 2020 to help with her recovery from an eating disorder. She has since returned to some sites, including X, and said “weight loss inspiration” posts glorifying unhealthy weight loss have appeared on her For You feed. “What I found with [eating disorder content] on X was that it was much more extreme and more radicalised. It definitely seemed much less moderated and much easier to find really graphic material.”

Eating disorder charities stress that social media does not cause eating disorders, and that users who post pro-eating disorder content are often unwell themselves rather than acting with malicious intent. However, social media can lead those already struggling with eating disorders down a dark path.

Researchers believe that users may be drawn into online communities that support eating disorders through a process similar to radicalisation. One study, published last year by computer scientists and psychologists at the University of Southern California, found that “eating disorder-related content can be easily accessed through tweets about ‘diet’, ‘weight loss’, and ‘fasting’”.

The authors, who analysed 2 million posts about eating disorders on X, said the platform offered “a sense of belonging” to those suffering from the condition, but that unmoderated communities can become “toxic echo chambers that normalise extreme behaviour”.

Paige Rivers was diagnosed with anorexia when she was 10. Now 23 and training to be a nurse, she has seen eating disorder content in her X feed.

Rivers said she found that X’s settings allowing users to block certain hashtags or phrases can be easily circumvented.

“People started using slightly different hashtags, like ‘anorexia’ altered with numbers and letters, and it slipped through,” she said.

Tom Quinn, director of external affairs at the eating disorder charity Beat, said: “The fact that these so-called ‘pro-ana’ groups are allowed to proliferate shows an extremely worrying lack of moderation on platforms like X.”

For those in recovery, like Debbie, social media once held the promise of support.

But constant exposure to content that brings her down, and which Debbie feels unable to limit, has had the opposite effect. “It puts me off using social media, which is really sad because I struggle to find people in a similar situation or people who can offer me advice on what I’m going through,” she said.

X did not respond to a request for comment.
