
TikTok has a Nazi problem

ISD reported this account, along with 49 others, to TikTok in June for violating the platform’s policies against hate speech, promoting violence against protected groups, promoting hateful ideologies, celebrating violent extremists, and Holocaust denial. In every case, TikTok initially found no violations, and all of the accounts were allowed to remain active.

A month later, TikTok had banned 23 of the accounts, indicating that the platform is removing at least some content and channels that violate its rules. Before they were removed, the 23 banned accounts had accumulated at least 2 million views.

The researchers also created new TikTok accounts to understand how TikTok’s powerful algorithm promotes Nazi content to new users.

Using an account created in late May, the researchers watched 10 videos from the network of pro-Nazi users, occasionally clicking into the comments sections but not engaging in any real interaction, such as liking, commenting, or favoriting. The researchers also viewed 10 pro-Nazi accounts. When the researchers moved to the For You feed within the app, it took just three videos for the algorithm to suggest a video featuring a World War II Nazi soldier superimposed over a graph of murder rates in the United States, with perpetrators broken down by race. Later, a video of an AI-translated Hitler speech superimposed over a recruiting poster for a white nationalist group appeared.

Another account created by ISD researchers was served even more extremist content in its main feed, with 70 percent of videos coming from self-identified Nazis or featuring Nazi propaganda. After the account followed a number of pro-Nazi accounts to access content on channels set to private, TikTok’s algorithm began promoting other Nazi accounts to follow. The top 10 accounts TikTok recommended to this account all used Nazi symbology or keywords in their usernames or profile photos, or featured Nazi propaganda in their videos.

“This is not at all surprising,” says Abbie Richards, a disinformation researcher specializing in TikTok. “These are things we find over and over again. I have certainly found them in my research.”

Richards wrote about white supremacist content and militant accelerationism on the platform in 2022, including the case of neo-Nazi Paul Miller, who, while serving a 41-month sentence on firearms charges, appeared in a TikTok video that racked up more than 5 million views and 700,000 likes during the three months it was on the platform before being removed.

Marcus Bösch, a researcher at the University of Hamburg who monitors TikTok, told WIRED that the report’s findings “are not a big surprise” and that he’s not hopeful that TikTok can do anything to fix the problem.

“I’m not sure where exactly the problem lies,” says Bösch. “TikTok says it has around 40,000 content moderators, and it should be easy to spot such obvious policy violations. However, due to the sheer volume [of content] and the ability of malicious actors to adapt quickly, I am convinced that the entire disinformation problem cannot be definitively solved, neither with artificial intelligence nor with more moderators.”

TikTok says it has completed a mentoring program with Tech Against Terrorism, a group that seeks to disrupt the online activity of terrorists and helps TikTok identify online threats.

“Despite proactive measures taken, TikTok remains a target for exploitation by extremist groups as its popularity grows,” Adam Hadley, executive director of Tech Against Terrorism, tells WIRED. “ISD’s study shows that a small number of violent extremists can wreak havoc on large platforms due to adversarial asymmetry. This report therefore underscores the need for cross-platform threat intelligence supported by better AI-powered content moderation. The report also reminds us that Telegram must also be held accountable for its role in the online extremist ecosystem.”

As Hadley points out, the report’s findings show that there are significant gaps in the company’s current policies.

“I’ve always described TikTok, in regards to far-right use, as a messaging platform,” says Richards. “More than anything, it’s about repetition. It’s about being exposed to the same hateful narrative over and over again, because at a certain point you start to believe things after seeing them enough, and they really start to influence your worldview.”