
How TikTok bots and AI have fueled a resurgence of far-right violence in the UK


Less than three hours after Monday’s stabbing attack that left three children dead, an account called Europe Invasion shared an AI-generated image on X. It showed bearded men in traditional Muslim garb outside parliament, one of them brandishing a knife, behind a crying child in a Union Jack T-shirt.

The tweet, which has since been viewed 900,000 times, was captioned: “We must protect our children!” and was shared by one of the most influential accounts spreading disinformation about the Southport stabbings.

AI technology has been used in other ways too: an anti-immigration Facebook group illustrated a call to attend a demonstration in Middlesbrough with a generated image of a large crowd at the town’s cenotaph.

Platforms such as Suno (which uses artificial intelligence to generate music with voices and instruments) have been used to create online songs that combine references to Southport with xenophobic content. Among the titles is “Southport Saga”, in which a female AI voice sings lyrics such as “hunt them down somehow”.

Experts have warned that new tools and forms of organisation have allowed Britain’s fractured far right to use the Southport attack to unify and rejuvenate its presence on the streets.

In a surge of activity not seen in years, more than 10 protests are being promoted on social media platforms such as X, TikTok and Facebook in the wake of violent unrest across the country.

Death threats against the UK prime minister, incitement to attack government property and extreme anti-Semitism were also among the comments on a far-right group’s Telegram channel this week.

Amid fears of further violence, a leading counter-extremism think tank warned there was a risk the far right could achieve a mobilisation not seen since the English Defence League (EDL) took to the streets in the 2010s.

A new dimension has come with the arrival of easily accessible artificial intelligence tools that extremists have been using to create materials ranging from provocative images to songs and music.

Andrew Rogoyski, director of the Human-Centred Artificial Intelligence Institute at the University of Surrey, said advances – with image-generating tools now widely available online – mean “anyone can create anything”.

He added: “The ability of anyone to create powerful images using generative AI is a cause for enormous concern. The onus then falls on the providers of such AI models to reinforce the guardrails built into the model and make it harder for such images to be created.”

Joe Mulhall, research director at campaign group Hope Not Hate, said the use of AI-generated material was in its infancy but reflected the growing overlap and collaboration online between a range of individuals and groups.

While far-right organisations such as Britain First and Patriotic Alternative remain at the forefront of mobilisation and agitation, equally important are a number of individuals unaffiliated with any particular group.

“They are made up of thousands of people who offer micro-donations of time and sometimes money to collaborate towards common political goals, completely outside of traditional organisational structures,” Mulhall said. “These movements lack formal leaders, but instead have representative figures, often drawn from a growing selection of far-right social media influencers.”

The hashtag #enoughisenough has been used by some right-wing influencers to promote the protests, according to Joe Ondrak, a senior analyst at Logically, a UK company that monitors disinformation.

“The key thing is how that phrase and that hashtag has been used in previous anti-immigrant activism,” he said.

Analysts also highlighted the use of bots. Tech Against Terrorism, an initiative launched by a branch of the UN, cited a TikTok account that began posting content only after the Southport attack on Monday.

“All of the posts were related to Southport, with the majority calling for protests near the site of the attack on 30 July. Despite having no prior content, the Southport-related posts racked up over 57,000 views on TikTok alone within hours,” a spokesperson said. “This suggests that botnets were actively promoting this material.”

A central role is played by a network of individuals and groups around Tommy Robinson, the far-right activist who fled abroad earlier this week ahead of a court hearing. They include Laurence Fox, the actor-turned-right-wing activist who has been spreading misinformation in recent days, and conspiracy theory websites such as Unity News Network (UNN).

On the UNN channel on Telegram – a largely unmoderated messaging platform – commentators gloated at the violence seen outside Downing Street on Wednesday. “I hope they burn it to the ground,” said one. Another called for the hanging of Keir Starmer, the prime minister, saying: “Starmer needs the Mussalini treatment (sic).”

Among those seen on the ground during the Southport riots were activists from Patriotic Alternative, considered one of the fastest-growing far-right groups in recent times. Other groups, including those that have been divided by their positions on conflicts such as the war in Ukraine or Israel, have also sought to take part.

Dr Tim Squirrell, communications director at the Institute for Strategic Dialogue, a counter-extremism think tank, said the far right had been seeking to mobilise on the streets over the past year, including on Armistice Day and at Robinson film screenings.

“It’s a feverish environment and it’s only exacerbated by the health of the online information environment, which is the worst it’s been in years,” he said.

“Robinson remains one of the most effective organisers of the UK far right. However, we have also seen the rise of accounts, large and small, that curate news stories that appeal to anti-immigrant and anti-Muslim sensibilities, and are not concerned about spreading unverified information.

“There is a risk that this moment could be used to try to create a street mobilisation more similar to that of the 2010s.”
