Today I heard about Intel’s AI sliders that filter abuse from online gaming


Last month, during its virtual GDC presentation, Intel announced Bleep, a new AI-powered tool it hopes will reduce the amount of toxicity gamers experience in voice chat. According to Intel, the app uses AI to detect and redact audio based on user preferences. The filter works on incoming audio, acting as an additional, user-controlled layer of moderation on top of whatever a platform or service already offers.

It’s a noble effort, but there’s something eerily funny about Bleep’s interface, which offers a detailed breakdown of all the different categories of abuse people might encounter online, paired with sliders to control how much of each users want to hear. Categories range from ‘Aggression’ to ‘LGBTQ+ Hate’, ‘Misogyny’, ‘Racism and Xenophobia’, and ‘White Nationalism’. There is even a toggle for the ‘N-word’. Bleep’s page notes that the public beta has yet to launch, so all of this is subject to change.

Filters include ‘Aggression’, ‘Misogyny’ …
Credit: Intel

… and a switch for the ‘N-word’.
Credit: Intel

With most of these categories, Bleep seems to give users a choice: do you want none, some, most, or all of this offensive language to be filtered out? Like choosing from a buffet of toxic internet slurry, Intel’s interface gives players the option to sprinkle a light dose of aggression or swearing into their online gaming.
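The per-category choice described above can be pictured as a simple threshold rule. The sketch below is purely illustrative: the category names and the none/some/most/all levels come from Bleep's interface as described in this article, but the class, the severity scores, and the filtering rule itself are invented assumptions, not Intel's actual implementation.

```python
# Hypothetical sketch of Bleep-style per-category filtering.
# Only the category names and the four slider positions are from
# the article; everything else here is an invented illustration.
from dataclasses import dataclass, field

# The four slider positions shown in Bleep's UI, mapped to a
# made-up filtering strength between 0 and 1.
LEVELS = {"none": 0.0, "some": 0.33, "most": 0.66, "all": 1.0}

@dataclass
class ToxicityFilter:
    # Per-category slider settings, e.g. {"Misogyny": "all"}.
    # Unlisted categories default to "none" (nothing filtered).
    settings: dict = field(default_factory=dict)

    def should_bleep(self, category: str, severity: float) -> bool:
        """Decide whether a detected utterance gets muted.

        Hypothetical rule: a higher slider level lowers the
        severity threshold, so "all" filters everything in the
        category and "none" filters nothing.
        """
        level = LEVELS[self.settings.get(category, "none")]
        return level > 0.0 and severity >= 1.0 - level

f = ToxicityFilter({"Racism and Xenophobia": "all", "Aggression": "some"})
print(f.should_bleep("Racism and Xenophobia", 0.1))  # True: "all" catches everything
print(f.should_bleep("Aggression", 0.5))             # False: below the "some" threshold
```

In other words, the interface reduces moderation to a per-category dial rather than a single on/off switch, which is exactly what makes the buffet comparison land.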

Bleep has been a few years in the making – PCMag notes that Intel first talked about this initiative at GDC 2019 – and the company is partnering with AI moderation specialist Spirit AI on the software. But moderating online spaces with the help of artificial intelligence is not easy, as platforms such as Facebook and YouTube have shown. While automated systems can flag outright slurs, they often fail to account for the context and nuance of certain insults and threats. Online toxicity comes in many, constantly evolving forms that can be difficult for even the most advanced AI moderation systems to recognize.

“While we recognize that solutions such as Bleep do not erase the problem, we believe it is a step in the right direction, giving gamers a tool to take control of their experience,” said Intel’s Roger Chandler during the GDC demonstration. Intel says it hopes to release Bleep later this year, adding that the technology relies on hardware-accelerated AI speech detection, which suggests the software may depend on Intel hardware to run.