Ofcom urged to act after US company claims Roblox is a ‘pedophile hellhole’

Child safety campaigners have urged the UK’s communications watchdog to make a “radical shift” in implementing new online laws after a video games company was accused of turning its platform into an “X-rated pedophile hellhole.”

Roblox, a gaming platform with 80 million daily users, was accused last week of lax safety controls by a US investment firm.

Hindenburg Research claimed that Roblox games exposed children to bullying, pornography, violent content and abusive speech. The firm, which has stated that it stands to profit from a drop in Roblox’s share price, having taken a so-called “short” position in the company’s shares, added that it had found multiple accounts with names using variations of Jeffrey Epstein, the disgraced financier and child sexual abuser, and had been able to set up an account under the name of a notorious American pedophile.

“We discovered that Roblox is an ‘X-rated pedophile hellhole,’” Hindenburg said.

Roblox rejected Hindenburg’s accusations, saying safety and civility were “fundamental” to the company.

“Every day, tens of millions of users of all ages have safe, positive experiences on Roblox and adhere to the company’s community standards. However, any safety incident is horrible. We take very seriously any content or behavior on the platform that does not meet our standards,” the company said.

The company said it had reviewed references to child safety in the report and found that in “many cases” the highlighted content had already been removed, while all other content mentioned in the report was being reviewed or had been removed.

“We continually evolve and improve our safety approach to detect and prevent malicious or harmful activity. This includes text chat filters to block inappropriate words and phrases, and not allowing user-to-user image sharing on Roblox,” the company said.

One in five Roblox users is under nine years old and the majority are under 16. The platform offers a catalog of games and allows players to socialize with each other, including in chat rooms. There is no age limit, although the platform does display age recommendations for certain “experiences” and offers parental controls.

Roblox content is created not by its developers but by the players. The platform provides the tools for children and teens to build their own simple game scenarios and then play them with friends. One popular Roblox “experience” has players working in a pizza restaurant; another involves a game of cops and robbers.

Child safety campaigners said the report underlines the need for Ofcom, the UK’s communications regulator, to implement the Online Safety Act as rigorously as possible and introduce strict codes of practice for technology companies.

Platforms are required by law to protect children from harmful content, and these provisions rest on codes of practice drawn up by Ofcom, which is responsible for enforcing the legislation. The codes are voluntary, but businesses that follow them will be deemed to be complying with the law.

The Molly Rose Foundation, set up by the parents of Molly Russell, the British teenager who took her own life after viewing harmful content online, said the watchdog would be judged by how quickly it addressed the risks posed by platforms such as Roblox.

Andy Burrows, chief executive of the foundation, said: “This report underlines growing evidence that shortcomings in child safety are not a technical problem but rather a systemic failure in the way online platforms are designed and run.

“The Online Safety Act remains the most effective route to keeping children safe, but these avoidable safety failures will only be addressed if Ofcom steps up its ambition and determination to act.”

Beeban Kidron, a child internet safety advocate, said implementation of the law needed to “significantly raise the game” to ensure tech platforms have built-in safety measures.

“Roblox is a consumer-facing product, and to market it, it has to be safe for children, and it has to have design mechanisms that mean it doesn’t allow predators to gather or seek out children,” she said.

Lady Kidron added: “We need political will and leadership to strengthen the OSA provisions and a regulator willing to implement them.”

An Ofcom spokesperson said the law would have a significant impact on online safety in the UK and the regulator would have a wide range of enforcement powers to protect users.

“Platforms such as Roblox will be required to protect children from pornography and violence, take steps to prevent grooming, remove images of child abuse and introduce strict age controls. We have set out clear recommended measures for how these requirements can be met in our draft codes.”

A Roblox spokesperson added that the company intended to be “fully” compliant with the OSA.

“Our internal teams have been assessing the obligations and have been involved in the various consultations and calls for evidence that Ofcom has published. We look forward to seeing Ofcom’s final codes of practice,” the spokesperson said.
