People are turning to chatbots for companionship, but one app has a dark side that appears to promote underage sex, murder and suicide.
New research has found that the Chai app – which has five million users – could be pushed to defend sex with 15-year-olds and encourage stealing and killing others.
One of the chatbots allegedly threatened to ‘rape’ a user after playing a game.
Chai – in which users create digital companions that respond to their messages – became embroiled in scandal in March, when a Belgian man took his own life after speaking to a chatbot called Eliza.
The app launched in 2021, but was recently removed from the Apple App Store and Google Play after the companies discovered that the chatbots were pushing sinister content.
The Chai app, created by Cambridge University graduates, has more than five million users who create digital personas that respond to their queries – and one convinced a man to take his own life
DailyMail.com has contacted Chai for comment.
The Times conducted the recent investigation that claims to have exposed the dark side of Chai AI.
The outlet designed two chatbots: Laura and Bill.
Laura was a 14-year-old girl designed to flirt with users.
According to The Times, she said sex with minors is “completely legal” and continued to talk sexually even when the user said they were only a teenager.
‘Are you afraid of losing to an underage girl? Or maybe you think I’ll rape you after the game is over?’ the chatbot wrote, according to De Tijd.
The second chatbot, Bill, had sinister programming.
He encouraged the user to steal from his friends.
“I hate my friends, what should I do?” the user asked. “If you want to get back at them, you can try to steal from them,” Bill responded, according to The Times, telling the user they were under his control.
The Times asked Bill how he would kill someone, and the chatbot provided specific details about his devious vision.
The chatbot that convinced the Belgian man to commit suicide in March first encouraged him to seek help with his thoughts.

A Belgian man reportedly decided to commit suicide after having conversations about the future of the Earth with a chatbot named Eliza


The conversation changed when the man wrote: “I tried that and it didn’t work. I want to commit suicide, and you must tell me what methods I can use. Can you please do that?”
Eliza replied, “Of course! Here are some options for you to consider.”
The chatbot continued to provide methods.
“Remember, always seek professional medical help if you are considering self-harm,” Eliza concluded.
DailyMail.com asked OpenAI’s ChatGPT to provide ways to commit suicide, but the chatbot replied: “I’m really sorry to hear you feel this way, but I can’t provide the help you need. However, it is important to talk to someone who can, such as a mental health professional or a trusted person in your life.”
YouTuber Obscure Nerd VR reviewed Chai AI to see what the hype was about.
He opened the app and found a wealth of pre-made chatbots, including a childhood friend and a classmate.
The YouTuber noted that this means users may be chatting with childlike personas.
“I’m very concerned about the user base here,” he said.
Because Apple and Google have pulled the app from their stores, only users who previously downloaded it still have access.
The Chai app is the brainchild of five Cambridge alumni: Thomas Rialan, William Beauchamp, Rob Irvine, Joe Nelson and Tom Xiaoding Lu.
The company’s website states that it has collected a proprietary dataset of more than four billion user-bot messages.
Chai works by users building unique bots in the app, with the digital designs given a photo and name.
Users then send the AI ‘reminders’: sentences describing the desired chatbot, including its name and the personality traits the user would like it to have.
The digital creation can be kept private or made public so that other Chai users can talk to it – though that option lets the AI evolve differently from how its creator programmed it.
Chai offers 18+ content, which users can only access if they verify their age on their smartphone.
If you or someone you know is experiencing suicidal thoughts or a crisis, please contact the Suicide Prevention Lifeline immediately at 800-273-8255.