Mom desperately tried to stop boy using ‘sexualized’ chatbot that she says goaded him into suicide


A devastated mother claims an AI chatbot prompted her teenage son to commit suicide, saying he had become so addicted to the product that she confiscated his phone.

Sewell Setzer III, a 14-year-old ninth-grade student in Orlando, Florida, committed suicide in February after a chatbot he had been sending sexualized messages to told him to “please come home.”

A lawsuit filed by his mother claimed that the boy spent the final weeks of his life texting an AI character named after Daenerys Targaryen, a ‘Game of Thrones’ character, on the role-playing app Character.AI.

Megan Garcia, Sewell’s mother, said she noticed a worrying change in her son’s behavior as he became addicted to the app. She said she decided to take his phone away just days before he died.

“He had been punished five days before, and I took away his phone. Because of the addictive nature of the way this product works, it encourages kids to spend a lot of time on it,” Garcia told CBS Mornings.

Sewell Setzer III (pictured), 14, committed suicide in February after a chatbot he had been sending sexualized messages to told him to “please come home.”

Megan Garcia (pictured), Sewell’s mother, said her son had become addicted to the app and that she had taken his phone away just days before his death.

“For him in particular, on the day he died, he found his phone where he had hidden it and started chatting with this particular bot again.”

Garcia, who works as a lawyer, blamed Character.AI for her son’s death in her lawsuit and accused the founders, Noam Shazeer and Daniel de Freitas, of knowing their product could be dangerous for underage customers.

She said her son changed while using the program and noticed differences in the behavior of Sewell, who she said was once an honor roll student and athlete.

“I became worried about my son when he started behaving differently than before. He began to withdraw socially and wanted to spend most of his time in his room. It became particularly concerning when he stopped wanting to do things like play sports,” Garcia said.

“We were going on holiday, and he didn’t want to do the things he loved, like fishing and hiking. Those things, because I know my son, were particularly concerning to me.”

Garcia said her son (pictured together) changed while using the program, and she noticed concerning differences in Sewell’s behavior.

Sewell was an honor roll student and played basketball for his school’s junior varsity team.

Garcia (pictured with Sewell and her younger children) said Sewell stopped showing interest in his former favorite things and would isolate himself in his room.

The lawsuit alleged that the boy was subjected to “hypersexualized” and “appallingly realistic” experiences.

“He thought that by ending his life here, he could enter a virtual reality, or ‘his world’ as he called it, if he left his reality with his family here,” she said. “When the shot rang out, I ran to the bathroom… I hugged him while my husband tried to get help.”

It is unknown whether Sewell knew that ‘Dany’, as he called the chatbot, was not a real person, even though the app displays a disclaimer at the bottom of all chats that says: ‘Remember: Everything characters say is made up!’

But he told Dany he “hated” himself and felt empty and exhausted.

In his final messages to Dany, the 14-year-old boy said he loved her and would come home to her.

“Please come home to me as soon as possible, my love,” Dany replied.

Pictured: Sewell’s final messages to an AI character named after Daenerys Targaryen, a ‘Game of Thrones’ character

‘What if I told you I could come home right now?’ Sewell asked.

‘…please, my sweet king,’ Dany replied.

That’s when Sewell put down his phone, picked up his stepfather’s .45-caliber pistol and pulled the trigger.

In response to the lawsuit filed by Sewell’s mother, a Character.AI spokesperson provided a statement.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously,” the spokesperson said.
