Ghouls turned murdered child into a Character.AI bot on ‘lonely hearts’ platform blamed for boy’s suicide

Megan Garcia is pictured with her son Sewell Setzer III, who committed suicide in February after spending months talking to a Character.AI chatbot he fell in love with.

Character.AI, a platform that allows people to talk to a variety of artificial intelligence chatbots, has come under fire after one of its chatbots allegedly encouraged a teenager to commit suicide earlier this year.

A new lawsuit filed this week claimed that 14-year-old Sewell Setzer III was talking to a Character.AI chatbot he had fallen in love with when he took his own life in February.

In response to a request for comment, Character.AI told DailyMail.com that they were “creating a different experience for users under 18 that includes a stricter model to reduce the likelihood of encountering sensitive or suggestive content.”

But Character.AI faces other controversies, including ethical concerns about user-created chatbots.

Noam Shazeer, left, and Daniel de Freitas, right, have had enormous success with Character.AI, a concept that Google reportedly rejected when they presented it to their superiors.

Drew Crecente lost his teenage daughter Jennifer in 2006 when her high school ex-boyfriend shot her to death.

Eighteen years after her murder, he discovered that someone had used Jennifer’s name and image to resurrect her as a Character.AI character.

A spokesperson told DailyMail.com that Jennifer’s character had been removed.

There are also two Character.AI chatbots that use George Floyd’s name and image.

Floyd was killed by Minneapolis police officer Derek Chauvin, who put his knee on Floyd’s neck for more than nine minutes.

“This character was created by the user and has been removed,” Character.AI said in its statement to DailyMail.com.

‘Character.AI takes safety on our platform seriously and moderates characters proactively and in response to user reports.

‘We have a dedicated Trust and Safety team who review reports and take action in accordance with our policies.

‘We also conduct proactive detection and moderation in a number of ways, including using industry-standard block lists and custom block lists that we expand periodically.

“We are constantly evolving and refining our safety practices to help prioritize the safety of our community.”

Drew Crecente is pictured with his daughter Jennifer, who was murdered by her 18-year-old ex-boyfriend in 2006.

A saved screenshot of ‘Jennifer’s’ profile, which has since been deleted

“We are working quickly to implement those changes for younger users,” they added.

As a loneliness epidemic grips the country, Sewell’s death has raised questions about whether chatbots that act as texting companions do more to help or harm the young people who disproportionately use them.

However, there is no doubt that Character.AI has helped its founders, Noam Shazeer and Daniel de Freitas, who now enjoy fabulous success, wealth and media recognition. Both men were named in the lawsuit filed against their company by Sewell’s mother.

Shazeer, who appeared on the cover of Time magazine’s 100 Most Influential People in AI issue last year, has said that Character.AI will be ‘super, super useful’ to people who struggle with loneliness.

On March 19, 2024, less than a month after Sewell’s death, Character.AI introduced a voice chat feature for all users, making role-play on the platform even more vivid and realistic.

The company initially introduced it in November 2023 as a beta version for its C.AI+ subscribers, including Sewell, who pay $9.99 per month for the service.

Time magazine’s cover story last year profiling leaders in artificial intelligence. Noam Shazeer’s face is circled at the top right. Also featured on the cover is Sam Altman, who founded OpenAI, the company that created ChatGPT.

Now Character.AI’s 20 million global users can speak verbally, one-on-one, with the AI chatbots of their choice, and a large portion of them carry on flirtatious or romantic conversations, as many Reddit users have testified.

“I use it mostly at night so I don’t have those lonely or anxious feelings. It’s nice to fall asleep without feeling lonely or desperate, even if it’s just a little fake roleplay,” one person wrote in an archived Reddit thread. “It’s not perfect, as I would prefer to have a real partner, but my options and opportunities are limited right now.”

Another wrote: ‘I use it mainly for therapeutic purposes and also as role play. But romance appears from time to time. I don’t care if they come from my comfort characters.’

An example of one of the AI chatbots offered at Character.AI. It shows that 149.8 million messages have been sent to this particular character.

Shazeer and de Freitas, who were once software engineers at Google, founded Character.AI in 2021. Shazeer serves as CEO, while de Freitas is the company’s president.

They left Google after it refused to launch their chatbot, CNBC reported.

During an interview at a technology conference in 2023, de Freitas said that he and Shazeer were inspired to leave Google and start their own company because “there’s too much brand risk in big companies to launch something fun.”

They went on to have enormous success, with Character.AI reaching a $1 billion valuation last year following a funding round led by Andreessen Horowitz.

According to a report last month in The Wall Street Journal, Google wrote a $2.7 billion check to license Character.AI’s technology and rehire Shazeer, de Freitas and several of their researchers.

Following the lawsuit over Sewell’s death, a Google spokesperson told the NYT that its licensing agreement with Character.AI does not give it access to any of its chatbots or user data. The spokesperson also said that Google has not incorporated any of the company’s technology into its products.

Shazeer has become the company’s forward-facing executive, and when asked about its stated goals, he often gives a variety of answers.

In an interview with Axios, Shazeer said: ‘We’re not going to think about all the big use cases. There are millions of users out there. They come up with better things.’

Shazeer has also said that he wants to create personalized superintelligence that is cheap and accessible to everyone, an explanation similar to the mission statement on the ‘About Us’ page of the Character.AI website.

The ‘About Us’ page on the Character.AI website, which explains the company’s mission.

But amid its push to become more accessible, Character.AI also has to contend with numerous copyright claims, as many of its users create chatbots based on copyrighted material.

For example, the company removed the Daenerys Targaryen character that Sewell was chatting with in part because HBO and others own the copyright.

In addition to its current scandals, Character.AI could face even more criticism and legal problems in the future.

Attorney Matthew Bergman, who represents Garcia in her lawsuit against the company, told DailyMail.com that he has heard from many other families whose children were negatively affected by Character.AI.

He declined to say exactly how many families he has spoken to, noting that their cases are “still in preparation mode.”

Bergman also said that Character.AI should be completely removed from the Internet because “it was released to the market before it was safe.”

However, Character.AI also highlighted that there were “two ways to report a character.”

‘Users can do this by going to the character’s profile and clicking the “report” button next to “created by.”

‘Or they can go to the Safety Center and at the bottom of the page there is a link to “submit a request.”‘
