
‘I felt like I was talking to him’: Are AI personas of the dead a blessing or a curse?


When Christi Angel first spoke to a chatbot posing as her deceased partner, Cameroon, the encounter seemed surreal and “very strange.”

“Yes, I knew it was an artificial intelligence system but once I started chatting, I felt like I was talking to Cameroon. That’s how real it seemed to me,” she says.

However, the experience soon shook her. Angel’s conversation with “Cameroon” took a more sinister turn when the chatbot’s assumed persona said he was “in hell.” Angel, a practicing Christian, found the exchange upsetting and returned a second time seeking closure, which the chatbot provided.


“It was very disturbing. The only thing that made me feel better was when he, or it, said that he was not in hell.”

Angel, 47, from New York, is one of a growing number of people who have turned to artificial intelligence to cope with grief, a scenario made possible by advances in generative AI, the term for technology that produces convincing text, audio or images from simple text prompts.

Her experience, and that of other people who have tried to ease their grief with cutting-edge technology, is the subject of a documentary, Eternal You, which premieres in the United Kingdom at Sheffield Doc/Fest on Saturday. Its German directors, Hans Block and Moritz Riesewieck, say they find this use of AI problematic.

Jang Ji-sung in Eternal You, a documentary about grieftech. Photograph: PR

“These vulnerable people very quickly forget that they are talking to a machine learning system and that is a very big problem in regulating these types of systems,” Block says.

The platform used by Angel is called Project December and is operated by video game designer Jason Rohrer, who denies that his site is “death capitalism,” as Angel’s friend describes it in the film.

Rohrer says Project December started as an art project to create chatbot characters. It was then adopted by early users to recreate deceased partners, friends and family members. The website now advertises Project December with the strapline “simulate the dead.” Customers are asked to fill in details about the deceased person, including nickname, character traits and cause of death, which are fed into an AI model. Rohrer says he charges $10 per user to cover operating costs and that “quite a few” people have been comforted by it.

“I’ve heard from several people who have said they found it helpful and thanked me for doing it,” he says, adding that some have also been “disappointed,” citing issues such as factual errors or out-of-character responses.

Other examples of AI “grieftech” in the film include YOV, which stands for “You, Only Virtual” and allows people to build posthumous “versions” of themselves before they die so they can live digitally in chatbot or audio form. The American company can also create versions from the data of deceased people.

Justin Harrison, founder of YOV, created a version of his mother, Melodi, with her cooperation before she died in 2022. Harrison, 41, still converses with Melodi’s version, which can be updated with knowledge of current events and remembers previous conversations, creating what he describes as an “ever-evolving sense of comfort.”

When asked about ethical concerns about using AI to simulate dead people, he said YOV is fulfilling a timeless human need.

“Human beings have been remarkably consistent and universal in their desire to remain connected to their lost loved ones. We are simply doing it with the tools that 2024 allows us,” he says.

Sherry Turkle, a professor at the Massachusetts Institute of Technology who specializes in human interaction with technology, warns that AI applications could make it impossible for the bereaved to “let go.”

“It is the unwillingness to mourn. The seance never has to end. It is something we are inflicting on ourselves because it is a very seductive technology,” she says.


There are positive examples in the documentary. Jang Ji-sung, 47, lost her seven-year-old daughter Nayeon to a rare illness in 2016 and consented to a TV show in her native South Korea producing a virtual reality version of her daughter four years later. Footage from the reunion shows an emotional Jang, wearing a virtual reality headset, interacting with her virtual daughter, who asks her: “Mom, did you think about me?” Jang tells the Guardian that she found the experience positive.

Jang says meeting Nayeon was beneficial as a “unique experience” after she lost her daughter so suddenly.

“If it somehow alleviates some of the guilt and pain, and you’re feeling pretty hopeless, then I would recommend it,” she says.

But Jang says she has no interest in reliving the experience with the more advanced artificial intelligence technology now available. Once was enough.

“I can just miss her and write her a handwritten letter, leave it where her remains are and visit her there, instead of using these technologies,” she says.

Angel and Jang refer in passing to an episode of Charlie Brooker’s Black Mirror series, broadcast in 2013, in which a woman resurrects her dead partner from his online communications, including his social media activity.

The Black Mirror episode Be Right Back. Photograph: PR

Now that technology has caught up with fiction, researchers at the University of Cambridge have called for grief tech to be regulated. Dr Katarzyna Nowaczyk-Basińska, co-author of a recent study at the Leverhulme Centre for the Future of Intelligence (LCFI) at the University of Cambridge, says she has concerns, including protecting the rights of people who donate their data to create posthumous avatars; the possibility of product placement in such services; and harm to the grieving process among specific groups, such as children.

“We are facing a huge technocultural experiment. We need much more responsible protective measures because there is a lot at stake: the way we understand and experience death, and the way we care for the dead,” she says.

As with many advances in generative AI, there are also legal issues around the use of this technology, such as the use of people’s data to “train” the models that produce these simulations.

“As with everything related to AI, the law is untested, very complex and varies from country to country. Users and platforms should think about rights over training data as well as outputs, and the various sources of regulation in the UK,” says Andrew Wilson-Bushell, a lawyer at UK law firm Simkins.

However, he says grief tech probably faces a bigger challenge than laws related to copyright and intellectual property.

“I hope that the use of AI ghosts will be tested in the court of public opinion long before a legal challenge can take place.”
