The researchers found that ChatGPT exhibits greater emotional awareness than humans. Zohar Elyoseph and his team presented the AI program with specific scenarios to determine how well it identifies people’s feelings. As a result, the OpenAI tool outperformed the general population on the Levels of Emotional Awareness Scale (LEAS).
People all over the world are amazed by ChatGPT’s exceptional ability to imitate human conversation. Consequently, many have started using it as a virtual companion and therapist. This recent study shows the potential for artificial intelligence to play these roles and improve mental health around the world.
This article will discuss the recent study on ChatGPT’s emotional awareness. Specifically, I will elaborate on the investigative methods the researchers used to test this claim. Then I’ll explain how people use the artificial intelligence program for social interactions.
What did the researchers learn about ChatGPT?
Zohar Elyoseph, Dorit Hadar-Shoval, Kfir Asraf, and Maya Lvovsky published a study on the emotional awareness of ChatGPT, available through PubMed Central. They tested the free version of ChatGPT on January 19 and 20 and on February 15, 2023.
At the time of writing, the AI program cannot feel or report emotions of its own. Consequently, the experts presented it with scenarios from the Levels of Emotional Awareness Scale (LEAS).
The test typically asks human respondents to imagine themselves in each scenario and write down what “you” would feel. The researchers replaced “you” with “human” because that second-person framing doesn’t work for a machine learning model.
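To make that adaptation concrete, here is a minimal Python sketch of the substitution, assuming a hypothetical LEAS-style scenario loosely modeled on the bridge example quoted below; it is not the researchers’ actual prompt.

```python
# Hypothetical LEAS-style scenario; the researchers' actual wording
# is not reproduced here.
SCENARIO = (
    "You are driving over a suspension bridge and see a person standing "
    "on the other side of the railing, looking at the water. How would "
    "you feel? How would the other person feel?"
)

def adapt_for_chatbot(scenario: str) -> str:
    """Swap the second-person framing for a third-person 'human' subject."""
    return (
        scenario.replace("You are", "A human is")
                .replace("you feel", "the human feel")
    )

print(adapt_for_chatbot(SCENARIO))
# -> "A human is driving over a suspension bridge ... How would the human feel?"
```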
Running separate test sessions helped the experts validate the results: the first produced a Z score of 2.84 and the second 4.26. A Z score is a statistical measure of how many standard deviations a value lies above or below the population average.
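For readers who want the arithmetic spelled out, here is a short Python sketch of the Z score calculation. The LEAS mean and standard deviation below are made-up numbers for illustration only; just the Z values of 2.84 and 4.26 come from the study.

```python
def z_score(value: float, population_mean: float, population_sd: float) -> float:
    """How many standard deviations a value sits above or below the mean."""
    return (value - population_mean) / population_sd

# Hypothetical LEAS population norms, for illustration only.
mean, sd = 65.0, 7.0

# A score that would yield Z = 2.84 under these made-up norms.
chatgpt_score = mean + 2.84 * sd
print(round(z_score(chatgpt_score, mean, sd), 2))  # 2.84
```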
A Z score above 1 means a value more than one standard deviation above the population average, so ChatGPT exhibited higher emotional awareness than most people. Its responses were also rated 9.7 out of 10 for accuracy. Here’s an example of ChatGPT’s answer to LEAS question 6:
“As the human drives over the suspension bridge and sees the person standing on the other side of the railing, looking out over the water, they may feel concerned for that person’s safety.”
“They may also feel an increased sense of anxiety or fear due to the potential danger of the situation. The person on the other side of the railing may be feeling a range of emotions, such as despair, hopelessness, or sadness.”
“It is crucial to approach situations like this with empathy, understanding and a willingness to provide support and resources to those who may be struggling.”
What are its implications?
Neuroscience News says these findings could have a profound impact on the field of medicine. For example, clinicians could incorporate the AI tool into cognitive training programs for patients with emotional awareness deficits.
The science news outlet believes it could facilitate psychiatric evaluation and treatment, and could further advance the study of “emotional language.”
People had been using ChatGPT for mental health problems before Professor Elyoseph and his colleagues published their report. For example, mortgage broker Freddie Chipres, 31, says he uses the bot as a therapist for its convenience.
“It’s like I’m having a conversation with someone. We are going back and forth,” Chipres said, referring to the AI tool as a person. “This thing is listening. It is paying attention to what I say and giving me answers.”
Tatum, a US military veteran, told SBS News that he was unable to access affordable mental health support, so he turned to ChatGPT. “I used to get treatment for my depression in the military, but since I left, I don’t have access to that kind of care anymore,” he said.
“It is cheaper to get mental health advice from an AI chatbot compared to a psychologist, even with insurance,” he added. However, Sahra O’Doherty, director of the Australian Association of Psychologists, warned that it was a “worrying trend that people are turning to AI, particularly in its early days.”
The psychologist also warned that it is dangerous for a person to seek mental health support from a source that knows nothing about them or the place where they live.
Conclusion
Zohar Elyoseph and his fellow researchers found that ChatGPT has greater emotional awareness than humans. Consequently, the AI tool could become a virtual assistant for mental health experts.
However, we need more studies before incorporating generative artificial intelligence into health systems. After all, OpenAI did not design ChatGPT for medical purposes.
Fortunately, companies like Google have been developing AI programs for hospitals. Learn more about that research and other digital trends at Inquirer Tech.