New study claims AI ‘understands’ emotion better than us

In what seems to be a blow to a capacity we thought computers would never outdo us in, scientists now suggest that AI understands emotions better than we do.
Scientists have found that AI understands emotions better than we do, scoring much higher than the average person at choosing the right response to defuse various emotional situations.
In a new study published May 21 in the journal Communications Psychology, scientists from the University of Geneva (UNIGE) and the University of Bern (UniBE) applied widely used emotional intelligence (EI) tests (STEM, STEU, GEMOK-Blends, GECo Regulation and GECo Management) to current large language models (LLMs), including ChatGPT-4, ChatGPT-o1, Gemini 1.5 Flash, Claude 3.5 Haiku, Copilot 365 and DeepSeek V3.
They were investigating two things: first, comparing the performance of AI and human subjects, and second, AI’s ability to create new test questions that hold up as EI tests.
Measured against human responses validated in previous studies, the LLMs selected the “correct” response on the emotional intelligence tests 81% of the time, as judged by human experts, versus 56% for humans.
When ChatGPT was asked to create new test questions, human assessors found that these held up against the original tests in terms of difficulty and clarity, and that they did not merely paraphrase the original questions. The correlation between the AI-generated tests and the originals was described as “strong”, with a correlation coefficient of 0.46 (where 1.0 indicates perfect correlation and 0 indicates no correlation at all).
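For readers curious what a figure like 0.46 actually measures, here is a minimal sketch of how a Pearson correlation coefficient is computed. The score arrays are hypothetical, purely for illustration; they are not data from the study.

```python
# Minimal sketch: computing a Pearson correlation coefficient, the statistic
# behind the study's reported 0.46. All numbers below are hypothetical.
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson's r: sample covariance of x and y divided by the
    product of their sample standard deviations. Ranges from -1 to 1."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical per-item difficulty scores: original vs. AI-generated items.
original = [0.62, 0.48, 0.81, 0.55, 0.70]
generated = [0.58, 0.52, 0.74, 0.61, 0.66]
print(round(pearson_r(original, generated), 2))
```

A value near 1 means the two test versions rank items in almost the same order of difficulty; 0.46 indicates a moderate-to-strong positive relationship rather than a perfect match.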
The overall conclusion was that AI is better at “understanding” emotions than we are.
The deeper story
When Live Science consulted several experts, a common theme in their responses was to keep the methodology firmly in mind. Each of the EI tests used was multiple choice – hardly applicable to real-world scenarios in which tensions between people run high, they stressed.
“It should be noted that humans do not always agree on what someone else feels, and even psychologists can interpret emotional signals differently,” said financial industry and information security expert Taimur Ijlal. “So ‘beating’ a human on a test like this does not necessarily mean the AI has deeper insight. It means it gave the statistically expected answer more often.”
The capacity tested by the study is not emotional intelligence but something else, they added. “AI systems are excellent at pattern recognition, especially when emotional cues follow a recognizable structure such as facial expressions or linguistic signals,” said Nauman Jaffar, founder and CEO of CliniScripts – an AI-powered documentation tool built for mental health professionals. “But equating this with a deeper ‘understanding’ of human emotion risks overstating what AI does.”
Related: People find AI more compassionate than mental health experts, study finds. What could that mean for future counseling?
Quizzes in structured, quantitative settings – rather than the appreciation of deeper nuance that genuine emotional understanding requires – are where AI shines, and some experts highlighted a crucial point: AI performs better on tests about emotional situations than in them as they unfold in real time, the way humans experience them.
Jason Hennessey, founder and CEO of Hennessey Digital – who has spent years analyzing how search and language models parse language – likened the study to the “Reading the Mind in the Eyes” test. It is a common tool for assessing a subject’s emotional state, and AI has shown promise on it. But as Hennessey noted, when variables as routine as the lighting in a photo or the cultural context change in such tests, “the accuracy of the AI drops off a cliff.”
Overall, most experts found the assertion that AI “understands” emotions better than humans to be a bit of a stretch.
“Are LLMs useful for categorizing common emotional reactions?” said Wyatt Mayham, founder of Northwest IT Consulting. “Sure. But it’s like saying someone is a great therapist because they scored well on an emotion-themed BuzzFeed quiz.”
But there is a final caveat, with evidence that even if AI relies on pattern recognition rather than genuine emotional understanding, it has outperformed humans at identifying and responding to emotional states in at least one instance.
Aílton, a conversational AI used by more than 6,000 long-haul truck drivers in Brazil, is a multimodal WhatsApp assistant that uses voice, text and images. Its developer, Marcos Alves, CEO and chief scientist at HAL-A, said that Aílton identifies stress, anger or sadness with an accuracy of around 80% – roughly 20 points above its human counterparts – in emotional situations as drivers interact with it in real time.
In one case, Aílton responded quickly and appropriately when a driver sent a distressing 15-second voice note after the fatal accident of a colleague, replying with nuanced condolences, offering mental health resources and automatically alerting fleet managers.
“Yes, multiple-choice text vignettes simplify emotion recognition,” said Alves. “Real empathy is continuous and multimodal. But isolating the cognitive layer is useful. It reveals whether an LLM can identify emotional cues before situational noise is added.”
He added that the ability of LLMs to absorb billions of sentences and thousands of hours of conversational audio means they can encode micro-intonation cues that humans often miss. “The laboratory setup is limited,” he said of the study, “but our WhatsApp data confirm that modern LLMs detect and respond to emotions better than most people, offering scalable empathy at scale.”