“I’m Sorry, Dave, I Can’t Do That” – AI Still Lags Humans in Reading Emotions



So said the AI-powered HAL 9000 near the end of 2001: A Space Odyssey, in one of the most chilling exchanges in movie history. After HAL begins killing off the ship’s crew, Dave Bowman takes a small space pod outside to retrieve the body of fellow astronaut Frank Poole. As Bowman returns to the ship, he asks HAL to let him back inside with the famous line: “Open the pod bay doors, HAL.”

Fast forward to today: When it comes to reading emotions on people’s faces, artificial intelligence still lags behind human observers, according to a new study. The difference was particularly pronounced when it came to spontaneous displays of emotion, according to the findings published in PLOS One.

The research team, led by Dublin City University, looked at eight “out of the box” automatic classifiers for facial affect recognition (artificial intelligence that can identify human emotions on faces) and compared their emotion recognition performance to that of human observers.

The researchers found that humans recognized the emotions with 72% accuracy, whereas the accuracy of the artificial intelligence systems tested varied, ranging from 48% to 62%.

“AI systems claiming to recognize humans’ emotions from their facial expressions are now very easy to develop,” said lead author Dr. Damien Dupré (Dublin City University). “However, most of them are based on inconclusive scientific evidence that people express emotions in the same way. For these systems, human emotions come down to only six basic emotions, but they do not cope well with blended emotions.


“Companies using such systems need to be aware that the results obtained are not a measure of the emotion felt, but merely a measure of how much one’s face matches with a face supposed to correspond to one of these six emotions.”

The study involved 937 videos sampled from two large databases that conveyed the six basic emotions (happiness, sadness, anger, fear, surprise, and disgust). Two well-known dynamic facial expression databases were chosen: BU-4DFE from Binghamton University in New York and a second from The University of Texas at Dallas. Both are annotated in terms of emotion categories and contain either posed or spontaneous facial expressions. All of the examined expressions were dynamic, reflecting the realistic nature of human facial behavior.

Classification accuracy for the AI systems was consistently lower for spontaneous affective behavior, but the gap narrowed for posed expressions: the two best AI systems were about as adept as humans at identifying posed expressions.

To evaluate the accuracy of emotion recognition, the study compared the performance achieved by human judges with that of eight commercially available automatic classifiers.


“AI has come a long way in identifying people’s facial expressions, but our research suggests that there is still room for improvement in recognizing genuine human emotions,” said co-author Dr. Eva Krumhuber (UCL Psychology & Language Sciences).

The PLOS One study was conducted by researchers at Dublin City University, University College London, University of Bremen and Queen’s University Belfast.

Dr. Krumhuber recently led a separate study, published in Emotion, comparing humans and machines in emotion recognition across fourteen different databases of dynamic facial expressions. That smaller study, which used a different method to analyze the machine data, found that AI was comparable to humans at recognizing emotions.

Source: Damien Dupré et al., “A performance comparison of eight commercially available automatic classifiers for facial affect recognition,” PLOS ONE (2020). DOI: 10.1371/journal.pone.0231968

The Daily Galaxy, Max Goldberg, via University College London
