How do you read people's emotions? Scientists point to 2 clues

“The face is not enough to perceive emotion.”

Whether you’re planning a project with your manager, navigating a first date, or playing a game of poker, the ability to read others’ emotions is vital. But according to two studies, doing so is not nearly as simple as looking for tells in their facial expressions.

“In the presence of both will and skill, people often inaccurately perceive others’ emotions,” said Dr. Michael Kraus of Yale University, whose research was published by the American Psychological Association. “Our research suggests that relying on a combination of vocal and facial cues, or solely facial cues, may not be the best strategy for accurately recognizing emotions or intentions of others.”

So what does he suggest? Listening carefully.

1. The power of listening

To reach his conclusion, Kraus conducted five experiments involving more than 1,800 American participants, who either interacted with another person or were presented with an interaction between two people. Some participants were restricted to only listening and not looking; others could look but not listen; and a third group could do both. Across all five experiments, he found that, on average, those who only listened identified the emotions others were experiencing more accurately.
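
To make the comparison concrete, here is a minimal sketch of the kind of analysis the finding implies. The numbers and condition labels are hypothetical, not data from Kraus's experiments; the point is simply that accuracy is averaged within each cue condition and the condition means are compared.

```python
# Illustrative sketch only: the accuracy scores below are made up for demonstration,
# not taken from the study. Each list holds hypothetical per-participant accuracy
# (proportion of emotions correctly identified) for one cue condition.
from statistics import mean

accuracy_by_condition = {
    "voice_only": [0.71, 0.68, 0.74, 0.70, 0.73],
    "face_only": [0.58, 0.62, 0.60, 0.57, 0.61],
    "voice_and_face": [0.64, 0.66, 0.63, 0.67, 0.65],
}

# Compare mean accuracy across conditions, mirroring the comparison described above.
for condition, scores in accuracy_by_condition.items():
    print(f"{condition}: mean accuracy = {mean(scores):.2f}")
```

In the reported results, it was the listening-only group that came out on top in this kind of comparison.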

“Listening matters,” he said. “Actually considering what people are saying and the ways in which they say it can, I believe, lead to improved understanding of others at work or in your personal relationships.”

These results challenge the commonly held view that our faces reveal our emotions. Kraus said he believes people have practiced using facial expressions to mask their emotions, and that having more information isn’t always better for accuracy.

“What we find here is that perhaps people are paying too much attention to the face — the voice might have much of the content necessary to perceive others’ internal states accurately,” he said. “The findings suggest that we should be focusing more on studying vocalizations of emotion.”

2. Context matters

Another study, conducted by researchers at UC Berkeley, also challenges the commonly held belief that our faces reveal our emotions: the background and the actions going on around a person matter, too.

“Our study reveals that emotion recognition is, at its heart, an issue of context as much as it is about faces,” said lead author Zhimin Chen, a doctoral student in psychology at UC Berkeley. “Our research shows that faces don’t reveal true emotions very accurately, and that identifying a person’s frame of mind should take into account context as well.”

In a series of three experiments, the researchers blurred the faces and bodies of actors in dozens of muted movie and home-video clips. Study participants viewed and rated the clips online. The researchers superimposed a rating grid over the videos and tracked each participant’s cursor as it moved around the screen, capturing moment-to-moment emotion ratings. In the largest of the three experiments, about 200 people viewed clips under three conditions: everything visible, characters blurred, or context blurred.

Even with the characters’ faces and bodies blurred, hundreds of study participants were still able to read the characters’ emotions accurately by examining the background and how the characters interacted with their surroundings.
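
A rough way to picture how such "affective tracking" data can be scored: each participant produces a continuous rating track over time, and a context-only track can be compared against the track obtained when everything is visible. The sketch below uses simulated tracks and a Pearson correlation purely for illustration; it is not the Berkeley team's published analysis code.

```python
# Illustrative sketch only. We assume hypothetical moment-to-moment valence ratings
# (one value per video frame) and ask how closely a context-only track follows the
# track obtained when the whole scene is visible.
import numpy as np

rng = np.random.default_rng(0)

# Simulated "everything visible" ratings and a noisier context-only version of them.
full_visibility_track = np.sin(np.linspace(0, 3 * np.pi, 300))
context_only_track = full_visibility_track + rng.normal(0, 0.3, 300)

# Pearson correlation between the two rating tracks.
r = np.corrcoef(full_visibility_track, context_only_track)[0, 1]
print(f"Agreement between context-only and full-visibility ratings: r = {r:.2f}")
```

A high correlation in this kind of comparison is what it would mean, operationally, for context alone to be "sufficient" to track a character's emotions.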

“Overall, the results suggest that context is not only sufficient to perceive emotion but also necessary to perceive a person’s emotion,” said senior author David Whitney, a UC Berkeley vision scientist. “The face is not enough to perceive emotion.”

Abstract:

Emotion recognition is an essential human ability critical for social functioning. It is widely assumed that identifying facial expression is the key to this, and models of emotion recognition have mainly focused on facial and bodily features in static, unnatural conditions. We developed a method called affective tracking to reveal and quantify the enormous contribution of visual context to affect (valence and arousal) perception. When characters’ faces and bodies were masked in silent videos, viewers inferred the affect of the invisible characters successfully and in high agreement based solely on visual context. We further show that the context is not only sufficient but also necessary to accurately perceive human affect over time, as it provides a substantial and unique contribution beyond the information available from face and body. Our method (which we have made publicly available) reveals that emotion recognition is, at its heart, an issue of context as much as it is about faces.