

Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

Overview of attention for article published in PLOS ONE, January 2012

Mentioned by

3 X users
4 Google+ users

Citations

43 Dimensions

Readers on

195 Mendeley
3 CiteULike
Title
Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces
Published in
PLOS ONE, January 2012
DOI 10.1371/journal.pone.0030740
Pubmed ID
Authors

Simon Rigoulot, Marc D. Pell

Abstract

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") spoken in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 195 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
France               2     1%
United Kingdom       2     1%
Switzerland          1    <1%
Austria              1    <1%
Spain                1    <1%
Japan                1    <1%
United States        1    <1%
Philippines          1    <1%
Unknown            185    95%

Demographic breakdown

Readers by professional status    Count   As %
Student > Ph.D. Student              44    23%
Student > Master                     31    16%
Researcher                           24    12%
Student > Bachelor                   15     8%
Student > Postgraduate               13     7%
Other                                41    21%
Unknown                              27    14%
Readers by discipline                    Count   As %
Psychology                                  81    42%
Linguistics                                 15     8%
Neuroscience                                14     7%
Agricultural and Biological Sciences        10     5%
Medicine and Dentistry                       9     5%
Other                                       29    15%
Unknown                                     37    19%