
Seeing the Song: Left Auditory Structures May Track Auditory-Visual Dynamic Alignment

Overview of attention for article published in PLOS ONE, October 2013

Mentioned by

News:     4 news outlets
Blogs:    2 blogs
X:        7 X users
Google+:  4 Google+ users

Citations

Dimensions:  4

Readers on

Mendeley:  40
Title
Seeing the Song: Left Auditory Structures May Track Auditory-Visual Dynamic Alignment
Published in
PLOS ONE, October 2013
DOI 10.1371/journal.pone.0077201
Pubmed ID
Authors

Julia A. Mossbridge, Marcia Grabowecky, Satoru Suzuki

Abstract

Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment.

X Demographics

The data shown below were collected from the profiles of the 7 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 40 Mendeley readers of this research output.

Geographical breakdown

Country          Count  As %
United Kingdom   2      5%
Japan            1      3%
Unknown          37     93%

Demographic breakdown

Readers by professional status       Count  As %
Student > Ph.D. Student              13     33%
Researcher                           8      20%
Student > Master                     4      10%
Student > Bachelor                   2      5%
Professor                            2      5%
Other                                3      8%
Unknown                              8      20%

Readers by discipline                Count  As %
Neuroscience                         9      23%
Engineering                          5      13%
Psychology                           4      10%
Agricultural and Biological Sciences 4      10%
Medicine and Dentistry               4      10%
Other                                5      13%
Unknown                              9      23%