

Direction Specific Biases in Human Visual and Vestibular Heading Perception

Overview of attention for article published in PLOS ONE, December 2012

Mentioned by

X (formerly Twitter): 1 user
Facebook: 1 page

Readers on

Mendeley: 51 readers
Title
Direction Specific Biases in Human Visual and Vestibular Heading Perception
Published in
PLOS ONE, December 2012
DOI 10.1371/journal.pone.0051383
Pubmed ID
Authors

Benjamin T. Crane

Abstract

Heading direction is determined from visual and vestibular cues. Both sensory modalities have been shown to have better direction discrimination for headings near straight ahead. Previous studies of visual heading estimation have not used the full range of stimuli, and vestibular heading estimation has not previously been reported. The current experiments measure human heading estimation in the horizontal plane to vestibular, visual, and spoken stimuli. The vestibular and visual tasks involved 16 cm of platform or visual motion. The spoken stimulus was a voice command speaking a heading angle. All conditions demonstrated direction-dependent biases in perceived headings such that biases increased with headings further from the fore-aft axis. The bias was larger with the visual stimulus when compared with the vestibular stimulus in all 10 subjects. For the visual and vestibular tasks, precision was best for headings near fore-aft. The spoken headings had the least bias, and the variation in precision was less dependent on direction. In a separate experiment in which headings were limited to ±45°, the biases were much smaller, demonstrating that the range of headings influences perception. There was a strong and highly significant correlation between the bias curves for visual and spoken stimuli in every subject. The correlations between visual-vestibular and vestibular-spoken biases were weaker but remained significant. The observed biases in both visual and vestibular heading perception qualitatively resembled predictions of a recent population vector decoder model (Gu et al., 2010) based on the known distribution of neuronal sensitivities.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 51 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States    2       4%
France           1       2%
Unknown          48      94%

Demographic breakdown

Readers by professional status    Count   As %
Student > Ph. D. Student          11      22%
Researcher                        11      22%
Student > Master                  7       14%
Student > Doctoral Student        6       12%
Student > Bachelor                3       6%
Other                             8       16%
Unknown                           5       10%
Readers by discipline                   Count   As %
Psychology                              17      33%
Neuroscience                            7       14%
Medicine and Dentistry                  7       14%
Engineering                             4       8%
Agricultural and Biological Sciences    3       6%
Other                                   5       10%
Unknown                                 8       16%