Convergent and invariant object representations for sight, sound, and touch
Abstract
We continuously perceive objects in the world through multiple sensory channels. In this study, we investigated the convergence of information from different sensory streams within the cerebral cortex. We presented volunteers with three common objects via three different modalities—sight, sound, and touch—and used multivariate pattern analysis of functional magnetic resonance imaging data to map the cortical regions containing information about the identity of the objects. We could reliably predict which of the three stimuli a subject had seen, heard, or touched from the pattern of neural activity in the corresponding early sensory cortices. Intramodal classification was also successful in large portions of the cerebral cortex beyond the primary areas, with multiple regions showing convergence of information from two or all three modalities. Using crossmodal classification, we also searched for brain regions that would represent objects in a similar fashion across different modalities of presentation. We trained a classifier to distinguish objects presented in one modality and then tested it on the same objects presented in a different modality. We detected audiovisual invariance in the right temporo‐occipital junction, audiotactile invariance in the left postcentral gyrus and parietal operculum, and visuotactile invariance in the right postcentral and supramarginal gyri. Our maps of multisensory convergence and crossmodal generalization reveal the underlying organization of the association cortices, and may be related to the neural basis for mental concepts. Hum Brain Mapp 36:3629–3640, 2015. © 2015 Wiley Periodicals, Inc.
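The crossmodal classification logic described above can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' pipeline: real analyses use fMRI voxel patterns, whereas here each object is given a hypothetical shared activity pattern plus modality-specific noise. All names and parameters are illustrative assumptions.

```python
# Sketch of crossmodal MVPA: train on one modality, test on another.
# Synthetic stand-in for fMRI voxel patterns (hypothetical data).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_objects, n_voxels, n_trials = 3, 50, 30

# Assume each object evokes a characteristic pattern that is partly
# shared across modalities (the "invariant" component).
object_patterns = rng.normal(size=(n_objects, n_voxels))

def simulate_modality(noise=0.5):
    """Simulate trials for one modality: shared pattern + noise."""
    X, y = [], []
    for obj in range(n_objects):
        for _ in range(n_trials):
            X.append(object_patterns[obj] + noise * rng.normal(size=n_voxels))
            y.append(obj)
    return np.array(X), np.array(y)

X_vis, y_vis = simulate_modality()  # e.g., visual presentations
X_aud, y_aud = simulate_modality()  # e.g., auditory presentations

# Train on the visual patterns, test on the auditory patterns.
# Above-chance accuracy implies a modality-invariant representation.
clf = LinearSVC().fit(X_vis, y_vis)
acc = clf.score(X_aud, y_aud)
print(f"crossmodal accuracy: {acc:.2f} (chance = {1/n_objects:.2f})")
```

In the study itself this train-on-one-modality, test-on-another comparison was run as a searchlight over the cortex, so each region yields its own crossmodal accuracy score.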