Abstract
The brain combines information from multiple sensory modalities to build a consistent view of the world. The principles by which multimodal stimuli are integrated in cortical hierarchies are well studied, but it is less clear whether and how unimodal inputs systematically shape the processing of signals carried by a different modality. Here we use a visual classification task in rats to investigate how task-irrelevant sounds modify the processing of visual stimuli. Our data show that the intensity of a sound, but not its temporal modulation frequency, produces a powerful, effective compression of the visual perceptual space. This result underscores the importance of cross-modal influences in perceptual pathways and suggests an important role for inhibition as the mediator of auditory-visual interactions at the level of neural representations.
Competing Interest Statement
The authors have declared no competing interest.