PT  - JOURNAL ARTICLE
AU  - Yaelan Jung
AU  - Bart Larsen
AU  - Dirk B. Walther
TI  - Modality-independent coding of scene categories in prefrontal cortex
AID - 10.1101/142562
DP  - 2018 Jan 01
TA  - bioRxiv
PG  - 142562
4099 - http://biorxiv.org/content/early/2018/02/20/142562.short
4100 - http://biorxiv.org/content/early/2018/02/20/142562.full
AB  - Natural environments convey information through multiple sensory modalities, all of which contribute to people's percepts. Although it has been shown that the visual or auditory content of scene categories can be decoded from brain activity, it remains unclear where and how humans integrate different sensory inputs and represent scene information beyond the domain of a specific sensory modality. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. A mixed-gender group of healthy human subjects participated in the present study; their brain activity was measured with fMRI while they viewed images or listened to sounds of different places. We found that both visual and auditory scene categories can be decoded not only from modality-specific areas but also from several brain regions in the temporal, parietal, and prefrontal cortex. Intriguingly, only in the prefrontal cortex, and in no other region, are categories of scene images and sounds represented in similar activation patterns, suggesting that scene representations in the prefrontal cortex are modality-independent. Furthermore, the error patterns of neural decoders indicate that category-specific neural activity patterns in the middle and superior frontal gyri are tightly linked to categorization behavior. Our findings demonstrate that complex scene information is represented at an abstract level in the prefrontal cortex, regardless of the sensory modality of the stimulus.

      Statement of Significance: Our experience in daily life requires the integration of multiple sensory inputs, such as images, sounds, or scents, from the environment. Here, for the first time, we investigated where and how in the brain information about the natural environment from multiple senses is merged to form modality-independent representations of scene categories. We show direct decoding of scene categories across sensory modalities from patterns of neural activity in the prefrontal cortex. We also conclusively tie these neural representations to human categorization behavior, based on the corresponding error patterns of the neural decoder and of behavior. Our findings suggest that the prefrontal cortex is a central hub for integrating sensory information and computing modality-independent representations of scene categories.

      Acknowledgments: We thank Michael Mack and Heeyoung Choo for their helpful comments on an earlier version of this manuscript. This work was supported by an NSERC Discovery Grant (#498390) and the Canadian Foundation for Innovation (#32896).