Abstract
Current neurobiological models assign a central role to predictive processes calibrated to environmental statistics. However, neuroimaging studies examining the encoding of stimulus uncertainty have relied almost exclusively on manipulations in which stimuli were presented in a single sensory modality, and have further assumed that neural responses vary monotonically with uncertainty. This has left a gap in theoretical development with respect to two core issues: i) are there cross-modal brain systems that encode input uncertainty in a way that generalizes across sensory modalities, and ii) are there brain systems that track input uncertainty in a non-monotonic fashion? We used multivariate pattern analysis to address these two issues with auditory, visual, and audiovisual inputs. A searchlight cross-classification analysis, in which classifiers trained to discriminate levels of uncertainty in one modality were tested in another modality, revealed signatures of cross-modal encoding in frontoparietal, orbitofrontal, and association cortices. In addition, classifiers trained to discriminate intermediate levels of uncertainty from both the highest and lowest levels revealed widespread brain systems that encode uncertainty non-monotonically. These findings constitute the first comprehensive report of cross-modal and non-monotonic neural sensitivity to statistical regularities in the environment, and suggest that conventional paradigms testing for monotonic responses to uncertainty in a single sensory modality may have limited generalizability.
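The two classification schemes summarized above can be illustrated schematically. The following is a minimal sketch, not the authors' analysis code: it assumes three uncertainty levels, two modalities, synthetic voxel patterns, and a linear support vector machine from scikit-learn; the searchlight step and statistical inference are omitted.

```python
# Sketch of (1) cross-modal classification and (2) the intermediate-vs-extremes
# contrast used to probe non-monotonic coding, on synthetic voxel patterns.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 200

# One hypothetical multivoxel pattern per uncertainty level (low, mid, high),
# shared across modalities so that cross-modal decoding is possible.
level_patterns = rng.normal(size=(3, n_voxels))

def simulate_modality(modality_offset):
    """Simulate single-trial patterns: level prototype + modality offset + noise."""
    X = np.vstack([level_patterns[lev] + modality_offset
                   + rng.normal(scale=1.0, size=(n_trials, n_voxels))
                   for lev in range(3)])
    y = np.repeat(np.arange(3), n_trials)
    return X, y

X_aud, y_aud = simulate_modality(rng.normal(scale=0.2, size=n_voxels))  # "auditory"
X_vis, y_vis = simulate_modality(rng.normal(scale=0.2, size=n_voxels))  # "visual"

clf = make_pipeline(StandardScaler(), LinearSVC())

# (1) Cross-modal classification: train on one modality, test on the other.
clf.fit(X_aud, y_aud)
cross_modal_acc = clf.score(X_vis, y_vis)

# (2) Non-monotonic coding: relabel trials as intermediate (1) vs. extremes (0),
#     pooling the lowest and highest uncertainty levels, then classify that contrast.
clf.fit(X_aud, (y_aud == 1).astype(int))
nonmono_acc = clf.score(X_vis, (y_vis == 1).astype(int))

print(f"cross-modal accuracy: {cross_modal_acc:.2f}, "
      f"intermediate-vs-extremes accuracy: {nonmono_acc:.2f}")
```

Above-chance accuracy in scheme (1) indicates uncertainty coding that generalizes across sensory modalities, while above-chance accuracy in scheme (2) indicates a response profile that cannot be reduced to a monotonic function of uncertainty.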