RT Journal Article
SR Electronic
T1 Visual and auditory brain areas share a neural code for perceived emotion
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 254961
DO 10.1101/254961
A1 Beau Sievers
A1 Carolyn Parkinson
A1 Peter J. Kohler
A1 James Hughes
A1 Sergey V. Fogelson
A1 Thalia Wheatley
YR 2019
UL http://biorxiv.org/content/early/2019/11/18/254961.abstract
AB Emotional music and movement are human universals. Further, music and movement are subjectively linked: it is hard to imagine one without the other. One possible reason for the fundamental link between music and movement is that they are represented the same way in the brain, using a shared neural code. To test this, we created emotional music and animation stimuli that were precisely matched on all time-varying structural features. Participants viewed these stimuli while undergoing fMRI of the brain. Using representational similarity analysis (Kriegeskorte & Kievit, 2013), we show that a single model of stimulus features and emotion content fit activity in both auditory and visual brain areas, providing evidence that these regions share a neural code. Further, this code was used in posterior superior temporal cortex during both audition and vision. Across all regions, the shared code represented both prototypical and mixed emotions (e.g., Happy–Sad). Finally, exploratory analysis revealed that stimulus features and emotion content were represented in early visual areas even when stimuli were presented auditorily. This evidence for a shared neural code is consistent with an adaptive signaling account of emotion perception, where perceivers specifically adapted to perceive cross-sensory redundancy accrue an evolutionary advantage.