Visual and auditory brain areas share a neural code for perceived emotion

Beau Sievers, Carolyn Parkinson, Peter J. Kohler, James Hughes, Sergey V. Fogelson, Thalia Wheatley
doi: https://doi.org/10.1101/254961
1 Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH 03755 (Beau Sievers, Thalia Wheatley)
2 Department of Psychology, University of California Los Angeles, Los Angeles, CA 90095 (Carolyn Parkinson)
3 Department of Psychology, Stanford University, Stanford, CA 94305 (Peter J. Kohler)

Abstract

Crossmodal redundancy increases both the speed and accuracy of communication (Evans & Treisman, 2011). For example, rattlesnakes hold their tail aloft when rattling, ensuring that it is both seen and heard. This combined audio-visual display is harder to miss or misinterpret than either movement or rattling alone. Perceivers’ brains must be sensitive to such crossmodal redundancies in order to take advantage of them. One possible adaptation for this purpose is the use of a single neural code shared by both auditory and visual information. To test for such a shared neural code, we created emotionally expressive animation and music stimuli that were precisely matched on all of their dynamic features. Participants viewed these stimuli during fMRI brain scanning. Using representational similarity analysis (Kriegeskorte & Kievit, 2013), we show that a single model of stimulus features and emotion content fits activity in both auditory and visual brain areas. This code is also used supramodally in posterior superior temporal cortex, and is used to represent both prototypical and mixed emotions (e.g., Happy-Sad). Exploratory analysis revealed that stimulus features and emotion content are represented in unimodal areas even when stimuli are presented in the area’s non-preferred modality. This evidence for a shared neural code is consistent with adaptive signaling accounts of emotion perception, in which perceivers specifically adapted to perceive crossmodal redundancy accrue an evolutionary advantage.
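The representational similarity analysis (RSA) named in the abstract can be sketched minimally as follows. This is an illustrative example with synthetic data, not the authors' actual pipeline: the stimulus count, voxel count, and feature count are all assumptions. RSA compares a model of stimulus features to brain activity by building a representational dissimilarity matrix (RDM) for each, then rank-correlating the two.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Synthetic data (shapes are assumptions for illustration):
# 10 stimuli x 50 voxels of neural activity, 10 stimuli x 5 model features.
rng = np.random.default_rng(0)
neural_patterns = rng.normal(size=(10, 50))
model_features = rng.normal(size=(10, 5))

# Build each RDM: pairwise correlation distance between stimulus rows.
# pdist returns the condensed upper triangle, here 10*9/2 = 45 values.
neural_rdm = pdist(neural_patterns, metric="correlation")
model_rdm = pdist(model_features, metric="correlation")

# RSA test statistic: Spearman rank correlation between the two RDMs.
# A reliably positive rho means the model's dissimilarity structure
# is mirrored in the neural dissimilarity structure.
rho, p_value = spearmanr(neural_rdm, model_rdm)
print(f"RSA correlation: rho={rho:.3f}, p={p_value:.3f}")
```

In the study's framing, fitting a single feature-and-emotion RDM to activity in both auditory and visual regions would amount to running this comparison once per region with the same `model_rdm`.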

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY 4.0 International license.
Posted July 11, 2018.
Subject Area

  • Neuroscience