Predicting vocal emotion expressions from the human brain

Hum Brain Mapp. 2013 Aug;34(8):1971-81. doi: 10.1002/hbm.22041. Epub 2012 Feb 27.

Abstract

Speech is an important carrier of emotional information. However, little is known about how different vocal emotion expressions are recognized in a receiver's brain. We used multivariate pattern analysis of functional magnetic resonance imaging data to investigate the degree to which distinct vocal emotion expressions are represented in the receiver's local brain activity patterns. Specific vocal emotion expressions are encoded in a right fronto-operculo-temporal network involving temporal regions known to subserve suprasegmental acoustic processes and a fronto-opercular region known to support emotional evaluation, as well as in left temporo-cerebellar regions supporting sequential processes. The right inferior frontal region, in particular, was found to differentiate distinct emotional expressions. The present analysis reveals vocal emotion to be encoded in a shared cortical network reflected by distinct brain activity patterns. These results shed new light on theoretical and empirical controversies about the perception of distinct vocal emotion expressions at the level of large-scale human brain signals.
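
The decoding logic described in the abstract (classifying the expressed emotion category from local voxel activity patterns) can be illustrated with a minimal sketch using a cross-validated linear support vector machine, consistent with the Support Vector Machine MeSH term below. The data arrays, number of emotion categories, and cross-validation scheme here are assumptions for illustration only, not the authors' actual analysis pipeline.

    # Illustrative MVPA sketch: decode emotion category from trial-wise
    # fMRI activity patterns with a linear SVM and k-fold cross-validation.
    # NOT the published pipeline; data shapes and labels are hypothetical.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    # Hypothetical data: one feature vector of voxel activations per trial
    # (e.g. beta estimates within a region of interest or searchlight sphere).
    n_trials, n_voxels = 120, 500
    rng = np.random.default_rng(0)
    X = rng.standard_normal((n_trials, n_voxels))   # trial-by-voxel patterns
    y = rng.integers(0, 4, size=n_trials)           # 4 assumed emotion labels

    # Linear support-vector classifier with per-voxel standardization,
    # evaluated by stratified 5-fold cross-validation.
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)

    print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = 0.25)")

Above-chance cross-validated accuracy in a given region is the evidence that its local activity pattern carries information about the emotion category; on random data as generated here, accuracy should hover around the 0.25 chance level.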

Keywords: emotion; fMRI; multivariate pattern analysis; prosody; vocal expressions.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Auditory Perception / physiology*
  • Brain / physiology*
  • Brain Mapping*
  • Emotions / physiology*
  • Female
  • Humans
  • Image Interpretation, Computer-Assisted
  • Magnetic Resonance Imaging
  • Male
  • Speech
  • Support Vector Machine
  • Voice*
  • Young Adult