Three stages of facial expression processing: ERP study with rapid serial visual presentation

Neuroimage. 2010 Jan 15;49(2):1857-67. doi: 10.1016/j.neuroimage.2009.09.018. Epub 2009 Sep 18.

Abstract

Electrophysiological correlates of the processing of facial expressions were investigated in subjects performing a rapid serial visual presentation (RSVP) task. The peak latencies of the event-related potential (ERP) components P1, vertex positive potential (VPP), and N170 were 165, 240, and 240 ms, respectively. The early anterior N100 and posterior P1 amplitudes elicited by fearful faces were larger than those elicited by happy or neutral faces, a finding consistent with the presence of a 'negativity bias.' The amplitude of the anterior VPP was larger when subjects were processing fearful and happy faces than when they were processing neutral faces, but did not differ between fearful and happy faces. The late N300 and P300 not only distinguished emotional faces from neutral faces but also differentiated between fearful and happy expressions at lag 2. The amplitudes of the N100, VPP, N170, N300, and P300 components and the latency of the P1 component were modulated by attentional resources: deficient attentional resources resulted in decreased amplitudes and increased latencies of the ERP components. In light of these results, we present a hypothetical model involving three stages of facial expression processing.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Brain / physiology*
  • Emotions*
  • Event-Related Potentials, P300
  • Evoked Potentials
  • Facial Expression*
  • Fear
  • Female
  • Happiness
  • Humans
  • Male
  • Models, Neurological
  • Neuropsychological Tests
  • Photic Stimulation
  • Task Performance and Analysis
  • Time Factors
  • Visual Perception / physiology*
  • Young Adult