Dynamics of visual information integration in the brain for categorizing facial expressions

Curr Biol. 2007 Sep 18;17(18):1580-5. doi: 10.1016/j.cub.2007.08.048.

Abstract

A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for categorization of visual stimuli have not been elucidated. Using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset) integrates visual information specific to each expression, according to a consistent pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with "fear" being faster than "disgust," itself faster than "happy"). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.
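The latency account above can be sketched as a toy model: if information is integrated from the eyes downward at a roughly constant rate, each expression's N170 latency is set by how far below the eyes its diagnostic feature lies. The feature positions, onset time, and sweep duration below are illustrative assumptions for the sketch, not values reported in the paper.

```python
# Toy sketch (not the authors' model) of the eyes-downward
# integration account of N170 latency. All numeric values are
# hypothetical placeholders chosen only to reproduce the
# qualitative ordering fear < disgust < happy.

# Vertical position of each expression's diagnostic feature,
# as a fraction of the eye-to-mouth distance (assumed values).
DIAGNOSTIC_POSITION = {
    "fear": 0.0,      # eyes (integration stops immediately)
    "disgust": 0.5,   # corners of the nose
    "happy": 1.0,     # mouth (full downward sweep needed)
}

def n170_latency_ms(expression, onset_ms=120.0, sweep_ms=50.0):
    """Predicted N170 peak latency: integration begins ~50 ms
    before the peak and stops once the diagnostic feature for
    the expression has been reached."""
    return onset_ms + sweep_ms * DIAGNOSTIC_POSITION[expression]

latencies = {e: n170_latency_ms(e) for e in DIAGNOSTIC_POSITION}
print(latencies)  # fear peaks earliest, happy latest
```

Under these placeholder parameters the model reproduces the reported latency ordering (fear faster than disgust, disgust faster than happy), which is the only claim the sketch is meant to capture.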

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain / physiology*
  • Electroencephalography
  • Evoked Potentials
  • Facial Expression*
  • Female
  • Humans
  • Male
  • Photic Stimulation
  • Visual Perception / physiology*