Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus

J Neurophysiol. 2005 Oct;94(4):2331-52. doi: 10.1152/jn.00021.2005. Epub 2005 Apr 20.

Abstract

The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be modulated by the locations of visual targets, 19% by auditory targets, and 9% by both visual and auditory targets. The reference frames of both visual and auditory receptive fields ranged along a continuum between eye- and head-centered coordinates: approximately 10% of auditory and 33% of visual neurons had receptive fields that were more consistent with an eye- than a head-centered frame of reference, whereas 23% of auditory and 18% of visual neurons had receptive fields more consistent with a head- than an eye-centered frame, leaving a large fraction of both visual and auditory response patterns inconsistent with either reference frame. This pattern was similar to the reference frame we have previously found for auditory stimuli in the inferior colliculus and core auditory cortex. The correspondence between the visual and auditory receptive fields of individual neurons was weak. Nevertheless, the visual and auditory responses were sufficiently well correlated that a simple one-layer network constructed to calculate target location from the activity of the neurons in our sample performed successfully for auditory targets even though its weights were fit based only on the visual responses. We interpret these results as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.
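
For illustration only, the following is a minimal sketch of the one-layer readout analysis described in the abstract: linear weights are fit to map neural responses onto target location using only the visual responses, and the same weights are then applied, unchanged, to the auditory responses. This is not the authors' code; all firing rates, tuning parameters, target locations, and noise levels below are synthetic assumptions chosen for illustration.

```python
# Sketch (assumed, not from the paper) of a one-layer linear readout trained on
# visual responses and tested on auditory responses from the same neurons.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 275                    # sample size reported in the abstract
targets = np.linspace(-24, 24, 9)  # assumed target azimuths (deg); not from the paper

def tuned_responses(locs, centers, widths, gains, noise=2.0):
    """Gaussian-tuned firing rates (spikes/s) for each neuron at each target location."""
    rates = gains * np.exp(-0.5 * ((locs[:, None] - centers) / widths) ** 2)
    return rates + rng.normal(0.0, noise, rates.shape)

# Hypothetical tuning: auditory receptive-field centers are only loosely tied to the
# visual ones, mimicking the weak visual/auditory correspondence noted in the abstract.
centers_vis = rng.uniform(-30, 30, n_neurons)
centers_aud = centers_vis + rng.normal(0.0, 10.0, n_neurons)
widths = rng.uniform(8, 20, n_neurons)
gains = rng.uniform(5, 40, n_neurons)

R_vis = tuned_responses(targets, centers_vis, widths, gains)   # (targets, neurons)
R_aud = tuned_responses(targets, centers_aud, widths, gains)

# Fit the one-layer readout on VISUAL responses only (least squares with a bias term).
X_vis = np.column_stack([R_vis, np.ones(len(targets))])
w, *_ = np.linalg.lstsq(X_vis, targets, rcond=None)

# Apply the same weights, unchanged, to the AUDITORY responses.
X_aud = np.column_stack([R_aud, np.ones(len(targets))])
pred_aud = X_aud @ w

print("auditory-target RMS error (deg):",
      np.sqrt(np.mean((pred_aud - targets) ** 2)).round(2))
```

With synthetic neurons whose visual and auditory tuning centers are correlated, the printed error gives a rough feel for how a visually trained readout can transfer to auditory responses; the exact numbers depend entirely on the assumed tuning parameters and say nothing about the paper's actual decoding performance.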

Publication types

  • Comparative Study
  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Acoustic Stimulation / methods
  • Action Potentials / physiology
  • Analysis of Variance
  • Animals
  • Brain Mapping
  • Functional Laterality
  • Head Movements / physiology*
  • Macaca mulatta
  • Magnetic Resonance Imaging / methods
  • Memory / physiology
  • Neurons / classification
  • Neurons / physiology*
  • Parietal Lobe / cytology*
  • Parietal Lobe / physiology
  • Photic Stimulation / methods
  • Saccades / physiology*
  • Sound Localization / physiology*
  • Space Perception / physiology*