Abstract
The lateral intraparietal cortex (LIP) contributes to visuomotor transformations for determining where to look next. However, its spatial selectivity can signify attentional priority, motor planning, perceptual discrimination, or other mechanisms. Resolving how this LIP signal influences a perceptually guided choice requires knowing exactly when such a signal arises and when the perceptual evaluation informs behavior. To achieve this, we recorded single-neuron activity while monkeys performed an urgent choice task for which the progress of the perceptual evaluation can be tracked millisecond by millisecond. The evoked presaccadic responses were strong, exhibited modest motor preference, and were only weakly modulated by sensory evidence. This modulation was remarkable, though, in that its time course preceded and paralleled that of behavioral performance (choice accuracy), and it closely resembled the statistical definition of confidence. The results indicate that, as the choice process unfolds, LIP dynamically combines attentional, motor, and perceptual signals, with the attentional and motor components being much stronger than the perceptual one.
Footnotes
Abbreviations: CS, compelled-saccade; FEF, frontal eye field; LIP, lateral intraparietal; RF, response field; rPT, raw processing time; RT, reaction time; SC, superior colliculus; SEM, standard error of the mean; Sin, saccade in; Sout, saccade out; Tin, target in; Tout, target out
Grants: Research was supported by the National Institutes of Health through grants R01EY021228, R01EY025172, and F31EY029154 from the NEI; grant R01DA030750 from NIDA as part of the NSF/NIH Collaborative Research in Computational Neuroscience (CRCNS) Program; and by the NIH-NINDS Training Grant T32NS073553.
Disclosures: The authors declare no conflicts of interest, financial or otherwise.