Optimal multisensory decision-making in a reaction-time task

Elife. 2014 Jun 14;3:e03005. doi: 10.7554/eLife.03005.

Abstract

Humans and animals can integrate sensory evidence from various sources to make decisions in a statistically near-optimal manner, provided that the stimulus presentation time is fixed across trials. Little is known about whether optimality is preserved when subjects can choose when to make a decision (reaction-time task), or when sensory inputs have time-varying reliability. Using a reaction-time version of a visual/vestibular heading discrimination task, we show that behavior is clearly sub-optimal when quantified with traditional optimality metrics that ignore reaction times. We created a computational model that accumulates evidence optimally across both cues and time, and trades off accuracy with decision speed. This model quantitatively explains subjects' choices and reaction times, supporting the hypothesis that subjects do, in fact, accumulate evidence optimally over time and across sensory modalities, even when the reaction time is under the subject's control.
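To make the kind of model described above concrete, the sketch below simulates a two-cue drift-diffusion process with a decision bound. This is an illustrative assumption of a standard framework, not the authors' published model: each cue (visual, vestibular) contributes momentary evidence with drift proportional to the heading and a sensitivity parameter, the streams are combined with sensitivity-proportional weights (yielding the standard combined sensitivity sqrt(k_vis^2 + k_vest^2)), and evidence is accumulated until it crosses a bound, which determines both the choice and the decision time. All parameter names and values are hypothetical.

```python
import numpy as np

def simulate_trial(heading_deg, k_vis, k_vest, bound=1.0, dt=0.005, max_t=3.0, rng=None):
    """One trial of a two-cue drift-diffusion sketch (illustrative, not the
    published model). Each cue delivers momentary evidence with drift
    k_i * heading and unit-variance noise per unit time; weighting each stream
    by its sensitivity k_i and rescaling gives a combined process with
    effective sensitivity sqrt(k_vis**2 + k_vest**2). Evidence accumulates
    until it hits +/- bound (choice and decision time) or time runs out."""
    rng = np.random.default_rng() if rng is None else rng
    k_comb = np.hypot(k_vis, k_vest)          # effective combined sensitivity
    x, t = 0.0, 0.0
    while t < max_t:
        # Momentary evidence from each cue: drift + Gaussian noise with variance dt.
        dx_vis = k_vis * heading_deg * dt + np.sqrt(dt) * rng.standard_normal()
        dx_vest = k_vest * heading_deg * dt + np.sqrt(dt) * rng.standard_normal()
        # Sensitivity-weighted sum, rescaled so the combined noise stays unit-variance.
        x += (k_vis * dx_vis + k_vest * dx_vest) / k_comb
        t += dt
        if abs(x) >= bound:
            return ("right" if x > 0 else "left"), t   # bound crossed: commit
    return ("right" if x > 0 else "left"), max_t       # no crossing: forced choice

# Example: small rightward heading, visual cue more reliable than vestibular.
choice, rt = simulate_trial(heading_deg=2.0, k_vis=0.8, k_vest=0.4)
print(choice, round(rt, 3))
```

Raising the bound in this sketch trades speed for accuracy (longer decision times, fewer errors), which is the speed-accuracy trade-off the abstract refers to.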

Keywords: cue combination; decision-making; diffusion models; human; neuroscience; reaction time.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adult
  • Behavior
  • Cues
  • Decision Making*
  • Female
  • Humans
  • Male
  • Motion Perception
  • Neurons / physiology
  • Psychomotor Performance
  • Reaction Time*
  • Visual Perception
  • Young Adult