Humans adapt their anticipatory eye movements to the volatility of visual motion properties

PLoS Comput Biol. 2020 Apr 13;16(4):e1007438. doi: 10.1371/journal.pcbi.1007438. eCollection 2020 Apr.

Abstract

Animal behavior constantly adapts to changes, for example when the statistical properties of the environment change unexpectedly. For an agent interacting with such a volatile setting, it is important to react both accurately and as quickly as possible. It has already been shown that when a random sequence of motion ramps of a visual target is biased toward one direction (e.g. right or left), human observers adapt their eye movements to accurately anticipate the target's expected direction. Here, we show that this ability extends to a volatile environment in which the probability bias can change at random switching times. In addition, we recorded the explicit prediction of the next outcome as reported by observers using a rating scale. Both results were compared to the estimates of a probabilistic agent that is optimal with respect to the assumed generative model. Compared to the classical leaky-integrator model, we found a better match between our probabilistic agent and the behavioral responses, both for the anticipatory eye movements and for the explicit task. Furthermore, by controlling the model's balance between exploitation and exploration, we were able to fit, for each individual's experimental dataset, the most likely level of volatility and to analyze inter-individual variability across participants. These results demonstrate that in such an unstable environment, human observers can still maintain an internal belief about the environmental contingencies, and use this representation both for sensorimotor control and for explicit judgments. This work offers an innovative approach to more generically test the diversity of human cognitive abilities in uncertain and dynamic environments.
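The leaky-integrator baseline mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the leak time constant `tau`, the function name, and the simulated bias switch are all assumptions chosen for the example. The estimator tracks the probability of a rightward motion ramp with exponential forgetting, which lets it follow a bias that switches at an unknown time, at the cost of noisier estimates when the bias is stable.

```python
import random

def leaky_estimate(outcomes, tau=10.0):
    """Running estimate of P(rightward) via a leaky integrator.

    outcomes: sequence of 1 (rightward ramp) or 0 (leftward ramp).
    tau: hypothetical leak time constant; the effective learning
         rate is 1/tau, so older outcomes are exponentially forgotten.
    """
    p_hat = 0.5          # start from an unbiased prior
    alpha = 1.0 / tau    # effective learning rate
    estimates = []
    for x in outcomes:
        p_hat += alpha * (x - p_hat)   # exponential moving average
        estimates.append(p_hat)
    return estimates

# Example: a volatile environment where the bias switches
# from p = 0.75 (mostly rightward) to p = 0.25 halfway through.
random.seed(1)
seq = ([int(random.random() < 0.75) for _ in range(100)]
       + [int(random.random() < 0.25) for _ in range(100)])
est = leaky_estimate(seq)
```

In contrast to this fixed-leak scheme, the probabilistic agent described in the abstract infers the switching structure itself, so its effective learning rate increases after a suspected change point and decreases during stable stretches.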

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Algorithms
  • Bayes Theorem
  • Cognition
  • Eye Movements*
  • Female
  • Humans
  • Male
  • Models, Statistical
  • Motion Perception*
  • Phenotype
  • Probability
  • Proportional Hazards Models
  • Psychomotor Performance
  • Pursuit, Smooth
  • Reaction Time
  • Vision, Ocular*
  • Young Adult

Grants and funding

This work was supported by EU Marie-Skłodowska-Curie Grant No 642961 (PACE-ITN / A. Montagnini and L. Perrinet as participants) and by the Fondation pour la Recherche Médicale, under the program Équipe FRM (DEQ20180339203/PredictEye/PI: G Masson/ A. Montagnini and L. Perrinet as participants). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.