RT Journal Article
SR Electronic
T1 Modeling human eye movements during immersive visual search
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2022.12.01.518717
DO 10.1101/2022.12.01.518717
A1 Radulescu, Angela
A1 van Opheusden, Bas
A1 Callaway, Frederick
A1 Griffiths, Thomas L.
A1 Hillis, James M.
YR 2022
UL http://biorxiv.org/content/early/2022/12/01/2022.12.01.518717.abstract
AB The nature of eye movements during visual search has been widely studied in psychology and neuroscience. Virtual reality (VR) paradigms provide an opportunity to test whether computational models of search can predict naturalistic search behavior. However, existing ideal observer models are constrained by strong assumptions about the structure of the world, rendering them impractical for modeling the complexity of environments that can be studied in VR. To address these limitations, we frame naturalistic visual search as a problem of allocating limited cognitive resources, formalized as a meta-level Markov decision process (meta-MDP) over a representation of the environment encoded by a deep neural network. We train reinforcement learning agents to solve the meta-MDP, showing that the agents’ optimal policy converges to a classic ideal observer model of search developed for simplified environments. We compare the learned policy with human gaze data from a visual search experiment conducted in VR, finding a qualitative and quantitative correspondence between model predictions and human behavior. Our results suggest that gaze behavior in naturalistic visual search is consistent with rational allocation of limited cognitive resources.
Competing Interest Statement: The authors have declared no competing interest.