PT - JOURNAL ARTICLE
AU - Jakub Limanowski
AU - Karl Friston
TI - Active inference under intersensory conflict: Simulation and empirical results
AID - 10.1101/795419
DP - 2019 Jan 01
TA - bioRxiv
PG - 795419
4099 - http://biorxiv.org/content/early/2019/10/07/795419.short
4100 - http://biorxiv.org/content/early/2019/10/07/795419.full
AB - It has been suggested that the brain controls hand movements via internal models that rely on visual and proprioceptive cues about the state of the hand. In active inference formulations of such models, the relative influence of each modality on action and perception is determined by how precise (reliable) it is expected to be. The ‘top-down’ affordance of expected precision to a particular sensory modality presumably corresponds to attention. Here, we asked whether increasing attention to (i.e., the precision of) vision or proprioception would enhance performance in a hand-target phase matching task, in which visual and proprioceptive cues about hand posture were incongruent. We show that in a simple simulated agent—using a neurobiologically informed predictive coding formulation of active inference—increasing either modality’s expected precision improved task performance under visuo-proprioceptive conflict. Moreover, we show that this formulation captured the behaviour and self-reported attentional allocation of human participants performing the same task in a virtual reality environment. Together, our results show that selective attention can balance the impact of (conflicting) visual and proprioceptive cues on action—rendering attention a key mechanism for a flexible body representation for action.
Author summary: When controlling hand movements, the brain can rely on seen and felt hand position or posture information. It is thought that the brain combines these estimates into a multisensory hand representation in a probabilistic fashion, accounting for how reliable each estimate is in the given context. According to recent formal accounts of action, the expected reliability or ‘precision’ of sensory information can—to an extent—also be influenced by attention. Here, we tested whether this mechanism can improve goal-directed behaviour. We designed a task that required tracking a target’s oscillatory phase with either the seen or the felt hand posture, which were decoupled by introducing a temporal conflict via a virtual reality environment. We first simulated the behaviour of an artificial agent performing this task, and then compared the simulation results to behaviour of human participants performing the same task. Together, our results showed that increasing attention to the seen or felt hand was accompanied by improved target tracking. This suggests that, depending on the current behavioural demands, attention can balance how strongly the multisensory hand representation is relying on visual or proprioceptive sensory information.
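
The mechanism at the heart of the abstract is precision-weighted combination of visual and proprioceptive cues, where attention raises a modality's expected precision and thereby its influence on the posture estimate. The following is a minimal sketch of that idea in Python, assuming a single one-dimensional posture variable and hand-picked precision values; it illustrates precision weighting in a generic predictive-coding-style gradient update and is not the authors' generative model or task simulation.

# Illustrative sketch: precision-weighted visuo-proprioceptive integration.
# All variable names, values, and the update rule are assumptions chosen for
# illustration; they do not reproduce the paper's model.

def estimate_posture(x_vis, x_prop, pi_vis, pi_prop, mu0=0.0, lr=0.05, n_steps=500):
    """Gradient descent on precision-weighted prediction errors for one posture variable."""
    mu = mu0
    for _ in range(n_steps):
        eps_vis = pi_vis * (x_vis - mu)     # visual prediction error, weighted by its precision
        eps_prop = pi_prop * (x_prop - mu)  # proprioceptive prediction error, weighted likewise
        mu += lr * (eps_vis + eps_prop)     # move the estimate to reduce the weighted errors
    return mu

# Conflicting cues: seen hand posture at 10 (arbitrary units), felt posture at 0.
x_vis, x_prop = 10.0, 0.0

print(estimate_posture(x_vis, x_prop, pi_vis=1.0, pi_prop=1.0))  # ~5.0  (cues balanced)
print(estimate_posture(x_vis, x_prop, pi_vis=4.0, pi_prop=1.0))  # ~8.0  ('attend to vision')
print(estimate_posture(x_vis, x_prop, pi_vis=1.0, pi_prop=4.0))  # ~2.0  ('attend to proprioception')

The update converges to the precision-weighted average (pi_vis * x_vis + pi_prop * x_prop) / (pi_vis + pi_prop), so raising either precision pulls the estimate toward that modality's cue, which is the balancing role the abstract attributes to attention.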