RT Journal Article
SR Electronic
T1 Short-term effects of sound localization training in virtual reality
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 207753
DO 10.1101/207753
A1 Mark A. Steadman
A1 Chungeun Kim
A1 Jean-Hugues Lestang
A1 Dan F. M. Goodman
A1 Lorenzo Picinali
YR 2019
UL http://biorxiv.org/content/early/2019/03/19/207753.abstract
AB Head-related transfer functions (HRTFs) capture the direction-dependent way that sound interacts with the head and torso. In virtual audio systems, which aim to emulate these effects, non-individualized, generic HRTFs are typically used, leading to inaccurate virtual sound localization. Training has the potential to exploit the brain's ability to adapt to these unfamiliar cues. In this study, three virtual sound localization training paradigms were evaluated: one provided simple visual positional confirmation of sound source location, a second introduced game design elements ("gamification"), and a final version additionally utilized head-tracking to provide listeners with experience of relative sound source motion ("active listening"). The results demonstrate a significant effect of training after a small number of short (12-minute) training sessions, which is retained across multiple days. Gamification alone had no significant effect on the efficacy of the training, but the inclusion of active listening resulted in a significantly greater improvement in virtual sound localization accuracy. Improvements in polar angle judgement were significantly larger for the trained HRTFs, while improvements in lateral judgements and front-back reversals also generalized to a second set of HRTFs, for which no positional feedback was given. The implications of this for the putative mechanisms of the adaptation process are discussed.