RT Journal Article
SR Electronic
T1 Variability in training unlocks generalization in visual perceptual learning through invariant representations
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2022.08.26.505408
DO 10.1101/2022.08.26.505408
A1 Manenti, Giorgio L.
A1 Dizaji, Aslan Satary
A1 Schwiedrzik, Caspar M.
YR 2022
UL http://biorxiv.org/content/early/2022/11/22/2022.08.26.505408.abstract
AB Stimulus and location specificity have long been considered hallmarks of visual perceptual learning. This renders visual perceptual learning distinct from other forms of learning, where generalization can be more easily attained, and unsuitable for practical applications, where generalization is key. Based on hypotheses derived from the structure of the visual system, we test here whether stimulus variability can unlock generalization in perceptual learning. We train subjects in orientation discrimination while varying the amount of variability in a task-irrelevant feature, spatial frequency. We find that, independently of task difficulty, this manipulation enables generalization of learning to new stimuli and locations, without negatively affecting the overall amount of learning on the task. We then use deep neural networks to investigate how variability unlocks generalization. We find that networks develop invariance to the task-irrelevant feature when trained with variable inputs, and that the degree of learned invariance strongly predicts generalization. A reliance on invariant representations can explain variability-induced generalization in visual perceptual learning, suggests new targets for understanding the neural basis of perceptual learning in higher-order visual cortex, and offers an easy-to-implement modification of common training paradigms that may benefit practical applications.
Competing Interest Statement: The funders had no role in study design, data collection and interpretation, decision to publish, or preparation of the manuscript. ASD is a founder of Neuro-Inspired Vision and a member of its scientific advisory board. GLM and CMS declare no competing financial interests.