TY  - JOUR
T1  - Hierarchical temporal prediction captures motion processing from retina to higher visual cortex
JF  - bioRxiv
DO  - 10.1101/575464
SP  - 575464
AU  - Yosef Singer
AU  - Ben D. B. Willmore
AU  - Andrew J. King
AU  - Nicol S. Harper
Y1  - 2019/01/01
UR  - http://biorxiv.org/content/early/2019/03/21/575464.abstract
N2  - Visual neurons respond selectively to features that become increasingly complex in their form and dynamics from the eyes to the cortex. These features take specific forms: retinal neurons prefer localized flashing dots [1], primary visual cortical (V1) neurons prefer moving bars [2-4], and those in higher cortical areas, such as middle temporal (MT) cortex, favor complex features like moving textures [5-7]. Whether there are general principles behind this diverse complexity of response properties in the visual system has been an area of intense investigation. To date, no single normative model has been able to account for the hierarchy of tuning to dynamic inputs along the visual pathway. Here we show that hierarchical temporal prediction - representing features that efficiently predict future sensory input from past sensory input [8-11] - can explain how neuronal tuning properties, particularly those relating to motion, change from retina to higher visual cortex. In contrast to some other approaches [12-16], the temporal prediction framework learns to represent features of unlabeled and dynamic stimuli, an essential requirement of the real brain. This suggests that the brain may not have evolved to efficiently represent all incoming stimuli, as implied by some leading theories. Instead, the selective representation of sensory features that help in predicting the future may be a general coding principle for extracting temporally-structured features that depend on increasingly high-level statistics of the visual input.
ER  - 