Abstract
Although primates are primarily visual animals, how visual information is processed on its way to memory structures, and how it contributes to the generation of visuospatial behaviors, is poorly understood. Recent imaging data demonstrate the existence of scene-sensitive areas in the dorsal visual pathway that are likely to combine visual information from successive egocentric views, while behavioral evidence indicates that the surrounding visual space is stored in memory in extraretinal coordinates. The present work focuses on the computational nature of a panoramic representation that is proposed to link visual and mnemonic functions during natural behavior. In a spiking neural network model of the dorsal visual pathway, it is shown how the temporal integration of spatial views can give rise to such a representation, and how that representation can subsequently support memory-based spatial reorientation and visual search. More generally, the model predicts a common role for view-based allocentric memory storage in spatial and non-spatial mnemonic behaviors.
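To make the proposed mechanism concrete, the following is a minimal sketch (in Python/NumPy, not the paper's spiking implementation) of how successive egocentric views, indexed by head direction, could be leakily integrated over time into a panoramic allocentric representation and then matched against a current view for reorientation. The one-dimensional scene, field of view, and all parameter values are illustrative assumptions.

```python
import numpy as np

# A minimal sketch, not the paper's spiking model: successive egocentric
# views are written into allocentric coordinates (gated by head direction)
# and leakily integrated into a panoramic representation, which is later
# matched against a novel view to recover heading. All parameters are
# illustrative assumptions.

N = 360          # panoramic cells, one per degree of allocentric azimuth
FOV = 90         # simulated field of view (degrees)
TAU = 20.0       # integration time constant (timesteps)

rng = np.random.default_rng(0)
scene = rng.random(N)        # toy 1-D "panorama" of luminance values

def view_indices(heading_deg):
    """Allocentric azimuth bins visible from a given heading."""
    return (np.arange(-FOV // 2, FOV // 2) + int(heading_deg)) % N

panorama = np.zeros(N)       # time-integrated allocentric representation

# The agent sweeps its gaze around; each egocentric view drives only the
# currently visible allocentric bins, which integrate it leakily.
for heading in np.linspace(0.0, 359.0, 240):
    idx = view_indices(heading)
    panorama[idx] += (scene[idx] - panorama[idx]) / TAU

# Memory-based reorientation: recover an unknown heading by sliding the
# current egocentric view across the stored panorama, taking the best match.
true_heading = 137
current_view = scene[view_indices(true_heading)]
scores = [current_view @ panorama[view_indices(h)] for h in range(N)]
print("estimated heading:", int(np.argmax(scores)))   # ~137
```

The property illustrated is that the panorama outlives any single view, so a view seen from an unknown orientation can be localized against it; the matching step stands in for the memory-based reorientation and visual search behaviors described in the abstract.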