Summary
Minimising spatial uncertainty is essential for navigation, but the underlying neural mechanisms remain elusive. First, we show that polarising cues produce an anisotropy in the information available from movement trajectories. Second, we simulate entorhinal grid cells in an environment with anisotropic information and show that self-location is decoded best when grid patterns are aligned with the axis of greatest information. Third, we expose human participants to polarised virtual reality environments and confirm the predicted anisotropy in navigation performance and eye movements. Finally, using fMRI we find that the orientation of grid-like hexadirectional activity in entorhinal cortex is aligned with the environmental axis of greatest information, and that this alignment predicts the anisotropy of participants’ spatial memory. In sum, we demonstrate a crucial role for the entorhinal grid system in reducing uncertainty in the neural representation of self-location and find evidence for adaptive spatial computations underlying entorhinal representations in the service of optimising behaviour.
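To make the simulation step concrete, the sketch below is a purely illustrative, minimal Python example and not the paper’s actual model: it decodes self-location by maximum a posteriori estimation from a population of idealised Poisson grid cells, combined with an anisotropic cue-based position estimate that is precise along one environmental axis and imprecise along the other. All parameters (arena size, grid spacing, firing rates, cue precision, number of cells and trials) are assumptions chosen for illustration; decoding error along the two axes can then be compared across grid orientations.

```python
import numpy as np

rng = np.random.default_rng(0)

def grid_rate(pos, orientation, spacing=0.5, phase=(0.0, 0.0), r_max=10.0):
    # Idealised grid-cell rate map: sum of three plane waves 60 degrees apart,
    # rescaled to lie between 0 and r_max (illustrative, not the paper's model).
    angles = orientation + np.array([0.0, np.pi / 3, 2 * np.pi / 3])
    k = (4 * np.pi / (np.sqrt(3) * spacing)) * np.stack(
        [np.cos(angles), np.sin(angles)], axis=1)
    g = np.cos((np.asarray(pos) - np.asarray(phase)) @ k.T).sum(axis=-1)  # range [-1.5, 3]
    return r_max * (g + 1.5) / 4.5

def mean_decoding_error(grid_orientation, cue_sigma=(0.05, 0.60),
                        n_cells=60, dt=0.5, n_trials=400, arena=2.0):
    # MAP decoding of 2D self-location from Poisson spike counts of a grid-cell
    # population, combined with an anisotropic cue-based estimate (precise along x,
    # imprecise along y). Returns mean absolute decoding error along x and y.
    phases = rng.uniform(0.0, 0.5, size=(n_cells, 2))
    xs = np.linspace(0.0, arena, 41)
    cand = np.stack(np.meshgrid(xs, xs, indexing="ij"), axis=-1).reshape(-1, 2)
    lam = np.stack([grid_rate(cand, grid_orientation, phase=p) for p in phases], axis=1) * dt
    cue_sigma = np.asarray(cue_sigma)
    err = np.zeros((n_trials, 2))
    for t in range(n_trials):
        true_pos = rng.uniform(0.0, arena, size=2)
        counts = rng.poisson(np.array([grid_rate(true_pos, grid_orientation, phase=p)
                                       for p in phases]) * dt)
        cue_obs = true_pos + rng.normal(0.0, cue_sigma)            # anisotropic cue information
        log_prior = -0.5 * (((cand - cue_obs) / cue_sigma) ** 2).sum(axis=1)
        log_lik = (counts * np.log(lam + 1e-9) - lam).sum(axis=1)  # Poisson log-likelihood
        est = cand[np.argmax(log_lik + log_prior)]
        err[t] = np.abs(est - true_pos)
    return err.mean(axis=0)

# Compare decoding error when the grid lattice is aligned (0 deg) versus
# misaligned (30 deg) with the axis of greatest cue information.
for ori_deg in (0, 30):
    ex, ey = mean_decoding_error(np.deg2rad(ori_deg))
    print(f"grid orientation {ori_deg:>2} deg: mean |error| x = {ex:.3f} m, y = {ey:.3f} m")
```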