RT Journal Article
SR Electronic
T1 Motor context coordinates visually guided walking in Drosophila
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 572792
DO 10.1101/572792
A1 Cruz, Tomás
A1 Fujiwara, Terufumi
A1 Varela, Nélia
A1 Mohammad, Farhan
A1 Claridge-Chang, Adam
A1 Chiappe, M Eugenia
YR 2019
UL http://biorxiv.org/content/early/2019/03/11/572792.abstract
AB Course control is critical for the acquisition of spatial information during exploration and navigation, and it is thought to rely on neural circuits that process locomotion-related multimodal signals. However, which circuits underlie this control, and how multimodal information contributes to it, remain poorly understood. We used Virtual Reality to examine the role of self-generated visual signals (visual feedback) in the control of exploratory walking in flies. Exploratory flies display two distinct motor contexts: one characterized by low speed and fast rotations, the other by high speed and slow rotations. Flies use visual feedback to control body rotations, but in a motor-context-specific manner, primarily when walking at high speed. Different populations of visual motion-sensitive cells estimate body rotations via congruent, multimodal inputs and drive compensatory rotations. However, their effective contribution to course control is dynamically tuned by a speed-related signal. Our data identify visual networks with a multimodal circuit mechanism for adaptive course control and suggest models for how visual feedback is combined with internal signals to guide exploratory course control.