PT - JOURNAL ARTICLE
AU - Tomás Cruz
AU - Terufumi Fujiwara
AU - Nélia Varela
AU - Farhan Mohammad
AU - Adam Claridge-Chang
AU - M Eugenia Chiappe
TI - Motor context coordinates visually guided walking in Drosophila
AID - 10.1101/572792
DP - 2019 Jan 01
TA - bioRxiv
PG - 572792
4099 - http://biorxiv.org/content/early/2019/03/11/572792.short
4100 - http://biorxiv.org/content/early/2019/03/11/572792.full
AB - Course control is critical for the acquisition of spatial information during exploration and navigation, and it is thought to rely on neural circuits that process locomotion-related multimodal signals. However, which circuits underlie this control, and how multimodal information contributes to the control system, remain poorly understood. We used Virtual Reality to examine the role of self-generated visual signals (visual feedback) in the control of exploratory walking in flies. Exploratory flies display two distinct motor contexts, characterized either by low speed and fast rotations or by high speed and slow rotations. Flies use visual feedback to control body rotations, but in a motor-context-specific manner, primarily when walking at high speed. Distinct populations of visual motion-sensitive cells estimate body rotations via congruent, multimodal inputs and drive compensatory rotations. However, their effective contribution to course control is dynamically tuned by a speed-related signal. Our data identify visual networks with a multimodal circuit mechanism for adaptive course control and suggest models for how visual feedback is combined with internal signals to guide exploratory course control.