Course control is critical for the acquisition of spatial information during exploration and navigation, and it is thought to rely on neural circuits that process locomotion-related multimodal signals. However, which circuits underlie this control, and how multimodal information contributes to it, remain poorly understood. We used virtual reality to examine the role of self-generated visual signals (visual feedback) in the control of exploratory walking in flies. Exploratory flies display two distinct motor contexts: one characterized by low walking speed and fast rotations, the other by high speed and slow rotations. Flies use visual feedback to control body rotations, but in a motor-context-specific manner, primarily when walking at high speed. Different populations of visual motion-sensitive cells estimate body rotations via congruent multimodal inputs and drive compensatory rotations. However, their effective contribution to course control is dynamically tuned by a speed-related signal. Our data identify visual networks with a multimodal circuit mechanism for adaptive course control and suggest models for how visual feedback is combined with internal signals to guide exploration.
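To make the proposed control scheme concrete, the sketch below is a minimal, hypothetical illustration, not the authors' fitted model, of speed-gated visual feedback: a noisy visual estimate of body rotation drives a compensatory turn command whose gain is scaled by forward walking speed. The function names, the sigmoidal form of the gain, and the parameter values (`v_half`, `slope`, `noise_sd`) are all assumptions made for illustration.

```python
import numpy as np

def visual_gain(forward_speed, v_half=5.0, slope=1.0):
    """Hypothetical sigmoidal gain: visual feedback contributes
    to steering mainly at high forward walking speed."""
    return 1.0 / (1.0 + np.exp(-slope * (forward_speed - v_half)))

def compensatory_rotation(body_rotation, forward_speed, noise_sd=0.1, rng=None):
    """Negative-feedback turn command opposing a noisy visual
    estimate of the fly's own body rotation (deg/s)."""
    rng = rng or np.random.default_rng()
    visual_estimate = body_rotation + rng.normal(0.0, noise_sd)
    return -visual_gain(forward_speed) * visual_estimate

# A 10 deg/s rotation is strongly countered at high speed,
# but largely ignored at low speed (the fast-rotation context).
print(compensatory_rotation(10.0, forward_speed=12.0))  # close to -10
print(compensatory_rotation(10.0, forward_speed=1.0))   # close to 0
```

In this toy form, the "speed-related signal" acts as a multiplicative gate on the visual feedback pathway, which is one simple way to capture motor-context-specific use of visual feedback; the abstract itself does not specify the functional form.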