“…Several engineering initiatives have made use of sensory-guided navigation to control autonomous vehicles (Baker et al., 2014; Conte and Doherty, 2008; Smith et al., 2013; Steckel and Peremans, 2017; Strydom et al., 2014) or to create devices that help visually impaired individuals move safely within their environment (Filipe et al., 2012; Katzschmann et al., 2018; Lee and Medioni, 2011). While some of these systems use patterns of light, such as optic flow, to process information from the environment (Conte and Doherty, 2008; Strydom et al., 2014), recent work in sonar-based navigation has incorporated acoustic flow cues to automatically steer unmanned vehicles through complex corridors (Baker et al., 2014; Peremans and Steckel, 2014; Smith et al., 2014; Steckel and Peremans, 2017; Vanderelst et al., 2016). Most acoustic-based navigation devices have been tested in environments containing large objects or flat surfaces, and it would be interesting to test the behavior of these systems in environments that create echo flow patterns similar to those presented here.…”