Among their many dismounted roles, U.S. Air Force Battlefield Airmen must navigate unfamiliar environments with many potential threats while performing their mission objectives. The effectiveness of navigating with a map and compass is compromised in reduced-visibility conditions, such as in fog or during nighttime operations. Moreover, focusing on a map draws attention away from the immediate surroundings and reduces the ability to detect threats. To ameliorate this problem, we prototyped auditory and tactile navigation displays driven entirely by a mobile phone's built-in GPS and compass. The auditory and tactile displays direct users toward their destination with 3D audio and vibrotactile cues, respectively. We evaluated the navigation displays on a waypoint navigation task. Results suggest that auditory and tactile displays can guide users to their destination as effectively as a visual display (i.e., a GPS-enabled map). Initial findings justify further development of multimodal navigation displays to increase the efficiency of Battlefield Airmen in land navigation tasks.
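The cueing described above reduces to two computations: the bearing from the phone's GPS fix to the next waypoint, and that bearing expressed relative to the compass heading so a cue can be rendered "ahead," "left," and so on. A minimal sketch of those two steps is below; the function names are illustrative, not taken from the prototype.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def relative_bearing(heading_deg, target_bearing_deg):
    """Waypoint direction relative to the user's facing, in [-180, 180).
    Negative values are to the user's left, positive to the right."""
    return (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
```

In such a system, the relative bearing would be recomputed on each GPS/compass update and used to position the 3D audio source or select a vibrotactile cue direction.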
BACKGROUND
Waypoint navigation is a critical task for dismounted soldiers, especially when navigating through novel environments with potential threats. In these dangerous environments, soldiers should keep their "eyes-up" and "ears-out," scanning the environment for critical signals. Current practice for dismounted soldiers is to navigate with a compass and map or a small wearable computer. In this experiment, we compared waypoint navigation performance across several modalities and their combinations: two visual displays (an egocentric and a geocentric map), 3D spatialized audio, a tactile display, and multimodal combinations of each. We also examined individual differences in sense of direction as a potential moderator of display usage. Results provide preliminary evidence that 3D spatialized audio and haptic navigation aids are an intuitive, efficient, and effective means of waypoint navigation, regardless of sense of direction.
Loss of awareness of one's immediate surroundings can have devastating results when navigating. For instance, military operators must often navigate unfamiliar environments and must be able to detect nearby threats to survive. Visual displays such as paper or digital maps draw visual attention away from the environment. We developed a navigation display that guides a user through a series of waypoints by playing a 3D audio tone over headphones or vibrating a tactor on an array worn around the torso. We evaluated the navigation display by having participants navigate through 32 waypoints in an open field. In addition to the auditory and vibrotactile cues, we considered an analog visual cue, an allocentric map, and an egocentric map. Participants were able to reach all waypoints in every condition. Results suggest that participants reached waypoints fastest with the egocentric map and that they were slightly faster with the auditory cue than with the vibrotactile cue. Subjective workload and usability questionnaires indicated that the auditory and vibrotactile conditions imposed low mental demand and were rated highly usable. These results support the development of eyes-free mobile navigation tools.
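A torso-worn tactor array like the one described renders direction by discretizing the waypoint's relative bearing into sectors, one per tactor. The sketch below assumes an eight-tactor belt with tactor 0 facing forward and indices increasing clockwise; the tactor count and layout are illustrative assumptions, not details reported in the study.

```python
def tactor_index(rel_bearing_deg, n_tactors=8):
    """Map a relative bearing (degrees, negative = left) to the nearest
    tactor in a circular torso array. Assumes tactor 0 faces forward and
    indices increase clockwise, with tactors evenly spaced."""
    sector = 360.0 / n_tactors  # angular width covered by each tactor
    return int(round((rel_bearing_deg % 360.0) / sector)) % n_tactors
```

With eight tactors, a waypoint 90 degrees to the right activates tactor 2 (the right hip), and one directly behind activates tactor 4, so the user can turn toward the vibration without looking at any display.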