This paper summarizes recent developments in audio- and tactile-feedback-based assistive technologies targeting the blind community. Current technology allows applications to be efficiently distributed and run on mobile and handheld devices, even in cases where computational requirements are significant. As a result, electronic travel aids, navigational assistance modules, text-to-speech applications, and virtual audio displays that combine audio with haptic channels are becoming integrated into standard mobile devices. This trend, combined with the appearance of increasingly user-friendly interfaces and modes of interaction, has opened a variety of new perspectives for the rehabilitation and training of users with visual impairments. The goal of this paper is to provide an overview of these developments based on recent advances in basic research and application development. Using this overview as a foundation, an agenda is outlined for future research in mobile interaction design with respect to users with special needs, as well as, ultimately, in relation to sensor-bridging applications in general.
Introduction: As the number of people with visual impairments continues to increase, rehabilitation and engineering researchers have identified the need to design sensory substitution devices that offer assistance and guidance to blind users performing navigational tasks. Auditory and haptic cues have been shown to be effective in creating a rich spatial representation of the environment, and they are therefore being incorporated into assistive tools that enable people with visual impairments to acquire knowledge of the surrounding space in a way that approaches the visual perception of sighted individuals. However, achieving efficiency with a sensory substitution device requires extensive training, as blind users must learn to process the artificial auditory cues and convert them into spatial information. Methods: Considering the potential advantages that game-based learning can provide, we propose a new method for training the sound localization and virtual navigation skills of people with visual impairments in a 3D audio game with hierarchical levels of difficulty. The training procedure follows a multimodal (auditory and haptic) learning approach in which subjects were asked to listen to 3D sounds while simultaneously perceiving a series of vibrations on a haptic headband corresponding to the direction of the sound source in space. Results: The results obtained in a sound localization experiment with 10 visually impaired participants showed that the proposed training strategy produced significant improvements in the subjects' auditory performance and navigation skills, ensuring behavioral gains in their spatial perception of the environment.
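The multimodal training described above pairs each 3D sound with a vibration at the matching position on a circular haptic headband. A minimal sketch of one way such a direction-to-actuator mapping could work is shown below; the function name, the motor count, and the angle convention (0° straight ahead, positive clockwise) are illustrative assumptions, not details taken from the study.

```python
def azimuth_to_motor(azimuth_deg: float, n_motors: int = 8) -> int:
    """Map a sound-source azimuth (degrees; 0 = straight ahead,
    positive = clockwise) to the index of the nearest vibration
    motor on a headband with n_motors equally spaced actuators.

    Hypothetical helper for illustration; the actual hardware layout
    in the study is not specified.
    """
    # Normalize the azimuth into [0, 360)
    az = azimuth_deg % 360.0
    # Angular width of the sector each motor covers
    sector = 360.0 / n_motors
    # Round to the nearest motor index, wrapping at 360 degrees
    return int(round(az / sector)) % n_motors
```

With 8 motors, each covers a 45° sector, so a source at 90° (directly right) maps to motor 2 and one at -45° (front-left) wraps to motor 7.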
Measurements were conducted using a navigational application on an Android smartphone that provides auditory and haptic feedback based on electromagnetic sensor data (compass) in order to help users walk in a straight line. Blindfolded sighted subjects attempted to walk along a 40-meter path with and without navigational assistance. Results showed that optimal settings for accuracy, sensitivity, and target direction on the device can significantly reduce veering. Further, even a short training session consisting of four trials led to better subsequent performance without navigational assistance.
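The core of such a veering-reduction aid is comparing the compass heading against the target direction and cueing a correction only when the deviation exceeds a sensitivity threshold. A minimal sketch of that logic follows; the function names, the 10° default threshold, and the textual cues are assumptions for illustration, not the app's actual parameters.

```python
def signed_deviation(heading_deg: float, target_deg: float) -> float:
    """Signed angular difference between the current compass heading and
    the target direction, in degrees in (-180, 180].
    Negative means the walker has drifted left, positive means right."""
    diff = (heading_deg - target_deg + 180.0) % 360.0 - 180.0
    return 180.0 if diff == -180.0 else diff

def corrective_cue(heading_deg: float, target_deg: float,
                   sensitivity_deg: float = 10.0) -> str:
    """Discrete feedback decision: stay silent inside the dead zone set
    by the sensitivity threshold, otherwise cue a left or right turn.
    A real app would render these cues as sound or vibration patterns."""
    dev = signed_deviation(heading_deg, target_deg)
    if abs(dev) <= sensitivity_deg:
        return "on course"
    return "turn left" if dev > 0 else "turn right"
```

The wrap-around arithmetic in `signed_deviation` matters here: a naive subtraction would report a 340° error when a walker aiming at 0° drifts to a heading of 350°, rather than the actual 10° leftward drift. The `sensitivity_deg` dead zone plays the role of the sensitivity setting the study found worth tuning.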