The paper reports on the ability of people to rapidly adapt in localizing virtual sound sources in both azimuth and elevation when listening to sounds synthesized using non-individualized head-related transfer functions (HRTFs). Participants were placed within an audio-kinesthetic Virtual Auditory Environment (VAE) platform that allows association of the physical position of a virtual sound source with an alternate set of acoustic spectral cues through the use of a tracked physical ball manipulated by the subject. This set-up offers a natural perception-action coupling that is not limited to the visual field of view. The experiment consisted of three sessions: an initial localization test to evaluate participants' performance, an adaptation session, and a subsequent localization test. A reference control group using individually measured HRTFs was included. Results show significant improvement in localization performance. Relative to the control group, participants using non-individual HRTFs reduced localization errors in elevation by 10° after three 12-min sessions. No significant improvement was found for azimuthal errors or for single-session adaptation.
In the context of binaural audio rendering, choosing the best head-related transfer function (HRTF) for an individual from large databases poses several problems. This study proposes a method to reduce the size of a given HRTF database. Participants, 45 in total, were asked to rate the quality of binaural synthesis for 46 HRTFs. The lack of reciprocity in the ratings was noted. Results were used to create a perceptually optimized HRTF subset which satisfied all participants' judgments. The subset was validated using localization tests on a separate group of subjects with results showing reduced errors when subjects were given their best choice, rather than their worst choice HRTF.
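The binaural synthesis referred to in the two abstracts above amounts to filtering a mono source signal with the left- and right-ear head-related impulse responses (HRIRs) for the desired direction. A minimal sketch, using placeholder HRIRs rather than measured ones (the function name and toy filters are illustrative, not from either paper):

```python
import numpy as np

def binaural_render(mono, hrir_left, hrir_right):
    """Render a mono signal at one virtual position by convolving it
    with the left- and right-ear head-related impulse responses."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)  # shape: (2, N + taps - 1)

# Toy example: 1 s of a 1 kHz tone at 44.1 kHz, with 128-tap
# placeholder HRIRs (identity for the near ear, a delayed and
# attenuated impulse for the far ear).
fs = 44100
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
hrir_l = np.zeros(128); hrir_l[0] = 1.0   # near ear: direct path
hrir_r = np.zeros(128); hrir_r[20] = 0.7  # far ear: interaural delay + level difference
stereo = binaural_render(tone, hrir_l, hrir_r)
```

Individualized rendering replaces the placeholder filters with HRIRs measured (or selected, as in the database-reduction study) for the specific listener.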
It has long been suggested that sound plays a role in the postural control process. Few studies, however, have explored sound and posture interactions. The present paper focuses on the specific impact of audition on posture, seeking to determine the attributes of sound that may be useful for postural purposes. We investigated the postural sway of young, healthy blindfolded subjects in two experiments involving different static auditory environments. In the first experiment, we compared the effect on sway of a simple environment built from three static sound sources in two different rooms: a normal vs. an anechoic room. In the second experiment, the same auditory environment was enriched in various ways, including the ambisonic synthesis of an immersive environment, and subjects stood on two different surfaces: a foam vs. a normal surface. The results of both experiments suggest that the spatial cues provided by sound can be used to improve postural stability. The richer the auditory environment, the better this stabilization. We interpret these results by invoking the “spatial hearing map” theory: listeners build their own mental representation of their surrounding environment, which provides them with spatial landmarks that help them to better stabilize.
This article aims to reveal the efficiency of sonification strategies in terms of rapidity, precision and overshooting in the case of a one-dimensional guidance task. The sonification strategies are based on the four main perceptual attributes of a sound (i.e. pitch, loudness, duration/tempo and timbre) and classified with respect to the presence or absence of one or several auditory references. Perceptual evaluations are used to display the strategies in a precision/rapidity space and enable prediction of user behavior for a chosen sonification strategy. The evaluation of sonification strategies constitutes a first step toward general guidelines for sound design in interactive multimedia systems that involve guidance issues.
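A pitch-based strategy of the kind described above maps the guidance error onto frequency relative to an auditory reference, so that reaching the target is heard as arriving at the reference pitch. A minimal sketch under assumed parameters (the function name, 440 Hz reference, and one-octave range are illustrative choices, not taken from the article):

```python
def error_to_pitch(error, f_ref=440.0, octaves=1.0):
    """Map a normalized guidance error in [-1, 1] to a frequency in Hz.

    error = 0 yields the reference pitch f_ref (the auditory reference
    for the target); positive/negative errors shift the pitch up/down
    by up to `octaves` octaves on a logarithmic (musical) scale.
    """
    error = max(-1.0, min(1.0, error))  # clamp out-of-range errors
    return f_ref * 2.0 ** (octaves * error)
```

Mapping on a log-frequency scale matches pitch perception, so equal error steps are heard as equal pitch intervals.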
Postural control is known to be the result of the integration and processing of various sensory inputs by the central nervous system. Among the various afferent inputs, the role of auditory information in postural regulation has been addressed in relatively few studies, which led to conflicting results. The purpose of the present study was to investigate the influence of a rotating auditory stimulus, delivered by an immersive 3D sound spatialization system, on the standing posture of young subjects. The postural sway of 20 upright, blindfolded subjects was recorded using a force platform. Use of various sound source rotation velocities followed by sudden immobilization of the sound was compared with two control conditions: no sound and a stationary sound source. The experiment showed that subjects reduced their body sway amplitude and velocity in the presence of rotating sound compared with the control conditions. The faster the sound source was rotating, the greater the reduction in subject body sway. Moreover, disruption of subject postural regulation was observed as soon as the sound source was immobilized. These results suggest that auditory information cannot be neglected in postural control and that it acts as additional information influencing postural regulation.
Finding one's way to an unknown destination, navigating complex routes, finding inanimate objects: these are all tasks that can be challenging for the visually impaired. The project NAVIG (Navigation Assisted by artificial VIsion and GNSS) is directed towards increasing the autonomy of visually impaired users in known and unknown environments, exterior and interior, large scale and small scale, through a combination of a Global Navigation Satellite System (GNSS) and rapid visual recognition with which the precise position of the user can be determined. Relying on geographical databases and visually identified objects, the user is guided to his or her desired destination through spatialized semantic audio rendering, always maintained in the head-centered reference frame. This paper presents the overall project design and architecture of the NAVIG system. In addition, details of the new type of detection and localization device are presented in relation to guidance directives developed through participative design with potential users and educators for the visually impaired. A fundamental concept in this project is the belief that this type of assistive device is able to solve one of the major problems faced by the visually impaired: their difficulty in localizing specific objects.