2020
DOI: 10.1101/2020.03.19.998906
Preprint
SPHERE: A novel approach to 3D and active sound localization

Abstract: In everyday life, localizing a sound source in free field entails more than the sole extraction of monaural and binaural auditory cues to define its location in three dimensions (azimuth, elevation, and distance). In spatial hearing, we also take into account all the available visual information (e.g., cues to sound position, cues to the structure of the environment), and we resolve perceptual ambiguities through active listening behavior, exploring the auditory environment with head and/or body movements. …

Cited by 2 publications (3 citation statements)
References 67 publications (91 reference statements)
“…The software is designed to guide the experimenter to align the real loudspeaker (i.e., the sound source) with a set of pre-determined positions defined in the 3D virtual environment in each trial. This method combining virtual reality and kinematic tracking to measure sound localization abilities has been developed in our laboratory [ 37 ] and has been already adopted in previous studies [ 25 , 38 ].…”
Section: Methods
confidence: 99%
“…This task was carried out entirely in VR, always using real sounds delivered in free field from predetermined positions computed on each trial based on the initial head position (Coudert et al. 2022; Valzolgher et al. 2020a,c). Our VR approach brings real sounds into visual VR and coordinates everything using Unity (see the Materials in Supplemental Digital Content 2, http://links.lww.com/EANDH/B43 for details about the apparatus and also Gaveau et al. 2020). Participants were immersed in a virtual room that matched the size of the real one but was devoid of any objects.…”
Section: Head-pointing Sound Localization Task
confidence: 99%
“…When the correct posture was reached, the fixation cross turned from white to blue. In the meanwhile, the experimenter placed the speaker in one of the possible eight predetermined positions, using visual indications provided on a dedicated monitor (see Verdelet et al. 2019 and Gaveau et al. 2020 for a validation of this method and Valzolgher et al. 2020a,c for examples). The predetermined positions (shown in the top part of Fig.…”
Section: Head-pointing Sound Localization Task
confidence: 99%