2019
DOI: 10.1109/tvcg.2019.2898787

Auditory Feedback for Navigation with Echoes in Virtual Environments: Training Procedure and Orientation Strategies

Abstract: Fig. 1: The virtual environment used for training human echolocation resembled a dark virtual cave (left panel). Test participants performed a navigation task which consisted of finding the exit of a tunnel to the opening of the cave (right panel, 1-4 are photographs in sequence of a trial) with different types of unimodal auditory or visual feedback. Real-time auralization was designed within the Steam Audio engine and delivered through headphones. An Oculus Rift and Touch controller supported the navigati…
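
As context for the setup described in the abstract, the short sketch below (plain NumPy, not the Steam Audio pipeline actually used in the paper) illustrates the core idea behind echo-based auditory feedback: a self-produced click plus one delayed, attenuated reflection from a virtual wall, delivered binaurally with a crude interaural time/level difference model. The wall distance, azimuth, head radius, and ITD/ILD formulas are illustrative assumptions only.

# Minimal NumPy sketch (illustrative only, not the paper's Steam Audio pipeline):
# a self-produced click plus one delayed, attenuated echo from a virtual wall,
# delivered binaurally with a crude ITD/ILD model. Wall distance, azimuth and
# head radius are assumed values.
import numpy as np

FS = 44100           # sample rate in Hz
C = 343.0            # speed of sound in m/s (assumed)

def click(duration_s=0.005):
    """Short windowed noise burst standing in for a mouth click."""
    n = int(FS * duration_s)
    burst = np.random.randn(n) * np.hanning(n)
    return burst / np.max(np.abs(burst))

def auralize(sig, wall_distance_m=3.0, wall_azimuth_deg=30.0,
             head_radius_m=0.0875, total_s=0.5):
    """Return a (2, N) stereo buffer: direct click + one first-order echo."""
    out = np.zeros((2, int(FS * total_s)))
    out[:, :len(sig)] += sig                         # direct sound, both ears

    delay_s = 2.0 * wall_distance_m / C              # to the wall and back
    gain = 1.0 / max(2.0 * wall_distance_m, 1.0)     # simple 1/r attenuation

    az = np.radians(wall_azimuth_deg)
    itd_s = head_radius_m / C * (az + np.sin(az))    # Woodworth-style ITD
    right_level = 0.5 + 0.5 * np.sin(az)             # crude ILD weight, 0..1

    ears = [(itd_s if az > 0 else 0.0, 1.0 - right_level),   # left ear
            (0.0 if az > 0 else itd_s, right_level)]         # right ear
    for ear, (extra_delay, level) in enumerate(ears):
        start = int((delay_s + extra_delay) * FS)
        end = min(start + len(sig), out.shape[1])
        out[ear, start:end] += gain * level * sig[:end - start]
    return out

stereo = auralize(click())    # e.g. play with sounddevice.play(stereo.T, FS)

A longer wall distance lengthens the click-to-echo gap and weakens the echo, which is the distance cue such feedback is meant to convey.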

Cited by 8 publications (6 citation statements) | References 47 publications

“…For example, adding auditory cues to VR can improve source localization via crossmodal plasticity, and thus heighten the sense of presence within VR, while avoiding the need for complex individualized calculations (to enable accurate auditory source localization) (Berger et al., 2018). Additionally, recent work has explored the role of echolocation in VR (via self-produced auditory 'clicks') to assist with spatial localization, maze completion times, and environment exploration (Andreasen et al., 2018, 2019). Intriguingly, navigating a VR environment 'like a bat' allowed some participants to create cognitive spatial maps based on echolocation, with concurrent improvement in performance (Andreasen et al., 2019).…”
Section: Interim Summary (mentioning)
confidence: 99%
“…Additionally, recent work has explored the role of echolocation in VR (via self-produced auditory 'clicks') to assist with spatial localization, maze completion times, and environment exploration (Andreasen et al., 2018, 2019). Intriguingly, navigating a VR environment 'like a bat' allowed some participants to create cognitive spatial maps based on echolocation, with concurrent improvement in performance (Andreasen et al., 2019). Thus non-typical auditory cues may be able to update self-generated movement in VR, although high training levels appear necessary when the information conveyed by auditory input is non-traditional.…”
Section: Interim Summary (mentioning)
confidence: 99%
“…Finally, it is worth noting that virtual room acoustics is important for a natural perception of reconstructed sound scenes, providing a recognizable acoustic fingerprint of a specific location or event [23]. This is particularly relevant for echolocation abilities, which rely on the direction of arrival (DOA) of echoes in the room, which can be effectively rendered with current VR technologies [24], such as those employed in this study (see Section 2.2 on binaural audio technologies). Moreover, thanks to the circular interaction between spatial presence and emotions, one can consider VR an affective medium [25] that is able to interact with the user's affective states [26] and memory processes [27].…”
Section: Related Work (mentioning)
confidence: 98%
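
To make the "acoustic fingerprint" point concrete, the toy sketch below builds a sparse first-order image-source impulse response for an assumed shoebox room; convolving a dry click with it produces an echo pattern that shifts with listener position. Room dimensions, source/listener positions, and the absorption coefficient are invented for illustration, and this is not the renderer used in the study.

# Toy first-order image-source model (illustrative assumptions throughout):
# the six wall reflections of a shoebox room form a sparse impulse response
# whose echo timing changes with the listener's position in the room.
import numpy as np

FS = 44100
C = 343.0

def first_order_ir(room=(6.0, 4.0, 3.0), src=(1.0, 2.0, 1.5),
                   lst=(4.0, 2.0, 1.5), absorption=0.3, length_s=0.1):
    """Impulse response: direct path plus six first-order wall reflections."""
    room, src, lst = (np.asarray(v, dtype=float) for v in (room, src, lst))
    ir = np.zeros(int(FS * length_s))

    # Image sources: mirror the source across each of the six walls.
    images = [src]
    for axis in range(3):
        for wall in (0.0, room[axis]):
            img = src.copy()
            img[axis] = 2.0 * wall - img[axis]
            images.append(img)

    for i, img in enumerate(images):
        dist = np.linalg.norm(img - lst)
        gain = (1.0 if i == 0 else 1.0 - absorption) / max(dist, 0.1)
        n = int(dist / C * FS)                # arrival time in samples
        if n < len(ir):
            ir[n] += gain
    return ir

# Convolving any dry click with the IR "places" it in the virtual room.
dry = np.zeros(64); dry[0] = 1.0
wet = np.convolve(dry, first_order_ir())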
“…However, new stereoscopic rendering techniques allow content to be presented in 3D, and therefore the perception of materials, depth, and many other properties can be achieved through visual cues [7]. Auditory feedback, which is often integrated into the HMD as a built-in feature (Table 1), is generally enabled by speakers or headphones, and spatial audio rendering techniques also support our perception of space in virtual environments [6], even enhancing perceived visual properties [115]. Haptic feedback is still in an exploratory phase, and can be achieved through a variety of sensors, Fig.…”
Section: Reproducing Sensory Modalities in VR (mentioning)
confidence: 99%