Virtual reality users wearing head-mounted displays can experience the illusion of walking in any direction for an infinite distance while, in reality, walking a curvilinear path in physical space. This is accomplished by introducing unnoticeable rotations to the virtual environment, a technique called redirected walking. This paper gives an overview of the research performed since redirected walking was first practically demonstrated 15 years ago.
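The core idea can be sketched in a few lines. The code below is a minimal illustration, not the method from any particular paper: it injects a small per-frame rotation gain into the virtual heading so the user, walking a straight virtual line, is steered along a curve in physical space. The gain value and frame rate are illustrative assumptions; real systems choose gains below empirically measured detection thresholds.

```python
# Hypothetical sketch of redirected walking: rotate the virtual scene
# slightly each frame so the user physically curves while walking a
# virtually straight path. The gain and frame rate are assumptions.
ROTATION_GAIN_DEG_PER_SEC = 0.5  # assumed to be below the user's detection threshold

def redirect(virtual_heading_deg: float, dt: float) -> float:
    """Apply the rotation gain to the virtual heading over dt seconds."""
    return (virtual_heading_deg + ROTATION_GAIN_DEG_PER_SEC * dt) % 360.0

# Over a 60-second straight virtual walk at 90 fps, the accumulated
# redirection steers the user's physical path by a substantial angle:
heading = 0.0
for _ in range(60 * 90):
    heading = redirect(heading, 1.0 / 90.0)
print(round(heading, 1))  # 30.0 degrees of accumulated redirection
```

Even this unrealistically simple controller shows why the technique works: small, imperceptible per-frame rotations compound into large physical-path curvature over a minute of walking.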
The rapid development and availability of low-cost technologies have created a wide interest in virtual reality. In the field of computer music, the term “virtual musical instruments” has been used for a long time to describe software simulations, extensions of existing musical instruments, and ways to control them with new interfaces for musical expression. Virtual reality musical instruments (VRMIs) that include a simulated visual component delivered via a head-mounted display or other forms of immersive visualization have not yet received much attention. In this article, we present a field overview of VRMIs from the viewpoint of the performer. We propose nine design guidelines, describe evaluation methods, analyze case studies, and consider future challenges.
Recent technological developments have finally brought virtual reality (VR) out of the laboratory and into the hands of developers and consumers. However, a number of challenges remain. Virtual travel is one of the most common and universal tasks performed inside virtual environments, yet enabling users to navigate virtual environments is not a trivial challenge—especially if the user is walking. In this article, we initially provide an overview of the numerous virtual travel techniques that have been proposed prior to the commercialization of VR. Then we turn to the mode of travel that is the most difficult to facilitate, that is, walking. The challenge of providing users with natural walking experiences in VR can be divided into two separate, albeit related, challenges: (1) enabling unconstrained walking in virtual worlds that are larger than the tracked physical space and (2) providing users with appropriate multisensory stimuli in response to their interaction with the virtual environment. In regard to the first challenge, we present walking techniques falling into three general categories: repositioning systems, locomotion based on proxy gestures, and redirected walking. With respect to multimodal stimuli, we focus on how to provide three types of information: external sensory information (visual, auditory, and cutaneous), internal sensory information (vestibular and kinesthetic/proprioceptive), and efferent information. Finally, we discuss how the different categories of walking techniques compare and discuss the challenges still facing the research community.
The ability of haptic stimuli to augment visually and auditorily induced self-motion illusions has been partially investigated. However, haptically induced illusory self-motion in environments deprived of explicit motion cues remains unexplored. In this paper, we present an experiment performed with the intention of investigating how different virtual environments (contexts of motion) influence self-motion illusions induced through haptic stimulation of the feet. A concurrent goal was to determine whether horizontal self-motion illusions can be induced through stimulation of the supporting areas of the feet. The experiment was based on a within-subjects design and included four conditions, each representing one context of motion: an elevator, a train compartment, a bathroom, and a completely dark environment. The audiohaptic stimuli were identical across all conditions. The participants' sensation of movement was assessed by means of existing measures of illusory self-motion, namely, reported self-motion illusion per stimulus type, illusion compellingness, intensity, and onset time. Finally, the participants were also asked to estimate the experienced direction of movement. While the data obtained from these measures did not yield significant differences, the experiment did provide interesting indications: if motion is simulated through implicit motion cues, then the perceived context does influence the magnitude of displacement and the direction of movement of self-motion illusions, as well as whether the illusion is experienced in the first place. Finally, the experiment confirmed that haptically induced illusory self-motion in the horizontal plane is indeed possible.
Circular and linear self-motion illusions induced through visual and auditory stimuli have been studied rather extensively. While the ability of haptic stimuli to augment such illusions has been investigated, self-motion illusions that are primarily induced by stimulation of the haptic modality remain relatively unexplored. In this paper, we present an experiment performed with the intention of investigating whether it is possible to use haptic stimulation of the main supporting areas of the feet to induce vertical illusory self-motion in unrestrained participants during exposure to a virtual environment depicting an elevator. The experiment was based on a within-subjects design where all participants were subjected to identical visual and auditory stimuli. The participants experienced a total of four conditions. For three of the conditions, a different signal was used to generate the haptic feedback, while the final condition included no haptic feedback. Analysis of self-reports was used to assess the participants' experience of illusory self-motion. The results indicate that such illusions are indeed possible. Significant differences were found between the condition including no haptic feedback and the remaining three conditions.
A high-fidelity but efficient sound simulation is an essential element of any VR experience. Many of the techniques used in virtual acoustics are graphical rendering techniques suitably modified to account for sound generation and propagation. In recent years, several advances in hardware and software technologies have been facilitating the development of immersive interactive sound-rendering experiences. In this article, we present a review of the state of the art of such simulations, with a focus on the different elements that, combined, provide a complete interactive sonic experience. This includes physics-based simulation of sound effects and their propagation in space together with binaural rendering to simulate the position of sound sources. We present how these different elements of the sound design pipeline have been addressed in the literature, trying to find the trade-off between accuracy and plausibility. Recent applications and current challenges are also presented.
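One building block of the binaural rendering mentioned above is computing interaural cues for a given source direction. As a rough, hedged illustration (real pipelines use measured head-related transfer functions, not analytic formulas), the sketch below estimates the interaural time difference (ITD) with Woodworth's classical spherical-head approximation; the head radius is an assumed average value.

```python
import math

# Hypothetical sketch of a crude binaural cue: the interaural time
# difference (ITD) for a source at a given azimuth, using Woodworth's
# spherical-head approximation ITD = (a / c) * (theta + sin(theta)).
# Real systems use measured HRTFs; this formula is a simplification.
HEAD_RADIUS = 0.0875    # m, assumed average head radius
SPEED_OF_SOUND = 343.0  # m/s

def interaural_time_difference(azimuth_deg: float) -> float:
    """Return the ITD in seconds for a source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly to one side (90 degrees) yields the maximum ITD:
print(round(interaural_time_difference(90.0) * 1000, 2))  # ~0.66 ms
print(round(interaural_time_difference(0.0) * 1000, 2))   # 0.0 ms, straight ahead
```

Delaying one ear's signal by this amount (and attenuating it for the interaural level difference) is the simplest way to place a source laterally; full binaural rendering convolves the source with direction-dependent HRTF filters instead.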
Walking-In-Place (WIP) techniques make it possible to facilitate relatively natural locomotion within immersive virtual environments that are larger than the physical interaction space. However, in order to facilitate natural walking experiences, one needs to know how to map steps in place to virtual motion. This paper describes two within-subjects studies performed with the intention of establishing the range of perceptually natural walking speeds for WIP locomotion. In both studies, subjects performed a series of virtual walks while exposed to visual gains (optic flow multipliers) ranging from 1.0 to 3.0. Thus, the slowest speed was equal to an estimate of the subjects' normal walking speed, while the highest speed was three times greater. The perceived naturalness of the visual speed was assessed using self-reports. The first study compared four different types of movement, namely, no leg movement, walking on a treadmill, and two forms of gestural input for WIP locomotion. The results suggest that WIP locomotion is accompanied by a perceptual distortion of the speed of optic flow. The second study was performed using a 4×2 factorial design and compared four different display fields of view (FOVs) and two types of movement: walking on a treadmill and WIP locomotion. The results revealed significant main effects of both movement type and field of view, but no significant interaction between the two variables. In particular, they suggest that the size of the display FOV is inversely proportional to the degree of underestimation of the virtual speeds for both treadmill-mediated virtual walking and WIP locomotion. Combined, the results constitute a first attempt at establishing a set of guidelines specifying what virtual walking speeds WIP gestures should produce in order to facilitate a natural walking experience.
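The mapping being studied above can be made concrete with a small sketch. The 1.0-3.0 visual gain range follows the abstract; the base walking speed and the step-frequency model are illustrative assumptions, not values from the studies.

```python
# Hypothetical sketch of mapping walking-in-place steps to virtual speed
# via a visual gain (optic flow multiplier). The 1.0-3.0 gain range is
# from the studies; the base speed and step model are assumptions.
BASE_WALKING_SPEED = 1.4  # m/s, an assumed estimate of normal walking speed

def virtual_speed(step_frequency_hz: float, visual_gain: float) -> float:
    """Scale a simple step-driven speed estimate by the visual gain."""
    assert 1.0 <= visual_gain <= 3.0, "gain range used in the studies"
    # Assume a step frequency of ~2 Hz corresponds to normal walking speed.
    raw_speed = BASE_WALKING_SPEED * (step_frequency_hz / 2.0)
    return raw_speed * visual_gain

print(virtual_speed(2.0, 1.0))  # 1.4 m/s: normal speed, no gain
print(virtual_speed(2.0, 2.0))  # 2.8 m/s: optic flow twice as fast
```

The studies' finding that WIP users underestimate optic flow speed suggests the perceptually "natural" gain is greater than 1.0, which is exactly what a mapping like this would expose as a tunable parameter.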