The design of current virtual reality (VR) applications relies essentially on transposing the user's viewpoint into a first-person perspective (1PP). Within this context, our research compares the impact and the potential benefits of integrating a third-person perspective (3PP) into immersive virtual environments (IVE). We conducted an empirical study to assess the sense of presence, the sense of embodiment, and the performance of users confronted with a series of tasks representing a potential use case for the video game industry. Our results reveal no significant difference in the sense of spatial presence between the two points of view. Nonetheless, they provide evidence confirming the relevance of the first-person perspective for inducing a sense of embodiment toward a virtual body, especially in terms of self-location and ownership. However, no significant differences were observed concerning the sense of agency. Concerning user performance, our results show that the first-person perspective enables more accurate interactions, while the third-person perspective provides better spatial awareness.
This study presents the second phase of a series of experiments investigating the impact of avatar visual fidelity on the sense of embodiment and users' behavior in immersive virtual environments. Our main focus is the similarity between users and avatars, a factor known as truthfulness. Our experiment required participants to control three avatars from a third-person perspective: a robot, a suit, and their virtual doppelganger (a virtual representation of the self). To analyze users' reactions and strategies, each task in the scenario of the virtual reality application could potentially compromise the integrity of their character. Our results revealed that ownership, one of the three components of the sense of embodiment, is higher when participants control their self-representation than an abstract representation. Furthermore, avatar visual fidelity seems to affect users' subjective experience: half of the participants reported behaving differently depending on the controlled character. Abstract representations allow users to adopt riskier behaviors, while self-representations maintain a connection with the real world and encourage users to preserve the integrity of their avatar.
It has been demonstrated that virtual reality (VR) exposure can affect the subjective experience of different situations, cognitive capabilities, and behavior. A person's physiological state is known to be linked to their psychological self-report and user experience. Since an immersive experience can affect users' physiological data, it is possible to adapt and enhance the content of a virtual environment in real time based on physiological data feedback (biofeedback). With the rapid evolution of physiological monitoring technologies, it is now possible to exploit different modalities of biofeedback, in a cheap and non-cumbersome manner, and to study how they affect user experience. While most studies involving physiological data use it as a measuring tool, we study its impact when direct and voluntary physiological control becomes a means of interaction. To do so, we created a two-part protocol. The first part was designed to categorize participants by their heart-rate control competency. In the second part, we immersed participants in a VR experience in which they had to control their heart rate to interact with elements of the game. The results were analyzed according to the competency distribution. We observed consistent results between our competency scale and the participants' control of the biofeedback game mechanic. We also found that our direct biofeedback mechanic is highly engaging: it generated a strong feeling of agency, which is linked to users' level of heart-rate control. We highlight the richness of biofeedback as a direct game mechanic, opening interesting perspectives for personalized immersive experiences.
Virtual Reality (VR) is now an affordable technology that is starting to penetrate the mass market, so providing accessible solutions to enhance VR experiences is crucial. In this paper, we consider a wearable device as a means of interaction in VR, adding a biofeedback mechanic. We hypothesized that a biofeedback loop in a VR experience can enhance user engagement. We created a physiologically enhanced horror game coupled with a heart-rate-monitoring smart wristband and evaluated players' engagement with and without biofeedback. Participants showed high interest in biofeedback, and engagement was higher when the biofeedback mechanic was fully integrated into the experience. CCS CONCEPTS: • Human-centered computing → Virtual reality; User centered design.
Virtual Reality (VR) is now an affordable technology that is starting to penetrate the mass market. While cardboard headsets are the most widely distributed systems, they lack the interaction needed for truly engaging experiences, so providing low-cost solutions to enhance VR experiences is crucial. We hypothesized that integrating a smart wristband into a VR experience provides a setup reliable and comfortable enough to add a biofeedback loop to a game. We created a physiologically enhanced game and coupled it with a smart wristband capable of monitoring the wearer's heart rate. We tested our game with and without biofeedback and compared the reported novelty. Participants showed high interest in the integration of smart wearables in VR. We highlight the stability of our setup, even in mobility, and the absence of reported discomfort from the addition of the wristband. CCS CONCEPTS: • Human-centered computing → Virtual reality; User centered design.
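The biofeedback loop described in these abstracts, sampling a player's heart rate from a wristband and mapping it onto a game parameter, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the simulated readings, the resting and maximum heart-rate values, and the linear mapping are all hypothetical choices.

```python
# Minimal sketch of a heart-rate biofeedback game loop.
# Illustrative only: the thresholds, the linear mapping, and the
# simulated readings below are assumptions, not the studies' design.

def intensity_from_heart_rate(bpm, resting=65, max_bpm=120):
    """Map a heart-rate reading (bpm) onto a 0..1 game-intensity value.

    Readings at or below `resting` map to 0.0; readings at or above
    `max_bpm` are clamped to 1.0.
    """
    span = max_bpm - resting
    return min(1.0, max(0.0, (bpm - resting) / span))

def run_biofeedback_loop(samples):
    """Feed successive heart-rate samples into the game parameter."""
    return [round(intensity_from_heart_rate(bpm), 2) for bpm in samples]

if __name__ == "__main__":
    # Simulated wristband readings; a real setup would poll the device
    # (e.g. over Bluetooth) at a fixed interval instead.
    readings = [60, 70, 90, 130]
    print(run_biofeedback_loop(readings))
```

A real game would invert or extend this mapping depending on the mechanic: in the horror-game setting, for instance, a rising heart rate might intensify the experience, while the heart-rate-control study rewards lowering the value instead.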
This study presents two experiments addressing the representation of scientific data, in particular airflows, using a user-centered design approach. Our objective is to provide user feedback to data visualization designers to help them choose an airflow representation that is understandable and attractive for non-experts. The first study focuses on static markers for visualizing an airflow, with information characterizing its direction and intensity. In the second study, carried out in an immersive virtual environment, two additional pieces of information were added: the temporal evolution of the flow and the concentration of pollutants in the air. To measure comprehension and attractiveness, participants were asked to rate items on Likert scales (experiment 1) and to complete the User Experience Questionnaire (experiment 2). The results revealed that arrows are a very common and understandable form for representing the orientation and direction of a flow, but that they should be made brighter and more transparent to be more attractive, as the representation can occlude the scene, especially in virtual reality. To solve this problem, we suggest giving users the ability to define the specific area where they want to see the airflow, using a cross-sectional view. Vector fields and streamlines could then be applied in a virtual reality context.
Research has shown that immersive technologies can significantly improve the design process. However, it is important to consider the ease of implementation of solutions (e.g., price, simplicity). The objective of this study was therefore to analyze the uses of two types of virtual environments that are relatively simple to implement: a basic model of a room and its 3D scan. Participants made sketches using a virtual reality application, provided by the instructors, in each of the two VR environments. The sketches are proposals for a furniture co-creation task. Results indicate a better co-creation process during the second session than during the first, revealing that training is an important factor in this setting. Furthermore, co-creation is perceived as better in the modeled room than in the 3D scan. This result could be due to the presence of irrelevant virtual objects that may distract participants. These results are discussed from an applied standpoint.
Eliciting a sense of social presence is necessary to create believable multi-user situations in immersive virtual environments. To collaborate in virtual worlds, users are represented by avatars (virtual characters controlled in real time) that allow them to interact with each other. We report a study investigating the impact on social presence of both the facial properties of non-human avatars and the type of collaborative task performed by the users (asymmetric collaboration versus negotiation). While we observed no significant impact of facial properties, both co-presence and perceived message understanding scores were significantly higher during the negotiation task.