Many virtual reality (VR) games are based on a first-person perspective (1PP). There are, however, advantages to using another perspective, such as the third-person perspective (3PP). Although there has been some research evaluating the effect of 1PP and 3PP on gameplay experiences, the question remains largely unexplored for VR games played via the new generation of commercial head-mounted display systems, such as the Oculus Rift. In this research, we aim to shed some light on the relationship between these perspectives, when games are played using head-mounted display VR, and simulator sickness, enjoyment, and presence. To do so, we conducted an experiment using two perspectives (1PP and 3PP) and two displays (VR and a conventional display) with a popular game. Our findings indicate that 3PP-VR is less likely to induce simulator sickness than 1PP-VR. However, 3PP-VR is not perceived as immersive; this may not be a problem, as our data also show that presence is not a prerequisite for enjoyment. The data further suggest that there is no clear preference between 1PP-VR and 3PP-VR for gameplay.
Our access to computer-generated worlds changes the way we feel, how we think, and how we solve problems. In this review, we explore the utility of different types of virtual reality, immersive or non-immersive, for providing controllable, safe environments that enable individual training, neurorehabilitation, or even replacement of lost functions. The neurobiological effects of virtual reality on neuronal plasticity have been shown to include increased cortical gray matter volumes, higher concentrations of electroencephalographic beta-waves, and enhanced cognitive performance. Clinical application of virtual reality is aided by innovative brain–computer interfaces, which allow direct tapping into the electrical activity generated by different brain cortical areas for precise voluntary control of connected robotic devices. Virtual reality is also valuable to healthy individuals as a narrative medium for redesigning their individual stories in an integrative process of self-improvement and personal development. Future upgrades of virtual reality-based technologies promise to help humans transcend the limitations of their biological bodies and augment their capacity to mold physical reality to better meet the needs of a globalized world.
Virtual reality (VR) technologies have advanced rapidly in the last few years. Prime examples include the Oculus Rift and HTC Vive, both head-mounted displays (HMDs). VR HMDs enable a sense of immersion and allow enhanced natural interaction experiences with 3D objects. In this research, we explore suitable interactions for manipulating 3D objects while users are wearing a VR HMD. In particular, this research focuses on a user-elicitation study to identify natural interactions for 3D manipulation using dual-hand controllers, which have become the standard input devices for VR HMDs. A user-elicitation study asks potential users to propose interactions that are natural and intuitive for given scenarios. The results of our study suggest that users prefer interactions based on shoulder motions (e.g., shoulder abduction and shoulder horizontal abduction) and elbow flexion movements. In addition, users seem to prefer one-hand interaction, and when two hands are required, they prefer interactions that do not require simultaneous hand movements but instead allow them to alternate between their hands. The results of our study are applicable to the design of dual-hand interactions with 3D objects in a variety of virtual reality environments.
Textiles are a vital and indispensable part of the clothing we use daily. They are flexible, often lightweight, and suit a variety of applications. Today, with the rapid development of small and flexible sensing materials, textiles can be enhanced and used as input devices for interactive systems. Clothing-based wearable interfaces are well suited to in-vehicle controls: they can combine various modalities to enable users to perform simple, natural, and efficient interactions while minimizing any negative effect on their driving. However, clothing-based wearable in-vehicle interfaces remain underexplored, and there is a corresponding lack of understanding of how to use textile-based input for in-vehicle controls. As a first step towards filling this gap, we conducted a user-elicitation study to involve users in the process of designing in-vehicle interactions via a fabric-based wearable device. From this study, we distilled a taxonomy of wrist and touch gestures for in-vehicle interactions using a fabric-based wrist interface in a simulated driving setup. Our results help drive forward the investigation of the design space of clothing-based wearable interfaces for in-vehicle secondary interactions.
Advanced developments in handheld devices' interactive 3D graphics capabilities, processing power, and cloud computing have created great potential for handheld augmented reality (HAR) applications, which allow users to access digital information anytime, anywhere. Nevertheless, existing interaction methods are still confined to the touch display, device camera, and built-in sensors of these handheld devices, and they suffer from obtrusive interactions with AR content. Wearable fabric-based interfaces promote the subtle, natural, and eyes-free interactions that are needed in dynamic environments. Prior studies explored the possibilities of using fabric-based wearable interfaces for head-mounted AR display (HMD) devices. However, the interface metaphors of HMD AR devices are inadequate for handheld AR devices, as a typical HAR application requires users to perform interactions with only one hand. In this paper, we investigate the use of a fabric-based wearable device as an alternative interface option for performing interactions with HAR applications. We elicited user-preferred gestures that are socially acceptable and comfortable to use with HAR devices. We also derived an interaction vocabulary of wrist and thumb-to-index touch gestures, and we present broader design guidelines for fabric-based wearable interfaces for handheld augmented reality applications. (Appl. Sci. 2019, 9, 3177)

Handheld devices are portable enough to be carried wherever users go. With this ubiquitous availability, HAR allows us to develop and design innovative applications in navigation, education, gaming, tourism, interactive shopping, production, marketing, and others [3]. Thus, smartphones have been identified as an ideal platform for HAR experiences in various outdoor and indoor environments [4–6]. To interact with the virtual world using HAR displays, a user needs to position and orient the device with one hand and manipulate the virtual 3D objects with the other hand. In general, the touchscreen is used as the primary interface for interacting with AR content [7,8]. In addition, the various built-in sensors in handheld devices, such as cameras, GPS, compass, accelerometers, and gyroscopes, enable the position and orientation of the device in the real world to be determined precisely (e.g., [8–10]). Furthermore, the device's camera can naturally capture the user's mid-air hand movements while holding the device [11,12]. As in HMD AR, manipulations such as selecting and moving virtual 3D information are primary interactions on HAR devices [13]. Existing HAR interaction methods, such as touch input, offer promising solutions for manipulating virtual content (e.g., [14]). However, they still have substantial limitations. For instance, touch input is limited by the device's physical boundary, and usability suffers as on-screen content becomes occluded by the fingers (i.e., finger occlusion [15,16]). Also, 2D inputs on the touch surface do not directly support manipulating the six degrees of freedom of a virtual object.
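The six-degrees-of-freedom limitation described above can be made concrete with a small sketch. The following code is illustrative only and is not taken from any of the cited papers: it shows one common way handheld AR apps multiplex 2D touch gestures onto a 3D object transform, with hypothetical function and field names. Each gesture reaches only a subset of the six DOF (three translations, three rotations), so full manipulation requires mode switching or extra widgets.

```python
from dataclasses import dataclass, field

@dataclass
class Transform:
    """Minimal object pose: position in meters, one rotation axis, uniform scale."""
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])  # x, y, z
    yaw_deg: float = 0.0
    scale: float = 1.0

def apply_one_finger_drag(t: Transform, dx_px: float, dy_px: float,
                          px_to_m: float = 0.001) -> Transform:
    """One-finger drag translates in the screen plane only (2 of 3 translation DOF)."""
    t.position[0] += dx_px * px_to_m
    t.position[1] -= dy_px * px_to_m  # screen y grows downward, world y upward
    return t

def apply_two_finger_pinch(t: Transform, scale_factor: float) -> Transform:
    """Pinch is typically overloaded as uniform scale; depth (z) needs another mode."""
    t.scale *= scale_factor
    return t

def apply_two_finger_twist(t: Transform, dtheta_deg: float) -> Transform:
    """Twist rotates about a single axis; full 3-DOF rotation is not reachable."""
    t.yaw_deg = (t.yaw_deg + dtheta_deg) % 360.0
    return t
```

Note that no combination of these gestures reaches z-translation or pitch/roll rotation without introducing additional modes, which is precisely the DOF gap that motivates off-screen input channels such as wearable interfaces.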
How to enhance creativity, especially by applying new technologies to creativity methods, is a question researchers pose continuously, in part because creativity is an important part of people's daily lives and an essential component of society. Many methods to enhance creativity have therefore been created; brainstorming in particular is one of the most popular and effective tools for inspiring individuals to generate ideas. Moreover, technologies such as virtual reality (VR) provide an opportunity for individuals and groups to be creative, and recent studies have adopted VR in brainstorming to enhance creativity. However, there is a lack of systematic analysis of the experimental approaches and creativity measures employed in this context. To address this gap, this study categorized existing articles on the topic according to avatars, environments, interfaces, and applications. The findings elaborate on trends, the measures employed to evaluate creativity and idea generation, the identified categories, and the results of these studies.