The gaming experience is a critical component in the development and evolution of games. This study aims to assess the differences between virtual reality and a more traditional gaming experience in levels of emotion, immersion, and presence. Virtual reality is changing the way people play games, but the extent to which it creates a more immersive and present environment is still debated. Most players can feel a difference, but where does it lie? Similarly, if players are indeed more immersed and present, does that affect their perceived workload and how they feel about the task? These questions are examined in more detail in this paper. Results of this study may lend statistical support to various factors related to the virtual reality experience. These findings may also generalize to broader contexts, including military training simulations and consumer applications.
The purpose of this study was to examine the effects of color on eye movements while viewing a restaurant menu. Heat maps suggest that, upon first exposure, participants tended to view the middle and upper-left parts of the menu the most, regardless of color. Gaze plots showing the order of fixations indicated that color may have influenced initial eye movements within the first 10 seconds: participants tended to view the center of the menu first in the color condition and the top-left portion first in the non-color version. These results may be useful for designing restaurant menus and understanding the role color may play in attracting users' gaze.
Mixed reality is a new technology that requires users to control a head-mounted device via hand gestures. Users of these devices must learn and remember a new way of interacting. It has been shown that designing gestures to resemble the movements used to operate touch screens can facilitate this transfer of learning. This study investigates how well people learn the out-of-the-box gestures for a mixed reality headset, the Microsoft HoloLens, after interacting with it for a very short period of time. Performance with the gestures was measured with novices before and after approximately five minutes of practice game play. Participants showed significant improvement on the gestures for opening and positioning windows and reported them to be easier to perform after the short practice. This information could help in creating apps or tutorials that teach these gestures, as well as in identifying which gestures are most intuitive to users.
Immersive simulation technology has transformed the training and learning environment. Virtual reality (VR) and augmented reality (AR) devices have been adopted by medical professionals, military forces, and marketing firms. Aviation training facilities are also integrating VR and AR technology into a variety of training programs. To ensure students begin training on equal footing, an engaging, guided tutorial for the virtual environment (VE) was created. A usability study was conducted to evaluate the tutorial's learnability, effectiveness, and satisfaction for two user groups varying in VR experience. Results show users found the tutorial enjoyable, with high usability and playability. Novice users reported the tutorial as more mentally effortful than expert users did and were less comfortable with self-maneuvering. Users successfully completed most tasks on the first attempt after completing the tutorial. Those who noted difficulty in completing tasks in a post-assessment attributed it to user error and corrected themselves without instruction. The tutorial demonstrated learnability, effectiveness, and satisfaction, suggesting that users will be able to enter the VE with more confidence after engaging with it.
Computer-based training is an increasingly common form of training. The Project Team Builder (PTB) program uses a computer simulation to help train individuals to address a variety of problems faced by project managers. In this study, a computer-based training program was used to assess learning and performance. A Tobii T120 eye tracking system measured participants' fixations and saccades while they completed the training program. The task consisted of completing a project management training exercise first with the assistance of the researcher, then again without assistance. It was hypothesized that performance on the self-guided task would improve after completion of the assisted task. Results showed that participants who completed the task more quickly found it less mentally demanding. Number of fixations was positively correlated with mental workload and perceived performance. These findings open avenues for further research on computer-based training and the use of eye tracking systems in future program development.