Abstract. Current immersive Virtual Reality (VR) systems do not fully support dynamic Human-Computer Interaction (HCI), and since there is a growing need for better immersion, due consideration should be given to integrating additional modalities for improved HCI. While feedback in Virtual Environments (VEs) is predominantly provided to the user through the visual and auditory channels, additional modalities such as haptics can increase the sense of presence and efficiency in VE simulations. Haptic interfaces can enhance VE interaction by enabling users to "touch" and "feel" virtual objects that are simulated in the environment. This paper examines the reasons for integrating haptics, based on the limitations of present immersive projection systems.
Abstract. Our eyes are input sensors that provide our brains with streams of visual data. They have evolved to be extremely efficient, constantly darting to and fro to rapidly build up a picture of the salient entities in a viewed scene. These actions are almost subconscious; however, they can provide telling signs of how the brain is decoding the visuals, and can indicate emotional responses before the viewer becomes aware of them. In this paper we discuss a method of tracking a user's eye movements, and of using these to calculate their gaze within an immersive virtual environment. We investigate how these gaze patterns can be captured and used to identify viewed virtual objects, and discuss how this can serve as a natural method of interacting with the Virtual Environment. We describe a flexible tool developed to achieve this, and detail initial validating applications that prove the concept.
Abstract. Desktop computers provide a user interface with many features that allow the user to perform tasks such as executing applications, loading files, and editing data. The gMenu system proposed in this paper is a step closer to having these same facilities in virtual reality systems. The gMenu can currently be used to perform a selection of common user-interface tasks, for example executing or closing virtual reality applications or scenes. It is fully customisable and can be used to create many different styles of menu by both programmers and users. It has also shown promising results in bringing system-based commands into the virtual environment, while retaining the functionality and adaptations required by applications. The use cases presented demonstrate a collection of these abilities.
Abstract. Within the confines of a Virtual Environment (VE) almost anything is possible. It is easy to establish the benefits such an application could provide throughout many walks of life, and yet current VE development remains within the domain of Virtual Reality application programmers. We describe methods that enhance VE development, first by providing scene creation for non-programmers, and second through a scene management entity that controls interaction within the environment. We explore methods for interacting through the scene to enable multi-user collaboration, and detail sample applications making use of this approach.