A gap exists between virtual reality (VR) software platforms designed for optimum hardware abstraction and cluster support, and those designed for efficient content authoring and exploration of interaction techniques through prototyping. This paper describes VR JuggLua, a high-level virtual reality application framework based on combining Lua, a dynamic, interpreted language designed for embedding and extension, with VR Juggler and OpenSceneGraph. This work allows fully-featured immersive applications to be written entirely in Lua, and also supports the embedding of the Lua engine in C++ applications. Like native C++ VR Juggler applications, VR JuggLua-based applications run successfully on systems ranging from a single desktop machine to a 49-node cluster. The osgLua introspection-based bindings facilitate scene-graph manipulation from Lua code, while bindings created using the Luabind template meta-programming library connect VR Juggler functionality. A thread-safe run buffer allows new Lua code to be passed to the interpreter during run time, supporting interactive creation of scene-graph structures. It has been successfully used in an immersive application implementing two different navigation techniques entirely in Lua and a physically-based virtual assembly simulation where C++ code handles physics computations and Lua code handles all display and configuration.
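The thread-safe run buffer described above can be sketched in a few lines: any thread (a console window, a network peer) enqueues source chunks at any time, and the application's frame loop drains and executes them at a known-safe point. The following is an illustrative Python stand-in for the Lua mechanism; the class and method names are assumptions, not VR JuggLua's actual API.

```python
import queue

class RunBuffer:
    """Thread-safe buffer of source-code chunks, drained once per frame.

    A Python stand-in for a run buffer: producer threads enqueue code
    strings at any time; the render loop executes whatever has
    accumulated at a safe point in each frame.
    """

    def __init__(self):
        self._pending = queue.Queue()  # Queue is safe across threads

    def add_string(self, code):
        # May be called from any thread (e.g. an interactive console).
        self._pending.put(code)

    def run_buffer(self, env):
        # Called from the application loop at a known-safe point;
        # executes every chunk queued since the last call.
        ran = 0
        while True:
            try:
                code = self._pending.get_nowait()
            except queue.Empty:
                break
            exec(code, env)  # run the chunk in the shared environment
            ran += 1
        return ran
```

Because execution happens only inside `run_buffer`, the shared environment (here a plain dict, standing in for the interpreter state) is never touched concurrently, which is the property that makes interactive scene-graph construction safe.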
Haptic force-feedback offers a valuable cue in exploration and manipulation of virtual environments. However, grounding of many commercial kinesthetic haptic devices limits the workspace accessible using a purely position-control scheme. The bubble technique has been recently presented as a method for expanding the user's haptic workspace. The bubble technique is a hybrid position-rate control system in which a volume, or "bubble," is defined entirely within the physical workspace of the haptic device. When the device's end effector is within this bubble, interaction is through position control. When exiting this volume, an elastic restoring force is rendered, and a rate is applied that moves the virtual accessible workspace. Existing work on the bubble technique focuses on point-based touching tasks. When the bubble technique is applied to simulations where the user is grasping virtual objects with part-part collision detection, unforeseen interaction problems surface. This paper discusses three details of the user experience of coupled-object manipulation with the bubble technique. A few preliminary methods of addressing these interaction challenges are introduced.
Keywords: Haptics
Ground-based haptic devices provide the capability of adding force feedback to virtual environments; however, the physical workspace of such devices is very limited due to the fixed base. By mounting a haptic device on a mobile robot, rather than a fixed stand, the reachable volume can be extended to function in full-scale virtual environments. This work presents the hardware, software, and integration developed to use such a mobile base with a Haption Virtuose™ 6D35-45. A mobile robot with a Mecanum-style omni-directional drive base and an Arduino-compatible microcontroller development board communicates with software on a host computer to provide a VRPN-based control and data acquisition interface. The position of the mobile robot in the physical space is tracked using an optical tracking system. The SPARTA virtual assembly software was extended to 1) apply transformations to the haptic device data based on the tracked base position, and 2) capture the error between the haptic device's end effector and the center of its workspace and command the robot over VRPN to minimize this error. The completed system allows use of the haptic device in a wide-area projection screen or head-mounted display virtual environment, providing smooth free-space motion and stiff display of forces to the user throughout the entire space. The availability of haptics in large immersive environments can contribute to future advances in virtual assembly planning, factory simulation, and other operations where haptics is an essential part of the simulation experience.
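The re-centering behavior in step 2) amounts to an error-driven velocity command: when the end effector drifts from the center of the device's workspace, the base is driven to bring the workspace back under the user's hand, while a small deadband keeps the base still during fine manipulation. The sketch below is a minimal 2-D proportional controller illustrating that idea; the function name, gains, and deadband value are assumptions, not part of the SPARTA or VRPN interfaces.

```python
def base_velocity_command(effector_pos, workspace_center,
                          deadband=0.05, gain=1.5):
    """Proportional velocity command for the mobile base (sketch).

    effector_pos / workspace_center: (x, y) positions in the tracked
    world frame, in metres. Inside the deadband the base holds still,
    so fine manipulation remains pure position control; outside it,
    the base drives to re-center the haptic workspace.
    """
    ex = effector_pos[0] - workspace_center[0]
    ey = effector_pos[1] - workspace_center[1]
    dist = (ex * ex + ey * ey) ** 0.5
    if dist < deadband:
        return (0.0, 0.0)  # hold position: error is small enough
    # Scale only the error beyond the deadband, preserving direction,
    # so the commanded velocity ramps up smoothly from zero.
    scale = gain * (dist - deadband) / dist
    return (scale * ex, scale * ey)
```

In the real system this command would be sent to the Mecanum base over VRPN each control cycle, with the optical tracker closing the loop on the base's actual position.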
Haptic force-feedback can provide useful cues to users of virtual environments. Body-based haptic devices are portable, but the more commonly used ground-based devices have workspaces that are limited by their physical grounding to a single base position and their operation as purely position-control devices. The "bubble technique" has recently been presented as one method of expanding a user's haptic workspace. The bubble technique is a hybrid position-rate control system in which a volume, or "bubble," is defined entirely within the physical workspace of the haptic device. When the device's end effector is within this bubble, interaction is through position control. When the end effector moves outside this volume, an elastic restoring force is rendered, and a rate is applied that moves the virtual accessible workspace. Publications have described the use of the bubble technique for point-based touching tasks. However, when this technique is applied to simulations where the user is grasping virtual objects with part-to-part collision detection, unforeseen interaction problems surface. Methods of addressing these challenges are introduced, along with discussion of their implementation and an informal investigation.
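The hybrid position-rate mapping described above can be sketched as a single per-update function: inside the bubble nothing happens; outside it, the penetration depth drives both the elastic restoring force and the rate at which the virtual workspace translates. This is a minimal 1-D illustration of the idea, not an implementation from the cited work; the stiffness and rate gains are placeholder values.

```python
def bubble_step(device_pos, bubble_radius, stiffness=200.0, rate_gain=2.0):
    """One update of the bubble technique in 1-D (illustrative sketch).

    device_pos: signed end-effector displacement from the bubble center,
    in the device's physical workspace (metres). Returns a tuple
    (force, workspace_velocity).
    """
    penetration = abs(device_pos) - bubble_radius
    if penetration <= 0.0:
        # Inside the bubble: pure position control, no force rendered,
        # virtual workspace stays put.
        return 0.0, 0.0
    direction = 1.0 if device_pos > 0.0 else -1.0
    # Elastic restoring force pushes the end effector back inside.
    force = -stiffness * penetration * direction
    # Rate control: the virtual workspace drifts in the push direction,
    # proportional to how far the user has left the bubble.
    workspace_velocity = rate_gain * penetration * direction
    return force, workspace_velocity
```

The interaction problems the paper examines arise precisely because, during grasping with part-to-part collision detection, this workspace drift moves a held object relative to the scene while the restoring force is superimposed on contact forces.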
With the proliferation of large screen stereo display systems, major consumer product manufacturers are using this technology to test marketing ideas on consumers. One of the performance factors that is of interest to retailers or manufacturers of retail products is the ability of consumers to quickly and easily locate their products within a retail store. Virtual reality technology can be used to create a virtual store that is easily reconfigurable as a test environment for consumer feedback. The research presented in this paper involves a study that compares the use of a multi-wall immersive environment to a single-wall immersive environment. Users were given a list of products to find in the virtual store. A physical mockup of a shopping cart was created and instrumented in order to be used to navigate throughout the virtual store. The findings indicate that participants in the five-wall immersive environment were significantly faster in locating the objects than the participants using the one-wall immersive environment. In addition, participants in the five-wall condition reported that the shopping cart was easier to use than in the one-wall condition. This study indicates that the use of multiple walls to provide an increased sense of immersion improves the ability of consumers to locate items within a virtual shopping experience.
Virtual reality (VR) environments based on interactive rendering of 3D computer graphics often incorporate the use of position and orientation tracking on the user's head, hands, and control devices. The Wii Remote game controller is a mass-market peripheral that can provide a low-cost source of infrared point tracking and accelerometer data, making it attractive as a PC-based virtual reality head tracking system. This paper describes the development of an extension to the Virtual Reality Peripheral Network (VRPN) software to support the use of the Wii Remote game controller as a standard tracker object in a wide range of VR software applications. This implementation permits Wii Remote-based head tracking to directly substitute for more costly commercial trackers through the VRPN and VR Juggler Gadgeteer tracker interfaces. The head tracker provides up to 100 Hz of head tracking input. It has been tested in a variety of VR applications on both Windows and Linux. The discussed solution has been released as open-source software.
Keywords: Virtual reality, tracking, human-computer interaction
Satterfield, and Srinivas Aluru. Their guidance and input were valuable as I completed the research work performed during my assistantship and the work discussed in this dissertation. I am grateful to my colleagues at the Iowa State University Virtual Reality Applications Center for their assistance with evaluation and feedback on my research. Thanks to Kevin Godby for his contributions to the development of the new, modern LaTeX ISU thesis class used to typeset this document. I would additionally like to thank William McNeely for technical assistance and expertise in ensuring a physically-based simulation platform. I appreciate the support for this work through research assistantships with Judy M.