With continued technological innovation in object sensing and human motion tracking, traditional two-dimensional video-based tele-conference systems are projected to evolve into three-dimensional immersive and augmented reality (AR) systems, in which one can communicate with a teleported remote participant as if present, moving and interacting naturally in the same location. One technical hurdle to this vision is the need to resolve the environmental differences between the remote and local sites and the resulting motion anomalies of the teleported avatar. This paper presents a novel method that first establishes a spatial and object-level match between the remote and local sites and then adapts the position and motion of the teleported avatar to the local AR space according to the matched information. The result is a natural-looking and spatially correct rendering of the remote user in the local augmented space, with significantly improved tele-conference experience and communication performance.
In this paper, we present a prototype Virtual Welding Simulator that supports interactive training of the welding process through a multimodal interface capable of delivering realistic experiences. The goal of this research is to use virtual reality technology to overcome difficult problems in training tasks where welding is a principal manufacturing process. Technical issues in the system's design and implementation are described, including real-time simulation and visualization of the welding bead, delivery of realistic experiences through 3D multimodal interaction, presentation of visual training guides for novice workers, and a visual, interactive interface for training assessment. According to the results of an initial user study, the prototype VR-based simulator appears helpful for welding training, especially in providing visual training guides and instant training assessment.
Training is one of the representative application fields of virtual reality technology, allowing users to gain virtual experience of a training task and its working environment. Widely used in the medical and military fields, virtual-reality-based training systems are also useful in industrial fields, such as the aerospace industry, since they show superiority over real training environments in terms of accessibility, safety, and cost. The shipbuilding industry is a labor-intensive industry that demands many skilled workers. In particular, painting jobs in shipbuilding require a continuous supply of new workers, since many leave due to the poor working environment. In this paper, the authors present a virtual-reality-based training system for spray-painting tasks in the shipbuilding industry. The design issues and implementation details of the training system are described, and its advantages and shortcomings are discussed based on use cases in actual work settings.
With continued technological innovation in mixed reality (MR), wearable MR devices, such as head-mounted displays (HMDs), have been released and are frequently used in fields such as entertainment, training, education, and shopping. However, because each product differs in components and specifications owing to its design and manufacturing process, users perceive the virtual objects overlaid on real environments differently, depending on the scale and color rendered by the MR device. In this paper, we compare the effect of scale and color parameters on users' perception across different types of MR devices, with the aim of improving MR experiences in real life. We conducted two experiments (scale and color); the results showed that participants in the scale-perception experiment clearly tended to underestimate the size of virtual objects compared with real objects, while color was overestimated in MR environments.
In this paper, we present a design evaluation system for mobile devices with visualization and interaction based on virtual-reality prototypes, which can be used to easily change design parameters and simulate embedded software. To evaluate and predict affective-engineering-based design preferences for mobile devices in a virtual environment, we have developed a high-quality visualization platform that creates images closely resembling real mobile devices, in addition to real-time simulation of their realistic motions and functions. To support a quantitative usability-test scenario for the external design shape, we have also built a mixed-reality-based testing platform for measuring hand load.