Product visualization in AR/VR applications requires a largely manual process of data preparation. Previous publications focus on error-free triangulation or on the transformation of product structure data and display attributes for AR/VR applications. This paper focuses on the preparation of the required geometry data. In this context, a significant reduction in effort can be achieved through automation. The steps of geometry preparation are identified and examined with respect to their automation potential. In addition, possible couplings of sub-steps are discussed. Based on these explanations, a structure for the geometry preparation process is proposed. With this structured preparation process, it becomes possible to consider the available computing power of the target platform during geometry preparation. The number of objects to be rendered, the tessellation quality, and the level of detail can be controlled by the automated choice of transformation parameters. Through this approach, tedious preparation tasks and iterative performance optimization can be avoided in the future, which also simplifies the integration of AR/VR applications into product development and use. A software tool is presented in which partial steps of the automatic preparation are already implemented. After an analysis of the product structure of a CAD file, the transformation is executed for each component. Functions implemented so far allow, for example, the selection of assemblies and parts based on filter options, the transformation of geometries in batch mode, the removal of certain details, and the creation of UV maps. Flexibility, transformation quality, and time savings are described and discussed.
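The idea of deriving transformation parameters from the target platform's budget can be sketched as follows. This is a minimal illustration, not the tool described in the abstract; the triangle budget, the reference triangle count, and the linear quality scaling are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class TargetPlatform:
    # Hypothetical performance budget of the AR/VR target device.
    max_triangles: int   # total triangle budget for the scene
    max_draw_calls: int  # upper bound on separately rendered objects

def choose_tessellation_params(num_parts: int, platform: TargetPlatform,
                               min_quality: float = 0.1,
                               max_quality: float = 1.0) -> dict:
    """Derive per-part tessellation settings from the platform budget.

    quality in (0, 1]: higher quality means finer tessellation and more
    triangles. We assume, purely for illustration, that triangle count
    scales roughly linearly with quality.
    """
    per_part_budget = platform.max_triangles // max(num_parts, 1)
    # Assumption: a reference part yields ~5000 triangles at quality 1.0.
    reference_triangles = 5000
    quality = min(max(per_part_budget / reference_triangles, min_quality),
                  max_quality)
    # If there are more parts than the platform can draw separately,
    # small parts would be merged (batched) during preparation.
    merge_parts = num_parts > platform.max_draw_calls
    return {"quality": quality, "merge_parts": merge_parts}

# Example: a standalone headset with a 1M triangle budget and 500 CAD parts.
hmd = TargetPlatform(max_triangles=1_000_000, max_draw_calls=300)
params = choose_tessellation_params(num_parts=500, platform=hmd)
```

With these assumed numbers, each part receives a 2000-triangle budget, so the quality is reduced to 0.4 and part merging is enabled; the point is that such decisions can be made automatically before any tessellation runs.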
AR/VR applications are a valuable tool in product development and the overall product lifecycle in engineering. However, transforming models from CAD systems into AR/VR applications is labor-intensive and requires expertise. The main task in the data transformation is the tessellation of the product geometry. Depending on the product complexity and the performance of the target platform, extensive optimization is needed to ensure the usability and visual quality of the AR/VR application. Current approaches to this problem use iterative and inflexible processes, mostly based on tessellation and mesh decimation, that ignore the varying importance of different geometric aspects for an AR/VR application. An alternative, more targeted approach is proposed that aims at predicting tessellation results and moving the optimization ahead of the actual tessellation. As a result, the need for iterative operations on the polygon meshes can be reduced or, ideally, avoided altogether. The paper presents results of an investigation of the hypothesis that geometric complexity metrics can be used to control and enhance the choice of tessellation parameters. Several characteristics and metrics are identified in the literature and subsequently evaluated with regard to polygon count and visual quality in the geometry preparation process. Based on this evaluation, prediction models are created and implemented in a geometry preparation tool. Their performance is evaluated and discussed.
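A prediction model of the kind hypothesized here can be sketched with ordinary least squares: pre-tessellation complexity metrics are mapped to an expected polygon count, so the tolerance can be chosen before tessellating. The metric columns (surface count, total edge length, a curvature measure) and all numbers are invented for illustration; the paper's actual metrics and models may differ.

```python
import numpy as np

# Hypothetical training data: per-part complexity metrics measured on the
# CAD model (surface count, total edge length, curvature measure) and the
# polygon counts observed after tessellating with a fixed tolerance.
metrics = np.array([
    [ 10,  1.2, 0.5],
    [ 40,  3.5, 1.1],
    [ 80,  6.0, 2.4],
    [120, 10.1, 3.0],
], dtype=float)
polygons = np.array([2_000, 7_500, 15_000, 24_000], dtype=float)

# Fit a linear prediction model (ordinary least squares with intercept).
X = np.hstack([metrics, np.ones((len(metrics), 1))])
coeffs, *_ = np.linalg.lstsq(X, polygons, rcond=None)

def predict_polygon_count(m) -> float:
    """Predict the post-tessellation polygon count from CAD-side metrics."""
    return float(np.append(np.asarray(m, dtype=float), 1.0) @ coeffs)

# The prediction can then steer the tolerance choice before tessellation:
# if the estimate exceeds the budget, a coarser tolerance is selected.
estimate = predict_polygon_count([60, 5.0, 1.8])
```

In practice one would train on many parts and validate against visual quality as well as polygon count, but the structure (metrics in, predicted cost out, parameters chosen up front) is the point of the sketch.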
Virtual testing is a significant part of the product development process. Many problems can be solved entirely through the interaction of geometric models, simulation tools, and human models with the help of appropriate software. If, in the course of testing, subjective human perception must be taken into account, a full-size immersive projection system (VR system) can be used profitably. Such a system is particularly useful for evaluating manufacturing, operation, application, or maintenance. The human being interacts in a virtual environment with a product whose physical shape does not yet exist. In this case, the movements of product components are generally controlled indirectly using a flystick, a wand, or a similar input device. In reality, many maintenance operations are determined by the position and posture of the maintenance personnel as well as by the mass, center of gravity, and dimensions of the object to be manipulated. In a mixed reality environment, a meaningful subjective ergonomic evaluation of the above-mentioned operations becomes possible. The paper elucidates a strategy for integrating real product components into a virtual environment. The user applies the real components or tools in the immersive full-size projection of the VR system. The VR system tracks the real object, so that an invisible object model in the VR system can be moved in sync with the movements of the real object. The collision detection tool provided by the VR system then signals contact between the real object and the virtual environment. The demonstrated solution is under consideration for the planning and ergonomic evaluation of service activities; industry's need for a process that can be controlled safely is of particular concern. The solution given here is aimed at maintenance to be performed on the brake system of a light-duty truck.
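The core mechanism (an invisible proxy model driven by the tracker, with collision detection signaling contact) can be sketched as below. All class and method names are illustrative; a real system would use the tracking and collision APIs of its VR framework, and the sphere-sphere test stands in for the VR system's actual collision detection.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Position of the tracked real object in scene coordinates (meters).
    x: float
    y: float
    z: float

class InvisibleProxy:
    """Invisible model that mirrors the tracked real object in the VR scene."""

    def __init__(self, radius: float):
        self.radius = radius            # simplified bounding sphere
        self.pose = Pose(0.0, 0.0, 0.0)

    def update_from_tracker(self, pose: Pose) -> None:
        # Move the invisible model in sync with the real object.
        self.pose = pose

    def collides_with_sphere(self, center: Pose, radius: float) -> bool:
        # Stand-in for the VR system's collision detection: contact is
        # signaled when the bounding spheres intersect.
        dx = self.pose.x - center.x
        dy = self.pose.y - center.y
        dz = self.pose.z - center.z
        distance = (dx * dx + dy * dy + dz * dz) ** 0.5
        return distance <= self.radius + radius

# A tracked wrench (proxy) approaches a virtual brake component.
proxy = InvisibleProxy(radius=0.05)
proxy.update_from_tracker(Pose(0.5, 1.0, 0.2))
virtual_part = Pose(0.52, 1.0, 0.2)
hit = proxy.collides_with_sphere(virtual_part, radius=0.04)
```

When `hit` becomes true, the system would notify the user (visually or acoustically) that the real tool has touched the virtual geometry.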
A wide variety of peripherals for the interaction between human and computer is available (e.g. mouse, touch, and camera). However, these peripherals communicate with the computer and its applications in different ways. The library VRPN is a common, generalized interface between peripherals and VR applications that reduces the development effort. Its main advantages are its system-independent client-server architecture with real-time capability and the easy integration of new peripheral devices. The proposed paper describes the adaptation and extension of the VRPN concept to address engineering challenges such as modeling, evaluation, simulation, and modification. Innovative interaction devices can enhance engineering applications with comparatively small effort but great benefit. As an example, a VRPN client is implemented in the CAD application SolidWorks. This enables the use of any interaction device supported by VRPN. For example, the designer can control the model view by human movement via tracking devices such as the Microsoft Kinect or the Geomagic Touch. The data transfer can be established either synchronously or asynchronously. For synchronous transfer, the server-client architecture was implemented in different applications (e.g. CAD, VR). To realize a time-shifted asynchronous transfer, a recorder-player middleware was developed.
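The recorder-player middleware for time-shifted transfer can be sketched as a tape of timestamped device samples that is replayed with its original relative timing. This is a generic illustration of the pattern, not VRPN's API; the class, the sample dictionaries, and the device name `Tracker0` are assumptions.

```python
from typing import Iterator, List, Tuple

class RecorderPlayer:
    """Middleware sketch for time-shifted (asynchronous) transfer of
    device samples between a server and a client."""

    def __init__(self) -> None:
        self._tape: List[Tuple[float, dict]] = []

    def record(self, timestamp: float, sample: dict) -> None:
        # Store each incoming sample together with its arrival time.
        self._tape.append((timestamp, sample))

    def play(self) -> Iterator[Tuple[float, dict]]:
        """Yield samples with their timing relative to the first sample."""
        if not self._tape:
            return
        t0 = self._tape[0][0]
        for t, sample in self._tape:
            # A real player would sleep until (t - t0) before delivering.
            yield t - t0, sample

# Record two tracker samples, then replay them later for a client.
rec = RecorderPlayer()
rec.record(0.00, {"device": "Tracker0", "pos": (0.0, 0.0, 0.0)})
rec.record(0.05, {"device": "Tracker0", "pos": (0.1, 0.0, 0.0)})
replayed = list(rec.play())
```

Because the recorder sits between server and client, the same client code can consume live data synchronously or recorded data at a later time.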
The paper elucidates a process integration method in which the virtual reality (VR) application moves beyond being just a visualisation tool to functioning as an interface for collaboration. The central module of this system is the so-called session manager, which co-ordinates the collaboration of several authorised individuals and various software tools working together on one development project. The session manager sits at the centre of the product data management (PDM) system, managing product structure, product documents, and project state. The associated VR system synchronously presents the current state of the project, making it possible to visualise changes in geometry, appearance, or structure in real time. In this way, VR can be integrated into a heterogeneous system consisting of CAD workstations and simulation applications. Integration is based on a bi-directional link among all applications via the PDM system. Thus, it is possible to use and represent not only the product's design and structure but also its non-geometric information using PDM. Additionally, users may select product components in VR and assign tasks to workstations or persons, since relevant PDM information is available in VR.
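The session manager's coordinating role can be sketched as an observer pattern: workstations and the VR system register with the manager, which applies each change to the shared project state and broadcasts it to all participants. All names here are illustrative assumptions, not the paper's implementation.

```python
class SessionManager:
    """Sketch of the coordinating module: clients register with the session
    manager, which relays PDM state changes to all participants."""

    def __init__(self) -> None:
        self._clients = []        # registered workstations / VR systems
        self._project_state = {}  # current project state held in the PDM

    def register(self, client) -> None:
        self._clients.append(client)

    def update(self, key: str, value) -> None:
        """Apply a change (e.g. a geometry or structure edit) and
        broadcast it so every view stays synchronous."""
        self._project_state[key] = value
        for client in self._clients:
            client.notify(key, value)

class VRClient:
    """Minimal stand-in for the VR system's view of the project."""

    def __init__(self) -> None:
        self.view = {}

    def notify(self, key: str, value) -> None:
        self.view[key] = value  # the VR scene updates in real time

# A CAD edit propagates through the session manager to the VR view.
manager = SessionManager()
vr = VRClient()
manager.register(vr)
manager.update("bracket_A/appearance", "red")
```

The bi-directional link described in the abstract would add the reverse path: selections made in VR are reported back to the session manager, which records them in the PDM and assigns tasks.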