Abstract. We present a novel physically-based approach for simulating realistic brittle fracture of impacting bodies in real time. Our method is composed of two novel parts: (1) a fracture initiation method based on modal analysis, and (2) a fast energy-based fracture propagation algorithm. We propose a way to compute the contact durations and contact forces between stiff bodies in order to simulate the damped deformation wave responsible for fracture initiation. As a consequence, our method naturally takes into account the damping properties of the bodies as well as the contact properties when simulating the fracture. To obtain a complete fracture pipeline, we present an efficient way to generate the fragments and their geometric surfaces. These surfaces are sampled on the edges of the physical mesh to visually represent the actual computed fracture surface. As our results show, the computational performance and realism of our method are well suited to physically-based interactive applications.
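The fracture-initiation idea above rests on standard modal analysis: each vibration mode of the impacted body responds as a damped oscillator, and fracture starts when the combined response exceeds the material's strength. The following is a minimal illustrative sketch of that criterion, not the paper's actual algorithm; the mode list, the threshold, and the function names are hypothetical stand-ins.

```python
import math

def modal_amplitude(t, omega, zeta, q0):
    """Free response of one damped vibration mode:
    q(t) = q0 * exp(-zeta*omega*t) * cos(omega_d*t),
    with damped frequency omega_d = omega*sqrt(1 - zeta^2)."""
    omega_d = omega * math.sqrt(1.0 - zeta * zeta)
    return q0 * math.exp(-zeta * omega * t) * math.cos(omega_d * t)

def fracture_initiated(modes, threshold, t_end, dt=1e-4):
    """Check whether the summed modal response ever exceeds a
    (hypothetical) material threshold during the contact duration.
    `modes` is a list of (omega, zeta, q0) tuples excited by the impact."""
    t = 0.0
    while t <= t_end:
        response = sum(modal_amplitude(t, w, z, q) for (w, z, q) in modes)
        if abs(response) > threshold:
            return True
        t += dt
    return False
```

Note how damping enters directly: a strongly damped mode (large `zeta`) decays before it can push the response over the threshold, which mirrors the abstract's point that the body's damping properties influence whether fracture initiates.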
Abstract. In this paper we introduce the combined use of Brain-Computer Interfaces (BCI) and haptic interfaces. We propose to adapt haptic guides based on the mental activity measured by a BCI system. This novel approach is illustrated within a proof-of-concept system: haptic guides are toggled during a path-following task based on a mental workload index provided by a BCI. The aim of this system is to provide haptic assistance only when the user's brain activity reflects a high mental workload. A user study conducted with 8 participants shows that our proof-of-concept is operational and exploitable. Results show that activation of the haptic guides occurs in the most difficult part of the path-following task. Moreover, it increases task performance by 53% while activating assistance only 59% of the time. Taken together, these results suggest that BCI could be used to determine when the user needs assistance during haptic interaction and to enable haptic guides accordingly.
A common weathering effect is the appearance of cracks due to material fractures. Previous exemplar-based aging and weathering methods have either reused images or sought to replicate observed patterns exactly. We introduce a new approach to exemplar-based modeling that creates weathered patterns on synthetic objects by matching the statistics of fracture patterns in a photograph. We present a user study to determine which statistics are correlated to visual similarity and how they are perceived by the user. We then describe a revised physically-based fracture model capable of producing a wide range of crack patterns at interactive rates. We demonstrate how a Bayesian optimization method can determine the parameters of this model so it can produce a pattern with the same key statistics as an exemplar. Finally, we present results using our approach and various exemplars to produce a variety of fracture effects in synthetic renderings of complex environments. The speed of the fracture simulation allows interactive previews of the fractured results and enables its application to large-scale environments.
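The parameter-fitting step above can be pictured as an inverse problem: run the fracture model, measure the resulting pattern statistics, and search for parameters whose statistics match the exemplar's. The sketch below illustrates that loop with a toy forward model and plain random search as a simplified stand-in for the Bayesian optimization used in the paper; `pattern_stats` and its parameters (toughness, seed density) are hypothetical.

```python
import random

def pattern_stats(params):
    """Hypothetical forward model: maps fracture-model parameters
    (toughness, seed density) to summary statistics of the crack
    pattern (mean fragment area, crack density). Stand-in formula."""
    toughness, density = params
    return (1.0 / (density + 0.1), toughness * density)

def fit_parameters(target, n_iters=2000, seed=0):
    """Random-search stand-in for the Bayesian optimization loop:
    sample parameters, simulate, and keep the set whose statistics
    are closest (squared L2 distance) to the exemplar's statistics."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(n_iters):
        params = (rng.uniform(0.0, 2.0), rng.uniform(0.1, 5.0))
        stats = pattern_stats(params)
        err = sum((a - b) ** 2 for a, b in zip(stats, target))
        if err < best_err:
            best, best_err = params, err
    return best, best_err
```

A Bayesian optimizer improves on this by modeling the error surface (e.g. with a Gaussian process) and choosing each new sample where it expects the most improvement, which matters here because every evaluation runs a full fracture simulation.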
Haptic rendering has opened a new range of virtual reality applications, enabling a human user to interact with a virtual world using the sense of touch. This kind of interaction can enhance applications such as computer-assisted design, where 3D manipulation is part of the system. However, building an application with accurate haptic feedback is still challenging, especially for interactions between rigid bodies, where stiff contacts can only be displayed at a high simulation frequency. This paper presents the implementation of a modular haptic display system that relies on two main components: a physical simulation part and a haptic rendering part. For that purpose, we define a generic coupling approach that enables haptic rendering with admittance haptic devices, through a scaling interface that cleanly separates the system of units of the physical simulation from that of the haptic rendering. Four physical simulation libraries are evaluated with respect to haptic rendering quality criteria, based on their behavior in four discriminant test cases. We show that the proposed approach leads to a modular, generic, and stable haptic application.
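A common way to realize this kind of device-simulation coupling is a spring-damper "virtual coupling" between the device pose and a simulated proxy, with an explicit scale factor converting device units into simulation units. The sketch below shows the general idea in one dimension; the gains and function name are hypothetical, and it is not claimed to be the paper's exact formulation.

```python
def coupling_force(x_dev, v_dev, x_proxy, v_proxy, scale, k, b):
    """Spring-damper virtual coupling between a haptic device and its
    simulation proxy (1D). `scale` converts device units into simulation
    units, keeping the two systems of units cleanly separated. `k` and
    `b` are hypothetical stiffness and damping gains. Returns the force
    applied to the proxy, in simulation units."""
    dx = x_dev * scale - x_proxy   # position error, simulation units
    dv = v_dev * scale - v_proxy   # velocity error, simulation units
    return k * dx + b * dv
```

The opposite (scaled) force is fed back to the device, so the user feels the proxy's contacts; keeping the scale in one interface means neither the simulation library nor the device driver needs to know the other's units.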
In complex scenes with many objects, collision detection plays a key role in the simulation performance. This is particularly true in fracture simulation, for two main reasons. One is that fracture fragments tend to exhibit very intensive contact, and the other is that collision detection data structures for new fragments need to be computed on the fly. In this paper, we present novel collision detection algorithms and data structures for real-time simulation of fracturing rigid bodies. We build on a combination of well-known efficient data structures, namely distance fields and sphere trees, making our algorithm easy to integrate into existing simulation engines. We propose novel methods to construct these data structures such that they can be efficiently updated upon fracture events and integrated into a simple yet effective self-adapting contact selection algorithm. Altogether, we drastically reduce the cost of both collision detection and collision response. We have evaluated our global solution for collision detection on challenging scenarios, achieving high frame rates suited to hard real-time applications such as video games or haptics. Our solution opens promising perspectives for complex fracture simulations involving many dynamically created rigid objects.
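The core cheapness of the distance-field/sphere-tree pairing comes from the leaf test: a sphere of one body intersects the other body's surface exactly when the field value at its centre is less than its radius. The following is a minimal sketch of that test, assuming a single analytic sphere as the obstacle's distance field; the function names are illustrative, not from the paper.

```python
import math

def sphere_sdf(center, radius):
    """Signed distance field of a spherical obstacle (negative inside),
    standing in for a precomputed distance field of a rigid body."""
    def sdf(p):
        return math.dist(p, center) - radius
    return sdf

def collect_contacts(leaf_spheres, sdf):
    """Test each leaf sphere of one body's sphere tree against the other
    body's distance field: a contact exists where the field value at the
    sphere centre is below the sphere radius. Returns (centre,
    penetration depth) pairs."""
    contacts = []
    for center, radius in leaf_spheres:
        d = sdf(center)
        if d < radius:                      # sphere overlaps the surface
            contacts.append((center, radius - d))
    return contacts
```

In a full engine the sphere tree is traversed top-down, pruning whole subtrees whose bounding sphere clears the field, and the per-fragment structures are rebuilt incrementally after each fracture event; the leaf test above is the primitive everything else amortizes.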