2016
DOI: 10.1002/cav.1712
Predictable behavior during contact simulation: a comparison of selected physics engines

Abstract: Contact behaviors in physics simulations are important for real-time interactive applications, especially in virtual reality applications where the user's body parts are tracked and interact with the environment via contact. For these contact simulations, it is ideal for small changes in initial conditions to yield predictable changes in the output. Predictable simulation is also key to the success of iterative learning processes, such as learning controllers for manipulation or locomotion tasks. Here, we present…
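The notion of predictability in the abstract can be made concrete with a small perturbation test. The sketch below is not the paper's benchmark protocol; it is an illustrative probe written against PyBullet (the Python interface to Bullet, one of the engines compared), with the box sizes, drop height, step count, and the 1 mm perturbation all chosen arbitrarily for illustration.

    import pybullet as p

    def settle_box(x_offset, steps=480):
        """Drop a small box from a perturbed x position and return its rest position."""
        p.resetSimulation()
        p.setGravity(0, 0, -9.81)
        # Static ground: a large, thin, zero-mass box.
        ground = p.createCollisionShape(p.GEOM_BOX, halfExtents=[5, 5, 0.1])
        p.createMultiBody(baseMass=0, baseCollisionShapeIndex=ground,
                          basePosition=[0, 0, -0.1])
        # Dynamic box dropped from 0.5 m, offset along x by the perturbation.
        box = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.1, 0.1, 0.1])
        body = p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=box,
                                 basePosition=[x_offset, 0, 0.5])
        for _ in range(steps):      # 480 steps is roughly 2 s at the default 240 Hz
            p.stepSimulation()
        pos, _ = p.getBasePositionAndOrientation(body)
        return pos

    p.connect(p.DIRECT)             # headless physics server
    baseline = settle_box(0.0)
    perturbed = settle_box(0.001)   # perturb the initial x position by 1 mm
    print("output shift:", [b - a for a, b in zip(baseline, perturbed)])
    p.disconnect()

Running the same probe against several engines, and over a range of perturbation sizes, gives a rough feel for whether small input changes produce comparably small output changes, which is the property the paper investigates systematically.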

Cited by 26 publications (9 citation statements)
References 9 publications (14 reference statements)
“…Boeing et al [23] compared PhysX, Bullet, JigLib, Newton, Open Dynamics Engine (ODE), Tokamak and True Axis; they reported that Bullet performed best overall, although no physics engine was best at all tasks. Chung et al [24] likewise found, when testing Bullet, Dynamic Animation and Robotics Toolkit (DART), MuJoCo, and ODE, that no one engine performed best at all tasks, stating that for different tasks and conditions a different physics engine was found to be better. These findings are further corroborated by Gonzalez-Badillo et al [25], who showed that PhysX performs better than Bullet for non-complex geometries but is unable to simulate more complex geometries to the same degree as Bullet.…”
Section: B. Physics Engines
confidence: 98%
“…For a comparison of physics engines, we refer the reader to two recent studies (Erez et al, 2015; Chung and Pollard, 2016). Erez et al (2015) compared ODE, Bullet, PhysX, Havok, and MuJoCo.…”
Section: Analytical Approaches
confidence: 99%
“…For robotics, this is MuJoCo, while gaming engines shine in gaming-related trials; no engine emerges as a clear winner. Chung and Pollard (2016) compared Bullet, DART, MuJoCo, and ODE with regard to contact simulations while focusing on the predictability of behavior. Their main result is that the surveyed engines are sensitive to small changes in initial conditions, emphasizing that parameter tuning is important.…”
Section: Analytical Approaches
confidence: 99%
“…Commonly, test scenes are designed to assess elements of the physics engine that are likely to deviate from the real world. Such tests evaluate the numerical integration, constraint stability, collision detection and material properties of the physics engine [17], [18], [19]. Tests are evaluated with reference to real-world behaviour by drawing comparisons with human expectations of the system, or by deriving the behaviour from first principles.…”
Section: Related Work
confidence: 99%
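As a rough illustration of the kind of test scene described in the excerpt above, the following sketch (our own construction, not one of the scenes from [17], [18], [19]) stacks a few boxes in PyBullet and measures how far the top box drifts while nominally at rest; large drift suggests contact or constraint instability. The stack height, box size, and simulation duration are arbitrary choices.

    import pybullet as p

    p.connect(p.DIRECT)                                # headless physics server
    p.setGravity(0, 0, -9.81)
    ground = p.createCollisionShape(p.GEOM_BOX, halfExtents=[5, 5, 0.1])
    p.createMultiBody(baseMass=0, baseCollisionShapeIndex=ground,
                      basePosition=[0, 0, -0.1])       # static floor

    half = 0.1                                         # box half-extent (m)
    shape = p.createCollisionShape(p.GEOM_BOX, halfExtents=[half] * 3)
    tower = [p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=shape,
                               basePosition=[0, 0, half + i * 2 * half])
             for i in range(5)]                        # five-box stack, initially at rest

    start, _ = p.getBasePositionAndOrientation(tower[-1])
    for _ in range(240 * 10):                          # about 10 s at the default 240 Hz
        p.stepSimulation()
    end, _ = p.getBasePositionAndOrientation(tower[-1])

    drift = sum((e - s) ** 2 for s, e in zip(start, end)) ** 0.5
    print(f"top-box drift after 10 s: {drift:.4f} m")  # large drift hints at instability
    p.disconnect()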