Evaluation of real-time physics simulation systems
Boeing, A., & Bräunl, T. (2007). Evaluation of real-time physics simulation systems. Proceedings of the 5th International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia. doi:10.1145/1321261.1321312
Summary
In this paper Boeing & Bräunl evaluate and compare a number of game physics engines. They do this through the Physics Abstraction Layer (PAL), a common interface to multiple physics engines that was developed by the first author. They run a number of tests:
- Integrator test — looking at the integrator error. All engines show some error, since numerical integrators are by definition approximations (a toy version of this test is sketched after the list).
- Bounce height test — looking at how high a ball bounces for different coefficients of restitution. Most of the engines show restitution-dependent behavior, though some (e.g. ODE) don’t because they resolve collisions using repulsive forces rather than impulse-based methods (which conserve momentum); the impulse rule is illustrated below.
- Friction test — looking at the angle at which a box starts moving on an inclined plane as a function of the coefficient of friction. There is significant variability here, though all engines require a larger angle for larger coefficients (the analytic expectation is worked out below).
- Constraint stability test — looking at how stable constraints are (i.e. whether they allow objects to drift apart or not). For all engines the error increased with the number of constraints (linked together in a chain).
- Collision test — looking at whether objects pass through each other after a collision, and how much interpenetration error there is. Several engines failed this test, allowing objects to pass through. Bullet was the only engine to converge on zero interpenetration error, though with larger time steps Bullet also failed the test (a toy tunneling example is sketched below).
- Stacking test with boxes — looking at whether a stack of boxes falls over, inspected visually. All engines passed.
- Stacking test with spheres — looking at whether a stack of spheres falls over, inspected visually. All engines failed (the spheres stayed stacked on top of each other).
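To make the integrator test concrete, here is a minimal sketch of the idea (my own toy setup, not the paper's benchmark): integrate a particle thrown upward under gravity with explicit Euler and compare against the analytic solution. The error shrinks as the step size shrinks but never reaches zero, which is why every engine shows some integrator error.

```python
# Toy version of the integrator test: integrate a particle thrown upward
# under gravity with explicit Euler and compare to the exact parabola.
# The initial velocity and step sizes are arbitrary, not the paper's values.
g = -9.81    # gravitational acceleration (m/s^2)
v0 = 10.0    # initial upward velocity (m/s)

def euler_height(t_end, dt):
    """Explicit Euler integration of y'' = g, starting from y = 0, y' = v0."""
    steps = int(round(t_end / dt))
    y, v = 0.0, v0
    for _ in range(steps):
        y += v * dt   # position update uses the old velocity
        v += g * dt
    return y

def exact_height(t):
    return v0 * t + 0.5 * g * t * t

t_end = 1.0
for dt in (0.1, 0.01, 0.001):
    err = abs(euler_height(t_end, dt) - exact_height(t_end))
    print(f"dt={dt:<6} |error| = {err:.5f} m")   # error is O(dt)
```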
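The restitution point in the bounce test is easiest to see with the impulse rule itself: an impulse-based solver sets the outgoing normal speed to e times the incoming speed, so the rebound height scales with e². A penalty scheme that pushes overlapping bodies apart with a repulsive force has no such direct dependence, which is consistent with the ODE behavior noted above. A one-dimensional sketch (not any particular engine's code):

```python
# Impulse-based restitution in 1D: the contact impulse reverses the normal
# velocity and scales it by the coefficient of restitution e, so the
# rebound height is e^2 times the drop height. Drop height is arbitrary.
import math

g = 9.81
drop_height = 1.0

for e in (0.25, 0.5, 0.75, 1.0):
    v_in = math.sqrt(2 * g * drop_height)  # impact speed from free fall
    v_out = e * v_in                       # impulse-based restitution rule
    rebound = v_out ** 2 / (2 * g)         # height reached after the bounce
    print(f"e={e:<5} rebound = {rebound:.3f} m (= e^2 * drop height)")
```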
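For the friction test there is a clean analytic expectation under Coulomb friction: the box starts sliding once the downslope component of gravity exceeds the maximum static friction force, i.e. once tan(θ) > μ. The critical angles for a few coefficients (my own choice of values):

```python
# Critical incline angle under Coulomb friction: the box starts to slide
# when m*g*sin(theta) > mu * m*g*cos(theta), i.e. when tan(theta) > mu.
import math

for mu in (0.1, 0.3, 0.5, 1.0):
    theta_crit = math.degrees(math.atan(mu))
    print(f"mu={mu:<4} critical angle = {theta_crit:.1f} deg")
```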
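Finally, the time-step caveat from the collision test is easy to reproduce with a toy discrete collision check: if a body travels more than the obstacle's thickness in one step, none of the sampled positions may overlap the obstacle and the body tunnels straight through. The speeds, sizes, and naive point-in-interval test below are my own, not the paper's setup.

```python
# Toy tunneling demo: a fast point body is only tested for overlap at
# discrete sampled positions, so a large time step can skip a thin wall.
# All numbers are arbitrary.
wall_lo, wall_hi = 10.2, 10.7   # a 0.5 m thick wall along x
speed = 100.0                   # m/s, constant velocity along x

def hits_wall(dt, t_end=0.5):
    x = 0.0
    for _ in range(int(t_end / dt)):
        x += speed * dt
        if wall_lo <= x <= wall_hi:   # naive discrete overlap test
            return True
    return False

for dt in (0.001, 0.005, 0.01):
    print(f"dt={dt:<6} step length = {speed * dt:>4.1f} m"
          f" -> collision detected: {hits_wall(dt)}")
```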
Takeaways
It’s a little hard to interpret these results since all the physics engines can be tweaked in various ways depending on what you want them to do. Also, the last test—stacking with spheres—seems a bit weird to me. Physically, the results they got are accurate; the authors complained that the engines should have added noise in order to make the scenario realistic. I disagree: for a physics engine, you want reproducible results. If, as a programmer, you want the engine to incorporate noise, you can add it yourself. It’s not even clear here what source of noise they would prefer the engines to implement. (Positional noise? Small random perturbations?)