Hamiltonian Graph Networks with ODE Integrators

Sep 27, 2019
10 pages

Abstract (submitter):
We introduce an approach for imposing physically informed inductive biases in learned simulation models. We combine graph networks with a differentiable ordinary differential equation integrator as a mechanism for predicting future states, and a Hamiltonian as an internal representation. We find that our approach outperforms baselines without these biases in terms of predictive accuracy, energy accuracy, and zero-shot generalization to time-step sizes and integrator orders not experienced during training. This advances the state-of-the-art of learned simulation, and in principle is applicable beyond physical domains.
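The mechanism described in the abstract — a learned Hamiltonian whose autodiff gradients supply Hamilton's equations to a differentiable integrator — can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: a plain MLP stands in for the graph network, a single leapfrog-style step stands in for the configurable integrator, and all names, shapes, and hyperparameters are assumptions.

```python
# Minimal sketch (not the paper's code): a learned Hamiltonian H(q, p) whose
# gradients give Hamilton's equations, integrated with one differentiable
# leapfrog-style step. An MLP stands in for the graph network; every name and
# hyperparameter here is an illustrative assumption.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Random MLP parameters; the real model is a graph network over particles."""
    params = []
    for k, (m, n) in zip(jax.random.split(key, len(sizes) - 1),
                         zip(sizes[:-1], sizes[1:])):
        params.append((jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
    return params

def hamiltonian(params, q, p):
    """Scalar H(q, p) predicted by the network (MLP here, graph network in the paper)."""
    x = jnp.concatenate([q.ravel(), p.ravel()])
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return jnp.sum(x @ w + b)

def hamiltonian_dynamics(params, q, p):
    """Hamilton's equations via autodiff: dq/dt = dH/dp, dp/dt = -dH/dq."""
    dHdq, dHdp = jax.grad(hamiltonian, argnums=(1, 2))(params, q, p)
    return dHdp, -dHdq

def leapfrog_step(params, q, p, dt):
    """One leapfrog-style step (exactly symplectic only for separable H);
    the paper evaluates several integrators, orders, and step sizes."""
    _, dp = hamiltonian_dynamics(params, q, p)
    p_half = p + 0.5 * dt * dp
    dq, _ = hamiltonian_dynamics(params, q, p_half)
    q_next = q + dt * dq
    _, dp_next = hamiltonian_dynamics(params, q_next, p_half)
    return q_next, p_half + 0.5 * dt * dp_next

def step_loss(params, q, p, q_true, p_true, dt=0.01):
    """Training signal: the integrated next state vs. ground truth; gradients
    flow through the integrator because the whole step is differentiable."""
    q_next, p_next = leapfrog_step(params, q, p, dt)
    return jnp.mean((q_next - q_true) ** 2) + jnp.mean((p_next - p_true) ** 2)

# Example wiring (hypothetical shapes): 3 particles in 2-D -> 12 network inputs.
params = init_mlp(jax.random.PRNGKey(0), sizes=[12, 64, 64, 1])
q = jax.random.normal(jax.random.PRNGKey(1), (3, 2))
p = jax.random.normal(jax.random.PRNGKey(2), (3, 2))
q_next, p_next = leapfrog_step(params, q, p, dt=0.01)
```

Because the integrator is an explicit, differentiable function of the step size, a model trained with one step size can in principle be rolled out with other step sizes or integrator orders at test time, which is the zero-shot generalization the abstract refers to.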
A. Data generation

A.1 Initial conditions
Each particle i had initial conditions (all quantities in SI units) drawn from independent uniform distributions:
  • Mass m_i ∈ [0.1, 1]
  • Spring constant k_i ∈ [0.5, 1]
  • Initial position q_i^0 ∈ [-1, 1]^2
  • Initial velocity v_i^0 ∈ [-3, 3]^2, with initial momentum p_i^0 = m_i v_i^0

A.2 Physics simulator
Our dataset consists of simulations of particle-spring systems in which particle j exerts a Hooke's-law force on particle i, F_ij = -k_ij · (q_i - q_j), where k_ij …
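The data-generation recipe above can be turned into a short sampling and simulation sketch. This is illustrative only: the excerpt cuts off before defining the pairwise spring constant k_ij, so the mean of k_i and k_j used below is purely a placeholder assumption, as are the explicit-Euler step and the step size used to roll trajectories out.

```python
# Illustrative sketch of the data-generation recipe above (not the authors' code).
# Initial conditions follow the stated uniform ranges; the pairwise spring
# constant k_ij is NOT defined in the excerpt, so the combination rule below
# is a placeholder assumption.
import jax
import jax.numpy as jnp

def sample_initial_conditions(key, n_particles):
    k_m, k_k, k_q, k_v = jax.random.split(key, 4)
    mass = jax.random.uniform(k_m, (n_particles,), minval=0.1, maxval=1.0)
    spring_k = jax.random.uniform(k_k, (n_particles,), minval=0.5, maxval=1.0)
    q0 = jax.random.uniform(k_q, (n_particles, 2), minval=-1.0, maxval=1.0)
    v0 = jax.random.uniform(k_v, (n_particles, 2), minval=-3.0, maxval=3.0)
    p0 = mass[:, None] * v0  # momenta p_i^0 = m_i * v_i^0
    return mass, spring_k, q0, p0

def spring_forces(spring_k, q):
    """Hooke's force on particle i from particle j: F_ij = -k_ij * (q_i - q_j)."""
    # Placeholder combination rule (assumption): k_ij = (k_i + k_j) / 2.
    k_ij = 0.5 * (spring_k[:, None] + spring_k[None, :])
    diff = q[:, None, :] - q[None, :, :]          # q_i - q_j for all pairs
    pair_forces = -k_ij[:, :, None] * diff
    mask = 1.0 - jnp.eye(q.shape[0])              # drop the i == j self-term
    return jnp.sum(pair_forces * mask[:, :, None], axis=1)

def euler_step(mass, spring_k, q, p, dt=1e-3):
    """One explicit-Euler step for rolling out trajectories (step size illustrative)."""
    q_next = q + dt * p / mass[:, None]
    p_next = p + dt * spring_forces(spring_k, q)
    return q_next, p_next

# Example: sample a 5-particle system and advance it one step.
mass, spring_k, q, p = sample_initial_conditions(jax.random.PRNGKey(0), n_particles=5)
q, p = euler_step(mass, spring_k, q, p)
```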