Preprint, 2019
DOI: 10.48550/arxiv.1909.13789

Hamiltonian Generative Networks

Abstract: The Hamiltonian formalism plays a central role in classical and quantum physics. Hamiltonians are the main tool for modelling the continuous time evolution of systems with conserved quantities, and they come equipped with many useful properties, like time reversibility and smooth interpolation in time. These properties are important for many machine learning problems - from sequence prediction to reinforcement learning and density modelling - but are not typically provided out of the box by standard tools such as recurrent neural networks.
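
As a concrete illustration of the time reversibility the abstract mentions, here is a minimal sketch (not from the paper; the mass-spring Hamiltonian, the leapfrog integrator, and all names are illustrative assumptions) that integrates Hamilton's equations forward with a symplectic leapfrog step and then recovers the initial state by negating the time step.

```python
import torch

# Unit-mass mass-spring Hamiltonian H(q, p) = p^2/2 + q^2/2 (illustrative only;
# the paper learns H with a neural network from pixel observations).
def hamiltonian(q, p):
    return 0.5 * p**2 + 0.5 * q**2

def hamiltonian_grads(q, p):
    """Return (dH/dq, dH/dp) via autograd."""
    q = q.clone().requires_grad_(True)
    p = p.clone().requires_grad_(True)
    dHdq, dHdp = torch.autograd.grad(hamiltonian(q, p), (q, p))
    return dHdq, dHdp

def leapfrog_step(q, p, dt):
    """One symplectic leapfrog step of dq/dt = dH/dp, dp/dt = -dH/dq."""
    dHdq, _ = hamiltonian_grads(q, p)
    p = p - 0.5 * dt * dHdq          # half kick
    _, dHdp = hamiltonian_grads(q, p)
    q = q + dt * dHdp                # full drift
    dHdq, _ = hamiltonian_grads(q, p)
    p = p - 0.5 * dt * dHdq          # half kick
    return q, p

q, p = torch.tensor(1.0), torch.tensor(0.0)
for _ in range(100):                 # integrate forward in time...
    q, p = leapfrog_step(q, p, dt=0.1)
for _ in range(100):                 # ...then backward by negating dt
    q, p = leapfrog_step(q, p, dt=-0.1)
print(q, p)  # ~ (1.0, 0.0): the initial state, up to floating-point error
```

Because the leapfrog map is symmetric in the time step, a forward step composed with its negated-step inverse returns the starting state exactly, which is the reversibility property the abstract highlights.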

Cited by 32 publications (46 citation statements)
References 7 publications
“…Also, [Greydanus et al., 2019] used a Hamiltonian prior on a NN and studied conservative systems such as a mass-spring system and a pendulum (ideal and real data). [Toth et al., 2019] introduced Hamiltonian Generative Networks to model conserved densities satisfying Hamiltonian dynamics, with applications to image analysis, for example.…”
Section: Related Work
confidence: 99%
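
For readers unfamiliar with the Hamiltonian prior of [Greydanus et al., 2019] mentioned in this excerpt, the following is a minimal sketch of the idea, not the authors' reference implementation, and the names (HNet, hnn_loss) are assumptions: a scalar network H(q, p) is trained so that its symplectic gradient matches observed time derivatives.

```python
import torch
import torch.nn as nn

class HNet(nn.Module):
    """Scalar Hamiltonian H_theta(q, p) parameterized by a small MLP."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 1))

    def forward(self, qp):           # qp: (batch, 2) phase-space points (q, p)
        return self.mlp(qp)

def hnn_loss(net, qp, qp_dot):
    """Match Hamilton's equations dq/dt = dH/dp, dp/dt = -dH/dq to data."""
    qp = qp.requires_grad_(True)
    dH = torch.autograd.grad(net(qp).sum(), qp, create_graph=True)[0]
    pred = torch.stack([dH[:, 1], -dH[:, 0]], dim=1)   # (dq/dt, dp/dt)
    return ((pred - qp_dot) ** 2).mean()

# Toy usage on ideal mass-spring data, where (dq/dt, dp/dt) = (p, -q):
net = HNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
qp = torch.randn(256, 2)
qp_dot = torch.stack([qp[:, 1], -qp[:, 0]], dim=1)
for _ in range(200):
    opt.zero_grad()
    hnn_loss(net, qp, qp_dot).backward()
    opt.step()
```

Learning the scalar H rather than the vector field directly is what bakes energy conservation into the model, since the symplectic gradient of any H conserves H along trajectories.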
“…Such regularizers are based on physical conservation principles or on the governing equations of the problems themselves, and can be used both for improving predictions [21] and for approximating PDEs [22,23]. The other category consists of modifying the ML/DL architecture to incorporate physical information: by adding intermediate physical variables to conventional networks [24,25], by encoding invariances and symmetries within the architecture [26,27,28,29,30,31,32], and by providing other domain-specific physical knowledge that does not correspond to known invariances or symmetries but provides meaningful structure to the optimization process [33,34,35,36].…”
Section: Introduction
confidence: 99%
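
A minimal sketch of the first category described in this excerpt, a physics-based regularizer that penalizes the residual of a governing equation; the 1D heat equation u_t = alpha * u_xx, the collocation points, and every name below are illustrative assumptions rather than any cited paper's code.

```python
import math
import torch
import torch.nn as nn

# Network u_theta(x, t) fit to measurements and regularized by the PDE residual.
net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

def pde_residual(net, x, t, alpha=0.1):
    """Residual u_t - alpha * u_xx at collocation points, via autograd."""
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = net(torch.stack([x, t], dim=1)).squeeze(-1)
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

x_obs, t_obs = torch.rand(64), torch.rand(64)
u_obs = torch.sin(math.pi * x_obs) * torch.exp(-t_obs)  # stand-in measurements
x_col, t_col = torch.rand(256), torch.rand(256)         # collocation points

u_pred = net(torch.stack([x_obs, t_obs], dim=1)).squeeze(-1)
data_loss = ((u_pred - u_obs) ** 2).mean()
physics_loss = (pde_residual(net, x_col, t_col) ** 2).mean()
total_loss = data_loss + 1.0 * physics_loss  # governing equation as regularizer
```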
“…Modern machine learning techniques without any inductive bias often fail at this task because the networks do not generalize well and lack robustness. To overcome these problems, we therefore combine the latest advances in physics-enhanced neural networks based on dynamic invariants, such as Hamiltonian [1,2,3] or Lagrangian [4,5] NNs, with additional inductive bias to gain better predictive capabilities and to reduce the requisite training data. We propose a novel regularization term that can readily be added to the training loss function of most machine learning frameworks, and show that it leads to predictive performance and data efficiency that outperform the original framework for all tested examples and under initial conditions not contained in the training data.…”
Section: Introduction
confidence: 99%
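
The cited paper's exact regularization term is not given in this excerpt, so the following is a hypothetical sketch of a term of this general kind: a penalty on the drift of a learned Hamiltonian along a predicted rollout, added to a standard training loss. All names and the specific form are assumptions.

```python
import torch
import torch.nn as nn

# Learned scalar energy H_theta(q, p); H should be constant along trajectories.
H_net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

def conservation_penalty(H_net, rollout):
    """rollout: (T, batch, 2) predicted (q, p) states along a trajectory."""
    H = H_net(rollout).squeeze(-1)   # (T, batch) energies along the rollout
    drift = H[1:] - H[:1]            # deviation from the initial energy
    return (drift ** 2).mean()

rollout = torch.randn(20, 8, 2)      # stand-in for a model's predicted trajectory
penalty = conservation_penalty(H_net, rollout)
# total_loss = prediction_loss + lam * penalty  # added to the usual training loss
```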