Physical Symmetries Embedded in Neural Networks

Citation:

M. Mattheakis, P. Protopapas, D. Sondak, M. Di Giovanni, and E. Kaxiras. 4/2019. “Physical Symmetries Embedded in Neural Networks.” arXiv:1904.08991.

Abstract:

Neural networks are a central technique in machine learning. Recent years have seen a wave of interest in applying neural networks to physical systems for which the governing dynamics are known and expressed through differential equations. Two fundamental challenges facing the development of neural networks in physics applications are their lack of interpretability and their physics-agnostic design. The focus of the present work is to embed physical constraints into the structure of the neural network to address the second fundamental challenge. By constraining tunable parameters (such as weights and biases) and adding special layers to the network, the desired constraints are guaranteed to be satisfied without the need for explicit regularization terms. This is demonstrated on supervised and unsupervised networks for two basic symmetries: even/odd symmetry of a function and energy conservation. In the supervised case, the network with embedded constraints is shown to perform well on regression problems while simultaneously obeying the desired constraints, whereas a traditional network fits the data but violates the underlying constraints. Finally, a new unsupervised neural network is proposed that guarantees energy conservation through an embedded symplectic structure. The symplectic neural network is used to solve a system of energy-conserving differential equations and outperforms an unsupervised, non-symplectic neural network.
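The even/odd symmetry constraint described in the abstract can be illustrated with a minimal sketch: symmetrizing a network's output over the input reflection guarantees the constraint for any values of the weights and biases, so no regularization term is needed. The network sizes, variable names, and plain NumPy implementation below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny one-hidden-layer MLP N(x); sizes are arbitrary for illustration.
W1 = rng.normal(size=(1, 16))
b1 = rng.normal(size=16)
W2 = rng.normal(size=(16, 1))
b2 = rng.normal(size=1)

def mlp(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def even_net(x):
    # Symmetrization layer: averaging N(x) and N(-x) yields an output
    # that is even in x for ANY weights, so the constraint is built in.
    return 0.5 * (mlp(x) + mlp(-x))

def odd_net(x):
    # The antisymmetric combination is odd in x by construction.
    return 0.5 * (mlp(x) - mlp(-x))

x = rng.normal(size=(5, 1))
assert np.allclose(even_net(x), even_net(-x))
assert np.allclose(odd_net(x), -odd_net(-x))
```

Because the symmetry holds identically at initialization and after every gradient update, training can focus entirely on fitting the data within the constrained function class.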
Last updated on 01/29/2020