Publications by Year: 2021

Shaan Desai, Marios Mattheakis, David Sondak, Pavlos Protopapas, and Stephen Roberts. 9/2021. “Port-Hamiltonian Neural Networks for Learning Explicit Time-Dependent Dynamical Systems.” Phys. Rev. E, 104, 034312. Abstract:
Accurately learning the temporal behavior of dynamical systems requires models with well-chosen learning biases. Recent innovations embed the Hamiltonian and Lagrangian formalisms into neural networks and demonstrate a significant improvement over other approaches in predicting trajectories of physical systems. These methods generally tackle autonomous systems, which depend only implicitly on time, or systems for which a control signal is known a priori. Despite this success, many real-world dynamical systems are non-autonomous: they are driven by time-dependent forces and experience energy dissipation. In this study, we address the challenge of learning from such non-autonomous systems by embedding the port-Hamiltonian formalism into neural networks, a versatile framework that can capture energy dissipation and time-dependent control forces. We show that the proposed port-Hamiltonian neural network can efficiently learn the dynamics of nonlinear physical systems of practical interest and accurately recover the underlying stationary Hamiltonian, time-dependent force, and dissipative coefficient. A promising outcome of our network is its ability to learn and predict chaotic systems such as the Duffing equation, for which the trajectories are typically hard to learn.
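The port-Hamiltonian structure that the paper embeds as a learning bias can be illustrated with a minimal sketch. Here the Hamiltonian, forcing term, and dissipation coefficient are written out analytically for a forced, damped Duffing-type system (in the paper these components are learned by the network); the specific potential, force amplitude, and integration scheme below are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Port-Hamiltonian equations of motion for one degree of freedom:
#   dq/dt =  dH/dp
#   dp/dt = -dH/dq - c * dH/dp + F(t)
# with H(q, p) = p^2/2 + V(q), dissipation coefficient c, and forcing F(t).

def port_hamiltonian_rhs(t, q, p, c=0.1, f0=0.3, omega=1.2):
    dH_dq = q**3 - q      # gradient of the Duffing potential V(q) = q^4/4 - q^2/2
    dH_dp = p             # gradient of the kinetic term p^2/2
    dq = dH_dp
    dp = -dH_dq - c * dH_dp + f0 * np.cos(omega * t)  # dissipation + time-dependent force
    return dq, dp

def integrate(q0, p0, dt=1e-3, steps=5000):
    """Forward-Euler rollout of the port-Hamiltonian vector field."""
    q, p, t = q0, p0, 0.0
    traj = [(q, p)]
    for _ in range(steps):
        dq, dp = port_hamiltonian_rhs(t, q, p)
        q, p, t = q + dt * dq, p + dt * dp, t + dt
        traj.append((q, p))
    return np.array(traj)

traj = integrate(1.0, 0.0)
```

A learned model would replace the analytic `dH_dq`, `dH_dp`, `c`, and forcing term with network outputs while keeping the same structural form.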
Shaan Desai, Marios Mattheakis, and Stephen Roberts. 9/2021. “Variational Integrator Graph Networks for Learning Energy Conserving Dynamical Systems.” Phys. Rev. E, 104, 035310. Abstract:
Recent advances show that neural networks embedded with physics-informed priors significantly outperform vanilla neural networks in learning and predicting the long-term dynamics of complex physical systems from noisy data. Despite this success, there has been only limited study of how to optimally combine physics priors to improve predictive performance. To tackle this problem we unpack and generalize recent innovations into individual inductive-bias segments. As such, we are able to systematically investigate all possible combinations of inductive biases, of which existing methods are a natural subset. Using this framework we introduce Variational Integrator Graph Networks, a novel method that unifies the strengths of existing approaches by combining an energy constraint, high-order symplectic variational integrators, and graph neural networks. We demonstrate, across an extensive ablation study, that the proposed unifying framework outperforms existing methods in both data-efficient learning and predictive accuracy, across the single- and many-body problems studied in recent literature. We empirically show that the improvements arise because high-order variational integrators combined with a potential-energy constraint induce coupled learning of generalized position and momentum updates, which can be formalized via the partitioned Runge-Kutta method.
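The coupled position-momentum updates the abstract refers to can be sketched with the simplest symplectic variational integrator, velocity Verlet (leapfrog), a second-order member of the partitioned Runge-Kutta family. This is a generic illustration of the integrator class, not the paper's higher-order scheme or its graph-network components; unit mass and a harmonic potential are assumed for brevity.

```python
import numpy as np

def leapfrog_step(q, p, grad_V, dt):
    """One velocity-Verlet step: a second-order symplectic variational integrator.

    Note the partitioned structure: the momentum update uses the position,
    and the position update uses the half-step momentum.
    """
    p_half = p - 0.5 * dt * grad_V(q)      # half kick
    q_new = q + dt * p_half                # drift (unit mass assumed)
    p_new = p_half - 0.5 * dt * grad_V(q_new)  # half kick
    return q_new, p_new

# Harmonic oscillator V(q) = q^2 / 2, so grad_V(q) = q; energy should stay
# bounded near its initial value rather than drifting, a hallmark of
# symplectic integration.
grad_V = lambda q: q
q, p = 1.0, 0.0
energies = []
for _ in range(10_000):
    q, p = leapfrog_step(q, p, grad_V, 1e-2)
    energies.append(0.5 * p**2 + 0.5 * q**2)
```

In a learned model, `grad_V` would be the gradient of a neural potential-energy term, which is how the energy constraint couples the two updates.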
Tiago A. E. Ferreira, Marios Mattheakis, and Pavlos Protopapas. 2021. “A New Artificial Neuron Proposal with Trainable Simultaneous Local and Global Activation Function.” Abstract:
The activation function plays a fundamental role in the artificial neural network learning process. However, there is no obvious choice or procedure to determine the best activation function, which depends on the problem. This study proposes a new artificial neuron, named the Global-Local Neuron, with a trainable activation function composed of two components, a global one and a local one. The global component, as the term is used here, refers to a mathematical function describing a general feature present in the whole problem domain. The local component is a function that can represent a localized behavior, like a transient or a perturbation. This new neuron can determine the importance of each activation-function component during the learning phase. Depending on the problem, it results in a purely global, a purely local, or a mixed global and local activation function after the training phase. Here, the trigonometric sine function was employed for the global component and the hyperbolic tangent for the local component. The proposed neuron was tested on problems where the target was a purely global function, a purely local function, or a composition of global and local functions. Two classes of test problems were investigated: regression problems and the solution of differential equations. The experimental tests demonstrated the superior performance of the Global-Local Neuron network compared with simple neural networks using sine or hyperbolic-tangent activation functions, and with a hybrid network that combines these two simple networks.
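The Global-Local activation described above can be sketched as a trainable mix of the two named components, sine (global) and hyperbolic tangent (local). The abstract does not specify the parameterization; the sigmoid-squashed mixing weight below is our assumption, chosen so the learned weight stays in (0, 1) and the extremes recover a purely global or purely local activation.

```python
import numpy as np

def global_local_activation(z, alpha_raw):
    """Trainable combination of a global (sin) and local (tanh) component.

    alpha_raw is an unconstrained trainable parameter; the sigmoid maps it
    to a mixing weight alpha in (0, 1). alpha -> 1 gives a purely global
    (sine) activation, alpha -> 0 a purely local (tanh) one.
    """
    alpha = 1.0 / (1.0 + np.exp(-alpha_raw))
    return alpha * np.sin(z) + (1.0 - alpha) * np.tanh(z)

z = np.linspace(-2.0, 2.0, 5)
mixed = global_local_activation(z, alpha_raw=0.0)   # equal mix of sin and tanh
```

During training, `alpha_raw` would be updated by gradient descent along with the layer weights, letting each neuron settle on a global, local, or mixed activation as the problem demands.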