Publications by Type: Manuscript

Marios Mattheakis, David Sondak, Akshunna S. Dogra, and Pavlos Protopapas. Submitted. “Hamiltonian neural networks for solving differential equations”.


There has been a wave of interest in applying machine learning to study dynamical systems. We present a Hamiltonian neural network that solves the differential equations that govern dynamical systems. This is an equation-driven machine learning method in which the optimization of the network depends solely on the predicted functions, without using any ground-truth data. The model learns solutions that satisfy, up to an arbitrarily small error, Hamilton's equations and, therefore, conserve the Hamiltonian invariants. The choice of an appropriate activation function drastically improves the predictability of the network. Moreover, an error analysis is derived, showing that the numerical errors depend on the overall network performance. The Hamiltonian network is then employed to solve the equations for the nonlinear oscillator and the chaotic Hénon-Heiles dynamical system. In both systems, a symplectic Euler integrator requires two orders of magnitude more evaluation points than the Hamiltonian network in order to achieve the same order of numerical error in the predicted phase-space trajectories.
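As a rough illustration of the equation-driven idea (not the authors' code), the sketch below builds a tiny network t → (q, p) with a sine activation, differentiates it analytically in time, and evaluates the residuals of Hamilton's equations that a data-free loss would minimize. The network size, the sine activation choice, and the cubic-oscillator Hamiltonian H = p²/2 + q²/2 + q⁴/4 are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer network t -> (q, p); hidden width is an arbitrary choice.
H_UNITS = 16
W1 = rng.normal(size=(H_UNITS, 1))
b1 = rng.normal(size=(H_UNITS, 1))
W2 = rng.normal(size=(2, H_UNITS)) / H_UNITS

def net_and_derivative(t):
    """Return (q, p) and their analytic time derivatives at points t (shape (n,))."""
    z = W1 @ t[None, :] + b1           # (H, n) pre-activations
    out = W2 @ np.sin(z)               # rows are q(t), p(t)
    dout = W2 @ (np.cos(z) * W1)       # chain rule: d/dt sin(W1 t + b1)
    return out, dout

def hamilton_residuals(t):
    """Residuals of Hamilton's equations for the assumed H = p^2/2 + q^2/2 + q^4/4."""
    out, dout = net_and_derivative(t)
    q, p = out
    dq, dp = dout
    r1 = dq - p                        # dq/dt =  dH/dp = p
    r2 = dp + q + q**3                 # dp/dt = -dH/dq = -(q + q^3)
    return r1, r2

t = np.linspace(0.0, 1.0, 8)
r1, r2 = hamilton_residuals(t)
```

A full implementation would train the weights by minimizing the mean of r1² + r2² (plus an initial-condition term) with automatic differentiation; driving these residuals to zero is what makes the learned trajectory satisfy Hamilton's equations without ground-truth data.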


Tiago A. E. Ferreira, Marios Mattheakis, and Pavlos Protopapas. 2021. “A New Artificial Neuron Proposal with Trainable Simultaneous Local and Global Activation Function”.
The activation function plays a fundamental role in the artificial neural network learning process. However, there is no obvious choice or procedure to determine the best activation function, which depends on the problem. This study proposes a new artificial neuron, named Global-Local Neuron, with a trainable activation function composed of two components, a global and a local. The term global component, as used here, refers to a mathematical function describing a general feature present in all problem domains. The local component is a function that can represent a localized behavior, like a transient or a perturbation. This new neuron can determine the importance of each activation-function component during the learning phase. Depending on the problem, it results in a purely global, purely local, or mixed global-and-local activation function after the training phase. Here, the trigonometric sine function was employed for the global component and the hyperbolic tangent for the local component. The proposed neuron was tested on problems where the target was a purely global function, a purely local function, or a composition of global and local functions. Two classes of test problems were investigated: regression problems and differential-equation solving. The experimental tests demonstrated the Global-Local Neuron network's superior performance compared with simple neural networks using sine or hyperbolic tangent activation functions, and with a hybrid network that combines these two simple neural networks.
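A minimal single-neuron sketch of the idea, under the assumption that the trainable activation is a weighted mix α·sin(x) + β·tanh(x) (the exact parameterization in the paper may differ). Because this toy output is linear in the mixing weights, ordinary least squares stands in for the gradient-based training the paper uses: fitted to a purely global target y = sin(x), the weights should select the global component.

```python
import numpy as np

def gln(x, alpha, beta):
    """Global-Local Neuron activation: trainable mix of a global (sin)
    and a local (tanh) component. The mixing form is an assumption."""
    return alpha * np.sin(x) + beta * np.tanh(x)

# Fit the mixing weights to a purely global target y = sin(x).
x = np.linspace(-6.0, 6.0, 200)
y = np.sin(x)
A = np.column_stack([np.sin(x), np.tanh(x)])   # design matrix: the two components
(alpha, beta), *_ = np.linalg.lstsq(A, y, rcond=None)
```

After fitting, α is essentially 1 and β essentially 0: the neuron has learned that the problem is purely global. A purely local target (e.g. a tanh step) would drive the weights the other way, and a mixed target would keep both components active, which is the behavior the abstract describes.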