Unsupervised Reservoir Computing for Solving Ordinary Differential Equations

Citation:

Marios Mattheakis, Hayden Joy, and Pavlos Protopapas. Forthcoming. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” International Journal on Artificial Intelligence Tools. Copy at http://tinyurl.com/y8vrlkgb

Abstract:

There is a wave of interest in using physics-informed neural networks for solving differential equations. Most existing methods are based on feed-forward networks, while recurrent neural network solvers have not been extensively explored. We introduce a reservoir computing (RC) architecture, an echo-state recurrent neural network capable of discovering approximate solutions that satisfy ordinary differential equations (ODEs). We suggest an approach to calculate time derivatives of recurrent neural network outputs without using back-propagation. The internal weights of an RC are fixed, while only a linear output layer is trained, yielding efficient training. However, RC performance strongly depends on finding the optimal hyper-parameters, which is a computationally expensive process. We use Bayesian optimization to discover optimal sets in a high-dimensional hyper-parameter space efficiently and numerically show that one set is robust and can be transferred to solve an ODE for different initial conditions and time ranges. A closed-form formula for the optimal output weights is derived to solve first-order linear equations in a one-shot backpropagation-free learning process. We extend the RC approach by solving nonlinear systems of ODEs using a hybrid optimization method consisting of gradient descent and Bayesian optimization. Evaluation of linear and nonlinear systems of equations demonstrates the efficiency of the RC ODE solver.

Last updated on 06/23/2022