# Publications by Year: Submitted


Shaan Desai, Marios Mattheakis, Hayden Joy, Pavlos Protopapas, and Stephen Roberts. Submitted. “One-Shot Transfer Learning of Physics-Informed Neural Networks.” 2110.11286.pdf

Solving differential equations efficiently and accurately sits at the heart of progress in many areas of scientific research, from classical dynamical systems to quantum mechanics. There is a surge of interest in using Physics-Informed Neural Networks (PINNs) to tackle such problems as they provide numerous benefits over traditional numerical approaches. Despite their potential benefits for solving differential equations, transfer learning of PINNs has been underexplored. In this study, we present a general framework for transfer learning PINNs that results in one-shot inference for linear systems of both ordinary and partial differential equations. This means that highly accurate solutions to many unknown differential equations can be obtained instantaneously without retraining an entire network. We demonstrate the efficacy of the proposed deep learning approach by solving several real-world problems, such as first- and second-order linear ordinary differential equations, the Poisson equation, and the time-dependent Schrödinger equation, a complex-valued partial differential equation.
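The closed-form step behind such one-shot inference can be sketched in a toy setting. In the sketch below, a random tanh feature map stands in for the pretrained PINN body (purely an illustrative assumption, not the paper's trained network), and the output weights for a new linear ODE u′ + ku = f with u(0) = u0 follow from a single least-squares solve:

```python
import numpy as np

# Toy one-shot inference for the linear ODE u'(t) + k*u = f(t), u(0) = u0.
# A random tanh layer stands in for a (hypothetical) pretrained PINN body;
# only the linear output weights w are computed, in closed form, per equation.
rng = np.random.default_rng(0)
n_feat = 200
t = np.linspace(0.0, 2.0, 400)
A, b = rng.normal(size=n_feat), rng.normal(size=n_feat)

H = np.tanh(np.outer(t, A) + b)      # hidden features H(t)
dH = (1.0 - H**2) * A                # exact derivative of tanh features

k, u0 = 2.0, 1.0
f = np.zeros_like(t)                 # homogeneous case: u' + 2u = 0, u(0) = 1

# Least squares for w: ODE residual (dH + k*H) w - f at collocation points,
# plus a heavily weighted row enforcing the initial condition H(0) w = u0.
M = np.vstack([dH + k * H, 1e3 * H[:1]])
rhs = np.concatenate([f, [1e3 * u0]])
w, *_ = np.linalg.lstsq(M, rhs, rcond=None)

u = H @ w                            # approximate solution
err = np.max(np.abs(u - u0 * np.exp(-k * t)))
print(err)
```

Because the equation is linear, the residual is linear in the output weights, which is exactly what makes a retraining-free, single-solve inference possible.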

Marios Mattheakis, David Sondak, Akshunna S. Dogra, and Pavlos Protopapas. Submitted. “Hamiltonian neural networks for solving differential equations.” mattheakis_hnns.pdf

There has been a wave of interest in applying machine learning to study dynamical systems. We present a Hamiltonian neural network that solves the differential equations that govern dynamical systems. This is an equation-driven machine learning method where the optimization process of the network depends solely on the predicted functions without using any ground truth data. The model learns solutions that satisfy, up to an arbitrarily small error, Hamilton's equations and, therefore, conserve the Hamiltonian invariants. The choice of an appropriate activation function drastically improves the predictability of the network. Moreover, an error analysis is derived, showing that the numerical errors depend on the overall network performance. The Hamiltonian network is then employed to solve the equations for the nonlinear oscillator and the chaotic Hénon-Heiles dynamical system. In both systems, a symplectic Euler integrator requires two orders of magnitude more evaluation points than the Hamiltonian network in order to achieve the same order of numerical error in the predicted phase-space trajectories.
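For context on the baseline in that comparison, a minimal symplectic (semi-implicit) Euler integrator for a nonlinear oscillator can be sketched as follows; the specific Hamiltonian H = p²/2 + q²/2 + q⁴/4 and step size here are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

# Symplectic (semi-implicit) Euler for an illustrative nonlinear oscillator
# H(q, p) = p^2/2 + q^2/2 + q^4/4. Symplectic integrators keep the energy
# error bounded rather than letting it grow secularly.
def symplectic_euler(q, p, dt, steps):
    traj = []
    for _ in range(steps):
        p = p - dt * (q + q**3)   # p update uses dH/dq at the current q
        q = q + dt * p            # q update uses the already-updated p
        traj.append((q, p))
    return np.array(traj)

def H(q, p):
    return 0.5 * p**2 + 0.5 * q**2 + 0.25 * q**4

traj = symplectic_euler(q=1.0, p=0.0, dt=1e-3, steps=20000)
E = H(traj[:, 0], traj[:, 1])
drift = np.max(np.abs(E - E[0]))
print(drift)  # bounded energy drift, no secular growth
```

The accuracy of such an integrator is tied to its step size, which is why matching a given phase-space error can require far more evaluation points than a trained network needs.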

Marios Mattheakis, Hayden Joy, and Pavlos Protopapas. Submitted. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” 2108.11417.pdf

There is a wave of interest in using unsupervised neural networks for solving differential equations. The existing methods are based on feed-forward networks, while recurrent neural network differential equation solvers have not yet been reported. We introduce unsupervised reservoir computing (RC), an echo-state recurrent neural network capable of discovering approximate solutions that satisfy ordinary differential equations (ODEs). We suggest an approach to calculate time derivatives of recurrent neural network outputs without using backpropagation. The internal weights of an RC are fixed, while only a linear output layer is trained, yielding efficient training. However, RC performance strongly depends on finding the optimal hyper-parameters, which is a computationally expensive process. We use Bayesian optimization to efficiently discover optimal sets in a high-dimensional hyper-parameter space and numerically show that one set is robust and can be used to solve an ODE for different initial conditions and time ranges. A closed-form formula for the optimal output weights is derived to solve first-order linear equations in a backpropagation-free learning process. We extend the RC approach by solving nonlinear systems of ODEs using a hybrid optimization method consisting of gradient descent and Bayesian optimization. Evaluation of linear and nonlinear systems of equations demonstrates the efficiency of the RC ODE solver.
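The backpropagation-free, closed-form idea can be sketched in a toy setting. Everything below is an illustrative assumption rather than the paper's setup: an untuned echo-state reservoir driven by time itself, finite-difference derivatives of the reservoir states in place of backpropagation, and the simple test equation u′ = −u with u(0) = 1:

```python
import numpy as np

# Toy unsupervised RC solver for u'(t) = -u(t), u(0) = 1 on t in [0, 2].
# A fixed echo-state reservoir supplies features r(t); only the linear
# readout w is computed in closed form from the ODE residual. The spectral
# radius (0.9) and leak rate (0.5) are illustrative, not tuned values.
rng = np.random.default_rng(1)
N = 150
W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius 0.9
w_in, b = rng.normal(size=N), rng.normal(size=N)

t_all = np.linspace(-0.5, 2.0, 1251)                  # first 0.5 is washout
dt = t_all[1] - t_all[0]
r_all = np.zeros((t_all.size, N))
for n in range(1, t_all.size):                        # leaky echo-state update
    r_all[n] = 0.5 * r_all[n - 1] + 0.5 * np.tanh(
        W @ r_all[n - 1] + w_in * t_all[n] + b)

keep = t_all >= 0.0                                   # discard the transient
t, r = t_all[keep], r_all[keep]
dr = np.gradient(r_all, dt, axis=0)[keep]             # backprop-free derivatives

# Closed-form readout: ODE residual (dr + r) w = 0 at collocation points,
# plus a heavily weighted row enforcing the initial condition r(0) w = u0.
u0 = 1.0
M = np.vstack([dr + r, 1e3 * r[:1]])
rhs = np.concatenate([np.zeros(t.size), [1e3 * u0]])
w, *_ = np.linalg.lstsq(M, rhs, rcond=None)

u = r @ w                                             # approximate solution
err = np.max(np.abs(u - np.exp(-t)))
print(err)
```

As in the feed-forward case, linearity of the equation makes the residual linear in the readout weights, so a single least-squares solve replaces iterative training; only nonlinear systems would require the hybrid gradient-based optimization the abstract describes.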


