%0 Journal Article %J Phys. Rev. E %D 2022 %T Hamiltonian neural networks for solving equations of motion %A Marios Mattheakis %A David Sondak %A Akshunna S. Dogra %A Pavlos Protopapas %X

There has been a wave of interest in applying machine learning to study dynamical systems. We present a Hamiltonian neural network that solves the differential equations that govern dynamical systems. This is an equation-driven machine learning method in which the optimization process of the network depends solely on the predicted functions, without using any ground-truth data. The model learns solutions that satisfy, up to an arbitrarily small error, Hamilton's equations and, therefore, conserve the Hamiltonian invariants. The choice of an appropriate activation function drastically improves the predictability of the network. Moreover, an error analysis is derived, showing that the numerical errors depend on the overall network performance. The Hamiltonian network is then employed to solve the equations for the nonlinear oscillator and the chaotic Hénon-Heiles dynamical system. In both systems, a symplectic Euler integrator requires two orders of magnitude more evaluation points than the Hamiltonian network to achieve the same order of numerical error in the predicted phase-space trajectories.
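The symplectic Euler baseline mentioned in this abstract can be sketched in a few lines. The quartic potential below is an assumed example of a nonlinear oscillator, not necessarily the one used in the paper:

```python
def symplectic_euler(q, p, dt, steps, grad_V):
    """Symplectic Euler for H = p**2/2 + V(q): kick p first, then drift q."""
    traj = [(q, p)]
    for _ in range(steps):
        p = p - dt * grad_V(q)   # p_{n+1} = p_n - dt * V'(q_n)
        q = q + dt * p           # q_{n+1} = q_n + dt * p_{n+1}
        traj.append((q, p))
    return traj

# Assumed nonlinear oscillator: V(q) = q**2/2 + q**4/4 (Duffing-type)
grad_V = lambda q: q + q**3
H = lambda q, p: 0.5 * p**2 + 0.5 * q**2 + 0.25 * q**4

traj = symplectic_euler(1.0, 0.0, 1e-3, 10_000, grad_V)
drift = abs(H(*traj[-1]) - H(*traj[0]))   # stays bounded for a symplectic scheme
```

Because the scheme is symplectic, the energy error stays bounded rather than growing; the paper's comparison concerns how many evaluation points such an integrator needs to match the neural network's phase-space accuracy.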

%B Phys. Rev. E %V 105 %P 065305 %G eng %U https://journals.aps.org/pre/abstract/10.1103/PhysRevE.105.065305 %0 Conference Paper %B ICML, workshop AI4Science %D 2022 %T One-Shot Transfer Learning of Physics-Informed Neural Networks %A Shaan Desai %A Marios Mattheakis %A Hayden Joy %A Pavlos Protopapas %A Stephen Roberts %X Solving differential equations efficiently and accurately sits at the heart of progress in many areas of scientific research, from classical dynamical systems to quantum mechanics. There is a surge of interest in using Physics-Informed Neural Networks (PINNs) to tackle such problems, as they provide numerous benefits over traditional numerical approaches. Despite their potential benefits for solving differential equations, transfer learning of PINNs has been underexplored. In this study, we present a general framework for transfer learning PINNs that results in one-shot inference for linear systems of both ordinary and partial differential equations. This means that highly accurate solutions to many unknown differential equations can be obtained instantaneously without retraining an entire network. We demonstrate the efficacy of the proposed deep learning approach by solving several real-world problems, such as first- and second-order linear ordinary differential equations, the Poisson equation, and the time-dependent Schrödinger equation, a complex-valued partial differential equation. %B ICML, workshop AI4Science %G eng %U https://arxiv.org/abs/2110.11286 %0 Journal Article %J Chaos, Solitons and Fractals %D 2022 %T Modeling the effect of the vaccination campaign on the Covid-19 pandemic %A Mattia Angeli %A Georgios Neofotistos %A Marios Mattheakis %A Efthimios Kaxiras %X

Population-wide vaccination is critical for containing the SARS-CoV-2 (Covid-19) pandemic when combined with restrictive and prevention measures. In this study we introduce SAIVR, a mathematical model able to forecast the Covid-19 epidemic evolution during the vaccination campaign. SAIVR extends the widely used Susceptible-Infectious-Removed (SIR) model by considering the Asymptomatic (A) and Vaccinated (V) compartments. The model contains several parameters and initial conditions that are estimated by employing a semi-supervised machine learning procedure. After training an unsupervised neural network to solve the SAIVR differential equations, a supervised framework then estimates the optimal conditions and parameters that best fit recent infectious curves of 27 countries. Instructed by these results, we performed an extensive study on the temporal evolution of the pandemic under varying values of roll-out daily rates, vaccine efficacy, and a broad range of societal vaccine hesitancy/denial levels. The concept of herd immunity is questioned by studying future scenarios which involve different vaccination efforts and more infectious Covid-19 variants.
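A compartment model of this kind can be sketched numerically as follows. The coupling structure and all rates below are illustrative assumptions for an SIR-type model with added Asymptomatic and Vaccinated compartments, not the paper's fitted SAIVR equations or parameters:

```python
def saivr_step(s, a, i, v, r, beta, beta_a, alpha, gamma, gamma_a, nu, dt):
    """One forward-Euler step of an SIR-style model with added Asymptomatic
    (A) and Vaccinated (V) compartments (normalized population)."""
    new_inf = beta * s * i + beta_a * s * a   # infections caused by I and A
    ds = -new_inf - nu * s                    # infected or vaccinated
    da = alpha * new_inf - gamma_a * a        # fraction alpha is asymptomatic
    di = (1 - alpha) * new_inf - gamma * i    # the rest develop symptoms
    dv = nu * s                               # vaccine roll-out at daily rate nu
    dr = gamma * i + gamma_a * a              # recovery / removal
    return (s + dt * ds, a + dt * da, i + dt * di, v + dt * dv, r + dt * dr)

state = (0.99, 0.0, 0.01, 0.0, 0.0)           # S, A, I, V, R
for _ in range(2000):
    state = saivr_step(*state, beta=0.3, beta_a=0.15, alpha=0.4,
                       gamma=0.1, gamma_a=0.1, nu=0.005, dt=0.1)
total = sum(state)                            # conserved by construction
```

The five compartments sum to the normalized total population at every step, a useful sanity check for any extension of SIR.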

%B Chaos, Solitons and Fractals %V 154 %P 111621 %8 2021 %G eng %U https://www.sciencedirect.com/science/article/pii/S0960077921009759?via%3Dihub %0 Conference Paper %B IJCNN at IEEE World Congress on Computational Intelligence %D 2022 %T Encoding Involutory Invariance in Neural Networks %A Anwesh Bhattacharya %A Marios Mattheakis %A Pavlos Protopapas %X

In certain situations, Neural Networks (NNs) are trained on data that obey underlying physical symmetries. However, it is not guaranteed that NNs will obey these symmetries unless they are embedded in the network structure. In this work, we explore a special kind of symmetry in which functions are invariant with respect to involutory linear/affine transformations up to parity p = ±1. We develop mathematical theorems and propose NN architectures that ensure invariance and universal approximation properties. Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry. An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry is also proposed.
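The invariance described here can be obtained by explicit symmetrization: averaging a function with its transformed copy guarantees f(T(x)) = p·f(x) for any involution T with parity p = ±1. This is a generic sketch of the idea, not necessarily the architecture proposed in the paper:

```python
def symmetrize(g, T, p):
    """Wrap an arbitrary function g so the result satisfies f(T(x)) = p * f(x)
    for an involution T (T(T(x)) == x) and parity p = +1 or -1.
    Proof sketch: f(T(x)) = (g(T(x)) + p*g(x))/2 = p*(g(x) + p*g(T(x)))/2
    since p**2 == 1."""
    return lambda x: 0.5 * (g(x) + p * g(T(x)))

g = lambda x: x**3 + 2 * x + 1      # stand-in for a trained network
T = lambda x: -x                    # reflection: an involutory map
f_even = symmetrize(g, T, +1)       # even: f(-x) == f(x)
f_odd = symmetrize(g, T, -1)        # odd:  f(-x) == -f(x)
```

Here, for example, f_odd keeps only the odd part of g (x**3 + 2x), while f_even keeps only the even part (the constant 1).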

%B IJCNN at IEEE World Congress on Computational Intelligence %G eng %U https://arxiv.org/pdf/2106.12891.pdf %0 Conference Paper %B NeurIPS Workshop on Machine Learning and Physical Sciences %D 2022 %T First principles physics-informed neural network for quantum wavefunctions and eigenvalue surfaces %A Marios Mattheakis %A Gabriel R. Schleder %A Larson, Daniel T. %A Efthimios Kaxiras %X

Physics-informed neural networks have been widely applied to learn general parametric solutions of differential equations. Here, we propose a neural network to discover parametric eigenvalue and eigenfunction surfaces of quantum systems. We apply our method to the hydrogen molecular ion. This is an ab initio deep learning method that solves the Schrödinger equation with the Coulomb potential, yielding realistic wavefunctions that include a cusp at the ion positions. The neural solutions are continuous and differentiable functions of the interatomic distance, and their derivatives are analytically calculated by applying automatic differentiation. Such a parametric and analytical form of the solutions is useful for further calculations, such as the determination of force fields.
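The cusp condition mentioned in the abstract can be illustrated with the textbook LCAO ansatz for H2+, which, like the neural solution, is an analytic function of the coordinates and the interatomic distance R. This ansatz is a stand-in for illustration, not the paper's network:

```python
import math

def lcao_psi(x, y, z, R):
    """Unnormalized LCAO trial wavefunction for H2+ (atomic units): a
    superposition of 1s orbitals centered on two protons at z = +/- R/2."""
    r1 = math.sqrt(x * x + y * y + (z - R / 2) ** 2)
    r2 = math.sqrt(x * x + y * y + (z + R / 2) ** 2)
    return math.exp(-r1) + math.exp(-r2)

# The cusp: the axial derivative of psi jumps across a nucleus; for a 1s
# orbital of unit amplitude the jump is 2 (Kato cusp condition with Z = 1).
R, eps = 2.0, 1e-4
left = (lcao_psi(0, 0, R / 2, R) - lcao_psi(0, 0, R / 2 - eps, R)) / eps
right = (lcao_psi(0, 0, R / 2 + eps, R) - lcao_psi(0, 0, R / 2, R)) / eps
```

A smooth network activation cannot produce this derivative jump on its own, which is why wavefunctions with correct cusps are a nontrivial feature of the solution.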

%B NeurIPS Workshop on Machine Learning and Physical Sciences %I https://arxiv.org/pdf/2211.04607.pdf %8 2022 %G eng %U https://ml4physicalsciences.github.io/2022/files/NeurIPS_ML4PS_2022_71.pdf %0 Conference Paper %B NeurIPS Workshop on Machine Learning and Physical Sciences %D 2022 %T HubbardNet: Efficient Predictions of the Bose-Hubbard Model Spectrum with Deep Neural Networks %A Zhu, Ziyan %A Marios Mattheakis %A Weiwei Pan %A Efthimios Kaxiras %X

We present a deep neural network (DNN)-based model, the HubbardNet, to variationally solve for the ground-state and excited-state wavefunctions of the one-dimensional and two-dimensional Bose-Hubbard model on a square lattice. Using this model, we obtain the Bose-Hubbard energy spectrum as an analytic function of the Coulomb parameter, U, and the total number of particles, N, from a single training, bypassing the need to solve a new Hamiltonian for each different input. We show that the DNN-parametrized solutions are in excellent agreement with exact diagonalization while outperforming it in computational scaling, suggesting that our model is promising for efficient, accurate computation of exact phase diagrams of many-body lattice Hamiltonians.
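For scale, the exact-diagonalization baseline the abstract refers to looks like this for the smallest nontrivial case, a 2-site Bose-Hubbard model with N = 2 bosons; the point of a model like HubbardNet is to learn the spectrum as a smooth function of U and N instead of re-diagonalizing for every input:

```python
import numpy as np

def bose_hubbard_2site(U, t_hop=1.0):
    """Exact Hamiltonian of the 2-site Bose-Hubbard model with N = 2 bosons
    in the occupation basis |2,0>, |1,1>, |0,2>:
    H = -t (b1^dag b2 + h.c.) + (U/2) sum_i n_i (n_i - 1).
    The off-diagonal sqrt(2) factors come from the bosonic ladder operators."""
    s = np.sqrt(2.0) * t_hop
    return np.array([[U,   -s,  0.0],
                     [-s,  0.0, -s],
                     [0.0, -s,  U]])

# Spectrum as a function of the Coulomb parameter U: one diagonalization each
energies = {U: np.linalg.eigvalsh(bose_hubbard_2site(U)) for U in (0.0, 4.0)}
```

The basis size grows combinatorially with sites and particles, which is exactly the scaling bottleneck a single trained surrogate avoids.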

%B NeurIPS Workshop on Machine Learning and Physical Sciences %8 2022 %G eng %U https://ml4physicalsciences.github.io/2022/files/NeurIPS_ML4PS_2022_87.pdf %0 Conference Paper %B IJCNN at IEEE World Congress on Computational Intelligence %D 2022 %T Physics-Informed Neural Networks for Quantum Eigenvalue Problems %A Henry Jin %A Marios Mattheakis %A Pavlos Protopapas %X Eigenvalue problems are critical to several fields of science and engineering. We expand on the method of using unsupervised neural networks for discovering eigenfunctions and eigenvalues for differential eigenvalue problems. The obtained solutions are given in an analytical and differentiable form that identically satisfies the desired boundary conditions. The network optimization is data-free and depends solely on the predictions of the neural network. We introduce two physics-informed loss functions. The first, called ortho-loss, motivates the network to discover pair-wise orthogonal eigenfunctions. The second loss term, called norm-loss, requests the discovery of normalized eigenfunctions and is used to avoid trivial solutions. We find that embedding even or odd symmetries into the neural network architecture further improves the convergence for relevant problems. Lastly, a patience condition can be used to automatically recognize eigenfunction solutions. This proposed unsupervised learning method is used to solve the finite well, multiple finite wells, and hydrogen atom eigenvalue quantum problems. %B IJCNN at IEEE World Congress on Computational Intelligence %G eng %U https://arxiv.org/abs/2203.00451 %0 Journal Article %J arXiv paper %D 2022 %T RcTorch: a PyTorch Reservoir Computing Package with Automated Hyper-Parameter Optimization %A Hayden Joy %A Marios Mattheakis %A Pavlos Protopapas %X

Reservoir computers (RCs) are among the fastest neural networks to train, especially when compared to other recurrent neural networks, while still handling sequential data exceptionally well. However, RC adoption has lagged behind that of other neural network models because of the model's sensitivity to its hyper-parameters (HPs). A modern unified software package that automatically tunes these parameters is missing from the literature. Manually tuning these parameters is very difficult, and the cost of traditional grid-search methods grows exponentially with the number of HPs considered, discouraging the use of RCs and limiting the complexity of the RC models that can be devised. We address these problems by introducing RcTorch, a PyTorch-based RC neural network package with automated HP tuning. Herein, we demonstrate the utility of RcTorch by using it to predict the complex dynamics of a driven pendulum being acted upon by varying forces. This work includes coding examples. Example Python Jupyter notebooks can be found on our GitHub repository https://github.com/blindedjoy/RcTorch and documentation can be found at https://rctorch.readthedocs.io/.
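A bare-bones echo state network in NumPy makes the abstract's point concrete: the reservoir size, spectral radius, and ridge coefficient below are exactly the kind of hyper-parameters whose manual tuning a package like RcTorch is meant to replace. This sketch is generic reservoir computing, not RcTorch's API:

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, spectral_radius, ridge = 100, 0.9, 1e-6   # HPs a tuner would search over

# Fixed random reservoir, rescaled to the target spectral radius; only the
# linear readout is ever trained, which is why RCs train so fast.
W = rng.standard_normal((n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal(n_res)

def run_reservoir(u):
    """Drive the reservoir with a 1-D input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Train the linear readout to predict the next value of a sine wave
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X, Y = run_reservoir(u)[50:], y[50:]              # discard the initial transient
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
mse = float(np.mean((X @ W_out - Y) ** 2))
```

Small changes to the spectral radius or ridge coefficient can change the fit quality dramatically, which is the sensitivity the abstract describes.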

%B arXiv paper %G eng %U https://arxiv.org/pdf/2207.05870.pdf %0 Conference Paper %B NeurIPS Workshop on Machine Learning and Physical Sciences %D 2022 %T Transfer Learning with Physics-Informed Neural Networks for Efficient Simulation of Branched Flows %A Raphael Pellegrin %A Blake Bullwinkel %A Marios Mattheakis %A Pavlos Protopapas %X Physics-Informed Neural Networks (PINNs) offer a promising approach to solving differential equations and, more generally, to applying deep learning to problems in the physical sciences. We adopt a recently developed transfer learning approach for PINNs and introduce a multi-head model to efficiently obtain accurate solutions to nonlinear systems of ordinary differential equations with random potentials. In particular, we apply the method to simulate stochastic branched flows, a universal phenomenon in random wave dynamics. Finally, we compare the results achieved by feed-forward and GAN-based PINNs on two physically relevant transfer learning tasks and show that our methods provide significant computational speedups in comparison to standard PINNs trained from scratch. %B NeurIPS Workshop on Machine Learning and Physical Sciences %8 2022 %G eng %U https://ml4physicalsciences.github.io/2022/files/NeurIPS_ML4PS_2022_181.pdf
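The multi-head idea in this last abstract can be sketched with a frozen shared feature map and per-task linear heads fitted in closed form. The random-feature "body" and toy target functions below are illustrative assumptions, not the paper's PINN trunk or random potentials:

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen shared "body": random tanh features stand in for a trained trunk
W_in = rng.standard_normal(64)
b = rng.standard_normal(64)
features = lambda x: np.tanh(np.outer(x, W_in) + b)   # shape (n_points, 64)

def fit_head(x, y, ridge=1e-8):
    """Fit one linear head on the frozen body in closed form (ridge
    regression): the cheap step that replaces retraining per task."""
    H = features(x)
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)

x = np.linspace(-1.0, 1.0, 200)
tasks = {"sin": np.sin(np.pi * x), "quad": x ** 2}    # two toy "tasks"
heads = {name: fit_head(x, y) for name, y in tasks.items()}
errs = {name: float(np.max(np.abs(features(x) @ heads[name] - tasks[name])))
        for name in tasks}
```

Each new task costs one small linear solve instead of a full training run, which is where the reported speedups over PINNs trained from scratch come from.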