Shaan Desai, Marios Mattheakis, Hayden Joy, Pavlos Protopapas, and Stephen Roberts. 6/10/2022. “One-Shot Transfer Learning of Physics-Informed Neural Networks.” In ICML, workshop AI4Science.
Publisher's Version
Abstract: Solving differential equations efficiently and accurately sits at the heart of progress in many areas of scientific research, from classical dynamical systems to quantum mechanics. There is a surge of interest in using Physics-Informed Neural Networks (PINNs) to tackle such problems, as they provide numerous benefits over traditional numerical approaches. Despite these potential benefits, transfer learning of PINNs has been underexplored. In this study, we present a general framework for transfer learning PINNs that results in one-shot inference for linear systems of both ordinary and partial differential equations. This means that highly accurate solutions to many unknown differential equations can be obtained instantaneously without retraining an entire network. We demonstrate the efficacy of the proposed deep learning approach by solving several real-world problems, such as first- and second-order linear ordinary differential equations, the Poisson equation, and the time-dependent Schrödinger equation, a complex-valued partial differential equation.
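The one-shot step can be illustrated with a minimal sketch: for a linear ODE, once the hidden layers are frozen, the network output is linear in the last-layer weights, so the equation residual and initial condition reduce to a least-squares problem with a closed-form solution. The random tanh features below stand in for a trained hidden layer, and the example ODE, collocation grid, and penalty weight are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen hidden layer: fixed random tanh features.
n_feat = 50
a = rng.normal(0.0, 2.0, n_feat)      # assumed frozen input weights
b = rng.normal(0.0, 1.0, n_feat)      # assumed frozen biases

def features(x):
    """Hidden activations H(x) and their analytic derivatives dH/dx."""
    h = np.tanh(np.outer(x, a) + b)
    dh = a * (1.0 - h**2)             # d/dx tanh(a*x + b) = a * (1 - tanh^2)
    return h, dh

# Toy linear ODE: u'(x) + u(x) = 0 with u(0) = 1 (exact solution: exp(-x)).
x = np.linspace(0.0, 2.0, 100)
h, dh = features(x)
h0, _ = features(np.array([0.0]))

# One-shot step: the residual equations (dH + H) w = 0 plus a weighted
# initial-condition row H(0) w = 1 form a linear least-squares system.
lam = 10.0                            # initial-condition weight (illustrative)
A = np.vstack([dh + h, lam * h0])
y = np.concatenate([np.zeros(len(x)), [lam * 1.0]])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

u = h @ w                             # instant solution, no gradient descent
```

Because only a linear solve is involved, swapping in a new linear ODE (new residual rows) reuses the frozen features at negligible cost, which is the sense in which inference is "one-shot."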
Anwesh Bhattacharya, Marios Mattheakis, and Pavlos Protopapas. 2022. “Encoding Involutory Invariance in Neural Networks.” In IJCNN at IEEE World Congress on Computational Intelligence.
Publisher's Version
Abstract: In certain situations, Neural Networks (NNs) are trained on data that obey underlying physical symmetries. However, an NN is not guaranteed to obey such a symmetry unless it is embedded in the network structure. In this work, we explore a special kind of symmetry in which functions are invariant with respect to involutory linear/affine transformations up to parity p = ±1. We develop mathematical theorems and propose NN architectures that ensure invariance and universal approximation properties. Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry. We also propose an adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry.
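The invariance in question can be enforced exactly by a simple symmetrization of an arbitrary base network: if A is involutory (A² = I) and p = ±1, then f(x) = ½(g(x) + p·g(Ax)) satisfies f(Ax) = p·f(x) by construction. The sketch below shows this for a coordinate-swap reflection; the tiny random MLP g is a hypothetical stand-in, and this symmetrization is one natural construction rather than necessarily the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Involutory transform: coordinate swap, so A @ A = I.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Hypothetical base network g: a tiny fixed random MLP.
W1 = rng.normal(size=(2, 8)); b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 1))

def g(x):
    return np.tanh(x @ W1 + b1) @ W2

def f(x, p=1):
    """Symmetrized network: f(x @ A.T) == p * f(x) for parity p = +1 or -1."""
    assert p in (1, -1)
    return 0.5 * (g(x) + p * g(x @ A.T))
```

The identity follows directly: f(Ax) = ½(g(Ax) + p·g(A²x)) = ½(g(Ax) + p·g(x)) = p·f(x), using A² = I and p² = 1.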
Marios Mattheakis, Gabriel R. Schleder, Daniel T. Larson, and Efthimios Kaxiras. 2022. “First principles physics-informed neural network for quantum wavefunctions and eigenvalue surfaces.” In NeurIPS Workshop on Machine Learning and Physical Sciences. https://arxiv.org/pdf/2211.04607.pdf.
Publisher's Version
Abstract: Physics-informed neural networks have been widely applied to learn general parametric solutions of differential equations. Here, we propose a neural network to discover parametric eigenvalue and eigenfunction surfaces of quantum systems. We apply our method to solve the hydrogen molecular ion. This is an ab initio deep learning method that solves the Schrödinger equation with the Coulomb potential, yielding realistic wavefunctions that include a cusp at the ion positions. The neural solutions are continuous and differentiable functions of the interatomic distance, and their derivatives are analytically calculated by applying automatic differentiation. Such a parametric and analytical form of the solutions is useful for further calculations, such as the determination of force fields.
Ziyan Zhu, Marios Mattheakis, Weiwei Pan, and Efthimios Kaxiras. 2022. “HubbardNet: Efficient Predictions of the Bose-Hubbard Model Spectrum with Deep Neural Networks.” In NeurIPS Workshop on Machine Learning and Physical Sciences.
Publisher's Version
Abstract: We present a deep neural network (DNN)-based model, HubbardNet, to variationally solve for the ground-state and excited-state wavefunctions of the one-dimensional and two-dimensional Bose-Hubbard model on a square lattice. Using this model, we obtain the Bose-Hubbard energy spectrum as an analytic function of the Coulomb parameter, U, and the total number of particles, N, from a single training, bypassing the need to solve a new Hamiltonian for each different input. We show that the DNN-parametrized solutions are in excellent agreement with exact diagonalization while outperforming it in terms of computational scaling, suggesting that our model is promising for the efficient, accurate computation of exact phase diagrams of many-body lattice Hamiltonians.
Henry Jin, Marios Mattheakis, and Pavlos Protopapas. 2022. “Physics-Informed Neural Networks for Quantum Eigenvalue Problems.” In IJCNN at IEEE World Congress on Computational Intelligence.
Publisher's Version
Abstract: Eigenvalue problems are critical to several fields of science and engineering. We expand on the method of using unsupervised neural networks to discover eigenfunctions and eigenvalues of differential eigenvalue problems. The obtained solutions are given in an analytical and differentiable form that identically satisfies the desired boundary conditions. The network optimization is data-free and depends solely on the predictions of the neural network. We introduce two physics-informed loss functions. The first, called ortho-loss, motivates the network to discover pairwise orthogonal eigenfunctions. The second loss term, called norm-loss, requires the discovery of normalized eigenfunctions and is used to avoid trivial solutions. We find that embedding even or odd symmetries into the neural network architecture further improves convergence for relevant problems. Lastly, a patience condition can be used to automatically recognize eigenfunction solutions. The proposed unsupervised learning method is used to solve the finite-well, multiple-finite-wells, and hydrogen-atom quantum eigenvalue problems.
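The two loss terms can be sketched directly from their descriptions: ortho-loss penalizes the off-diagonal entries of the Gram matrix of pairwise inner products, and norm-loss drives each ⟨ψᵢ|ψᵢ⟩ toward one so the trivial zero function is excluded. A schematic numpy version on a sampled grid (the discretization and function names are illustrative, not the paper's code):

```python
import numpy as np

def ortho_loss(psi, dx):
    """Penalize pairwise overlaps <psi_i | psi_j> for i != j.
    psi: (n_funcs, n_points) array of sampled candidate eigenfunctions."""
    G = (psi * dx) @ psi.T            # Gram matrix of discretized inner products
    off = G - np.diag(np.diag(G))     # keep only the i != j overlaps
    return np.sum(off**2)

def norm_loss(psi, dx):
    """Drive each <psi_i | psi_i> toward 1 to avoid the trivial solution."""
    norms = np.sum(psi**2 * dx, axis=1)
    return np.sum((norms - 1.0)**2)
```

For an already orthonormal set, e.g. the normalized sine modes √(2/π)·sin(nx) on [0, π], both terms vanish (up to discretization error), so the penalties only activate when the network's candidate functions overlap or collapse.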
Raphael Pellegrin, Blake Bullwinkel, Marios Mattheakis, and Pavlos Protopapas. 2022. “Transfer Learning with Physics-Informed Neural Networks for Efficient Simulation of Branched Flows.” In NeurIPS Workshop on Machine Learning and Physical Sciences.
Publisher's Version
Abstract: Physics-Informed Neural Networks (PINNs) offer a promising approach to solving differential equations and, more generally, to applying deep learning to problems in the physical sciences. We adopt a recently developed transfer learning approach for PINNs and introduce a multi-head model to efficiently obtain accurate solutions to nonlinear systems of ordinary differential equations with random potentials. In particular, we apply the method to simulate stochastic branched flows, a universal phenomenon in random wave dynamics. Finally, we compare the results achieved by feedforward and GAN-based PINNs on two physically relevant transfer learning tasks and show that our methods provide significant computational speedups over standard PINNs trained from scratch.
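The multi-head idea can be sketched schematically: a shared trunk provides features learned once, and transferring to a new random potential amounts to fitting only a small head on top of the frozen trunk. Below, fixed random tanh features stand in for a trained trunk and least-squares fits stand in for head training; the feature scales and target functions are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Frozen shared trunk (stand-in for a trained body): random tanh features.
a = rng.normal(0.0, 8.0, 32)          # assumed fixed trunk weights
b = rng.uniform(-8.0, 8.0, 32)        # assumed fixed trunk biases

def trunk(t):
    """Shared feature map, (n_points, 32), reused across all heads."""
    return np.tanh(np.outer(t, a) + b)

def fit_head(t, target):
    """Transfer step: fit only a linear head on the frozen trunk."""
    H = trunk(t)
    w, *_ = np.linalg.lstsq(H, target, rcond=None)
    return w

t = np.linspace(0.0, 1.0, 200)
head_a = fit_head(t, np.sin(2 * np.pi * t))   # task A
head_b = fit_head(t, np.cos(2 * np.pi * t))   # task B: new head, same trunk
```

The computational saving comes from the head being orders of magnitude smaller than the trunk, so adding a task costs a small fit rather than full retraining from scratch.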