Publications by Type: Conference Paper

2022
Shaan Desai, Marios Mattheakis, Hayden Joy, Pavlos Protopapas, and Stephen Roberts. 6/10/2022. “One-Shot Transfer Learning of Physics-Informed Neural Networks.” In ICML AI4Science Workshop.
Solving differential equations efficiently and accurately sits at the heart of progress in many areas of scientific research, from classical dynamical systems to quantum mechanics. There is a surge of interest in using Physics-Informed Neural Networks (PINNs) to tackle such problems, as they provide numerous benefits over traditional numerical approaches. Despite these potential benefits, transfer learning for PINNs remains underexplored. In this study, we present a general framework for transfer learning with PINNs that results in one-shot inference for linear systems of both ordinary and partial differential equations. This means that highly accurate solutions to many unknown differential equations can be obtained instantaneously without retraining an entire network. We demonstrate the efficacy of the proposed deep learning approach by solving several real-world problems, such as first- and second-order linear ordinary differential equations, the Poisson equation, and the time-dependent Schrödinger equation, a complex-valued partial differential equation.
2110.11286.pdf
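As an illustration of the one-shot idea summarized above, the sketch below (not the authors' released code) freezes the hidden layers of a pre-trained PINN and obtains the output weights for a new linear ODE, u'(t) + a·u(t) = f(t) with u(0) = u0, from a single least-squares solve; the stand-in feature network H, the example equation, and all parameter values are assumptions made for the demonstration.

```python
# Minimal sketch of one-shot transfer for a linear ODE u'(t) + a*u(t) = f(t),
# u(0) = u0, using frozen hidden features H(t). H is a stand-in for the
# pre-trained body of a PINN; in the paper's setting it would come from training.
import torch

torch.manual_seed(0)
t = torch.linspace(0.0, 2.0, 100, requires_grad=True).reshape(-1, 1)

H = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(),
                        torch.nn.Linear(64, 64), torch.nn.Tanh())
for p in H.parameters():
    p.requires_grad_(False)          # hidden layers stay frozen

h = H(t)                              # features h_k(t)
dh = torch.stack([torch.autograd.grad(h[:, k].sum(), t, retain_graph=True)[0].squeeze(-1)
                  for k in range(h.shape[1])], dim=1)   # dh_k/dt via autodiff

a, u0 = 2.0, 1.0                      # placeholder equation parameters
f = torch.sin(3.0 * t)

# With the ansatz u(t) = h(t) @ w, the residual (dh + a*h) @ w - f and the
# initial condition are linear in w, so w follows from one least-squares solve.
A = torch.cat([dh + a * h, H(torch.zeros(1, 1))], dim=0)
b = torch.cat([f, torch.full((1, 1), u0)], dim=0)
w = torch.linalg.lstsq(A.detach(), b.detach()).solution
u = h.detach() @ w                    # approximate solution, no retraining
```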
Anwesh Bhattacharya, Marios Mattheakis, and Pavlos Protopapas. 2022. “Encoding Involutory Invariance in Neural Networks.” In IJCNN at IEEE World Congress on Computational Intelligence.

In certain situations, Neural Networks (NNs) are trained on data that obey underlying physical symmetries. However, a NN is not guaranteed to obey the underlying symmetry unless it is embedded in the network structure. In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity p = ±1. We develop mathematical theorems and propose NN architectures that ensure invariance and universal approximation properties. Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry. We also propose an adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry.

2106.12891.pdf
Marios Mattheakis, Gabriel R. Schleder, Daniel T. Larson, and Efthimios Kaxiras. 2022. “First principles physics-informed neural network for quantum wavefunctions and eigenvalue surfaces.” In NeurIPS Workshop on Machine Learning and the Physical Sciences. https://arxiv.org/pdf/2211.04607.pdf.

Physics-informed neural networks have been widely applied to learn general parametric solutions of differential equations. Here, we propose a neural network to discover parametric eigenvalue and eigenfunction surfaces of quantum systems. We apply our method to solve the hydrogen molecular ion. This is an ab initio deep learning method that solves the Schrödinger equation with the Coulomb potential, yielding realistic wavefunctions that include a cusp at the ion positions. The neural solutions are continuous and differentiable functions of the interatomic distance and their derivatives are analytically calculated by applying automatic differentiation. Such a parametric and analytical form of the solutions is useful for further calculations such as the determination of force fields.

2211.04607.pdf
Ziyan Zhu, Marios Mattheakis, Weiwei Pan, and Efthimios Kaxiras. 2022. “HubbardNet: Efficient Predictions of the Bose-Hubbard Model Spectrum with Deep Neural Networks.” In NeurIPS Workshop on Machine Learning and the Physical Sciences.

We present a deep neural network (DNN)-based model, the HubbardNet, to variationally solve for the ground-state and excited-state wavefunctions of the one-dimensional and two-dimensional Bose-Hubbard model on a square lattice. Using this model, we obtain the Bose-Hubbard energy spectrum as an analytic function of the Coulomb parameter, U, and the total number of particles, N, from a single training, bypassing the need to solve a new Hamiltonian for each different input. We show that the DNN-parametrized solutions are in excellent agreement with exact diagonalization while outperforming it in terms of computational scaling, suggesting that our model is promising for the efficient, accurate computation of exact phase diagrams of many-body lattice Hamiltonians.

Henry Jin, Marios Mattheakis, and Pavlos Protopapas. 2022. “Physics-Informed Neural Networks for Quantum Eigenvalue Problems.” In IJCNN at IEEE World Congress on Computational Intelligence.
Eigenvalue problems are critical to several fields of science and engineering. We expand on the method of using unsupervised neural networks for discovering eigenfunctions and eigenvalues for differential eigenvalue problems. The obtained solutions are given in an analytical and differentiable form that identically satisfies the desired boundary conditions. The network optimization is data-free and depends solely on the predictions of the neural network. We introduce two physics-informed loss functions. The first, called ortho-loss, motivates the network to discover pairwise orthogonal eigenfunctions. The second, called norm-loss, encourages the discovery of normalized eigenfunctions and is used to avoid trivial solutions. We find that embedding even or odd symmetries into the neural network architecture further improves convergence for relevant problems. Lastly, a patience condition can be used to automatically recognize eigenfunction solutions. The proposed unsupervised learning method is used to solve the finite well, multiple finite wells, and hydrogen atom eigenvalue quantum problems.
2022_pinn_quantum.pdf
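Below is a minimal sketch of the two regularizing terms named in the abstract, written as they might appear in a PyTorch training loop; the grid, the trapezoidal quadrature, and the example profiles are assumptions made for illustration, not the paper's implementation.

```python
# Illustrative ortho-loss and norm-loss for eigenfunctions psi sampled on a grid x.
import torch

def norm_loss(psi, x):
    # Drives the quadrature estimate of the integral of psi^2 towards 1,
    # excluding the trivial solution psi = 0.
    norm = torch.trapezoid(psi.squeeze() ** 2, x.squeeze())
    return (norm - 1.0) ** 2

def ortho_loss(psi1, psi2, x):
    # Drives the overlap integral of psi1*psi2 towards 0 (pairwise orthogonality).
    overlap = torch.trapezoid((psi1 * psi2).squeeze(), x.squeeze())
    return overlap ** 2

# Example with dummy profiles on x in [-1, 1]; in practice psi1, psi2 would be
# network outputs, and these terms would be added to the differential-equation loss.
x = torch.linspace(-1.0, 1.0, 200)
psi1 = torch.sin(torch.pi * x)
psi2 = torch.cos(0.5 * torch.pi * x)
total = norm_loss(psi1, x) + ortho_loss(psi1, psi2, x)
```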
Raphael Pellegrin, Blake Bullwinkel, Marios Mattheakis, and Pavlos Protopapas. 2022. “Transfer Learning with Physics-Informed Neural Networks for Efficient Simulation of Branched Flows.” In NeurIPS Workshop on Machine Learning and the Physical Sciences.
Physics-Informed Neural Networks (PINNs) offer a promising approach to solving differential equations and, more generally, to applying deep learning to problems in the physical sciences. We adopt a recently developed transfer learning approach for PINNs and introduce a multi-head model to efficiently obtain accurate solutions to nonlinear systems of ordinary differential equations with random potentials. In particular, we apply the method to simulate stochastic branched flows, a universal phenomenon in random wave dynamics. Finally, we compare the results achieved by feed-forward and GAN-based PINNs on two physically relevant transfer learning tasks and show that our methods provide significant computational speedups compared to standard PINNs trained from scratch.
2020
Alessandro Paticchio, Tommaso Scarlatti, Marios Mattheakis, Pavlos Protopapas, and Marco Brambilla. 12/2020. “Semi-supervised Neural Networks solve an inverse problem for modeling Covid-19 spread.” In 2020 NeurIPS Workshop on Machine Learning and the Physical Sciences. NeurIPS.

Studying the dynamics of COVID-19 is of paramount importance to understanding the efficiency of restrictive measures and developing strategies to defend against upcoming contagion waves. In this work, we study the spread of COVID-19 using a semi-supervised neural network and assuming that a passive part of the population remains isolated from the virus dynamics. We start with an unsupervised neural network that learns solutions of differential equations for different modeling parameters and initial conditions. A supervised method then solves the inverse problem by estimating the optimal conditions that generate functions to fit the data for those infected by, recovered from, and deceased due to COVID-19. This semi-supervised approach incorporates real data to determine the evolution of the spread, the passive population, and the basic reproduction number for different countries.

2020_covid_2010.05074.pdf
Henry Jin, Marios Mattheakis, and Pavlos Protopapas. 12/2020. “Unsupervised Neural Networks for Quantum Eigenvalue Problems.” In 2020 NeurIPS Workshop on Machine Learning and the Physical Sciences. NeurIPS.
Eigenvalue problems are critical to several fields of science and engineering. We present a novel unsupervised neural network for discovering eigenfunctions and eigenvalues for differential eigenvalue problems with solutions that identically satisfy the boundary conditions. A scanning mechanism is embedded allowing the method to find an arbitrary number of solutions. The network optimization is data-free and depends solely on the predictions. The unsupervised method is used to solve the quantum infinite well and quantum oscillator eigenvalue problems.
2020_eigenvalues_2010.05075.pdf
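The sketch below illustrates one common way such data-free solvers enforce boundary conditions exactly: the raw network output is multiplied by a function that vanishes on the boundary. The infinite-well domain [0, L], the network size, and the fixed eigenvalue are placeholders; in the paper the eigenvalue is learned and a scanning mechanism selects successive solutions.

```python
# Illustrative trial eigenfunction with hard boundary conditions psi(0)=psi(L)=0
# for the infinite square well; training would minimize the residual of -psi'' = E psi.
import torch

L = 1.0

class TrialFunction(torch.nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(2, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1))

    def forward(self, x, E):
        # The factor x*(L - x) enforces the boundary conditions identically,
        # so no boundary-penalty term is needed in the loss.
        raw = self.net(torch.cat([x, E.expand_as(x)], dim=1))
        return x * (L - x) * raw

model = TrialFunction()
x = torch.rand(64, 1, requires_grad=True)
E = torch.tensor([[9.87]])            # placeholder; learned jointly in practice
psi = model(x, E)
dpsi = torch.autograd.grad(psi.sum(), x, create_graph=True)[0]
d2psi = torch.autograd.grad(dpsi.sum(), x, create_graph=True)[0]
residual_loss = ((-d2psi - E * psi) ** 2).mean()   # data-free training objective
```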
2019
Marios Mattheakis, Matthias Maier, Wei Xi Boo, and Efthimios Kaxiras. 9/2019. “Graphene epsilon-near-zero plasmonic crystals.” In NANOCOM '19: Proceedings of the Sixth Annual ACM International Conference on Nanoscale Computing and Communication. Dublin, Ireland.
Plasmonic crystals are a class of optical metamaterials that consist of engineered structures at the sub-wavelength scale. They exhibit optical properties that are not found under normal circumstances in nature, such as negative-refractive-index and epsilon-near-zero (ENZ) behavior. Graphene-based plasmonic crystals present linear, elliptical, or hyperbolic dispersion relations that exhibit ENZ behavior and normal or negative-index diffraction. The optical properties can be dynamically tuned by controlling the operating frequency and the doping level of graphene. We propose a construction approach to expand the frequency range of the ENZ behavior. We demonstrate how combining a host material that has an optical Lorentzian response with a graphene conductivity that follows a Drude model leads to an ENZ condition spanning a large frequency range.
1906.00018.pdf
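For orientation, a commonly used effective-medium relation for a periodic graphene/dielectric stack (period d, graphene sheet conductivity σ) expresses the in-plane permittivity as below; the ENZ condition corresponds to its real part vanishing. This relation is quoted as standard background on such stacks, not from the paper itself.

```latex
% In-plane effective permittivity of a graphene/dielectric stack (standard
% effective-medium result; epsilon_d is the host response, sigma the graphene
% sheet conductivity, d the stacking period):
\[
  \varepsilon_{\parallel}(\omega) = \varepsilon_d(\omega)
  + i\,\frac{\sigma(\omega)}{\varepsilon_0\,\omega\,d},
  \qquad \text{ENZ:} \quad \operatorname{Re}\,\varepsilon_{\parallel}(\omega_{\mathrm{ENZ}}) = 0 .
\]
```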
2017
O. V. Shramkova, Marios Mattheakis, and G. P. Tsironis. 2017. “Amplification of surface plasmons in active nonlinear hyperbolic systems.” In 47th European Microwave Conference (EuMC), pp. 488-491. Nuremberg, Germany.
In this paper, we study the propagation of surface waves at the boundary between an amplifying isotropic medium and a hyperbolic metamaterial. We demonstrate that the gain material can be used to counterbalance the losses in the hyperbolic medium. We show that the gain-loss balance can be maintained even in the presence of nonlinear saturation, leading to surface wave amplification.
activesp_ieeeproceedings_2017.pdf
2014
C. Athanasopoulos, M. Mattheakis, and G. P. Tsironis. 2014. “Enhanced surface plasmon polariton propagation induced by active dielectrics.” In Excerpt from the Proceedings of the 2014 COMSOL Conference in Cambridge. Cambridge, UK: COMSOL.

We present numerical simulations for the propagation of surface plasmon polaritons (SPPs) in a dielectric-metal-dielectric waveguide using COMSOL Multiphysics software. We show that the use of an active dielectric, with gain that compensates the metal absorption losses, substantially enhances plasmon propagation. Furthermore, the introduction of the active material induces, for a specific gain value, a zero in the imaginary part of the propagation constant, leading to infinite propagation of the surface plasmon. The computational approaches analyzed in this work can be used to define and tune the optimal conditions for surface plasmon polariton amplification and propagation.

mattheakis_activeSPPsCOMSOL.pdf