Representations in neural network based empirical potentials

Citation:

Cubuk ED, Malone BD, Onat B, Waterland A, Kaxiras E. Representations in neural network based empirical potentials. Journal of Chemical Physics. 2017;147(2).

Date Published:

July 14, 2017

Abstract:

Many structural and mechanical properties of crystals, glasses, and biological macromolecules can be modeled from the local interactions between atoms. These interactions ultimately derive from the quantum nature of electrons, which can be prohibitively expensive to simulate. Machine learning has the potential to revolutionize materials modeling due to its ability to efficiently approximate complex functions. For example, neural networks can be trained to reproduce the results of density functional theory calculations at a much lower cost. However, how neural networks reach their predictions is not well understood, which has led to them being used as a "black box" tool. This lack of understanding is especially undesirable for applications of neural networks in scientific inquiry. We argue that machine learning models trained on physical systems can be used as more than just approximations, since they had to "learn" physical concepts in order to reproduce the labels they were trained on. We use dimensionality reduction techniques to study in detail the representation of silicon atoms at different stages in a neural network, which provides insight into how a neural network learns to model atomic interactions. Published by AIP Publishing.
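To make the approach described in the abstract concrete, the sketch below illustrates the general idea of inspecting per-layer representations of an atomistic neural network with dimensionality reduction. It is not the authors' code: the descriptors, network weights, architecture, and the use of a plain PCA projection are all placeholder assumptions standing in for the trained silicon potential and analysis used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder inputs: in the paper each atom is described by a vector of
# local-environment descriptors fed through a trained neural-network
# potential. Here random descriptors and random weights are used purely
# to show how per-layer representations can be collected and projected.
n_atoms, n_features = 500, 48
descriptors = rng.normal(size=(n_atoms, n_features))

layer_sizes = [n_features, 32, 32, 1]   # hypothetical architecture
weights = [rng.normal(scale=0.1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]


def forward_with_activations(x):
    """Run the network, keeping the representation after each hidden layer."""
    activations = [x]
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(x @ w + b)                        # hidden layers
        activations.append(x)
    activations.append(x @ weights[-1] + biases[-1])  # linear output (per-atom energy)
    return activations


def pca_2d(x):
    """Project a set of atomic representations onto their first two principal components."""
    centered = x - x.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T


reps = forward_with_activations(descriptors)
for i, r in enumerate(reps[:-1]):
    proj = pca_2d(r)
    print(f"layer {i}: representation dim {r.shape[1]}, "
          f"2D PCA spread = {proj.std(axis=0)}")
```

In an actual analysis, the 2D projections of the hidden-layer activations would be plotted and compared across layers to see how the network reorganizes atomic environments as information flows toward the energy output.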