Machine Learning Seminar

Transfer learning of neural network potentials for reactive chemistry


Jason Goodpaster
Department of Chemistry
College of Science & Engineering
University of Minnesota

Wednesday, November 18, 2020
3:30–4:30 pm

Online via Zoom

Large, condensed-phase, and extended systems pose a challenge for theoretical studies because of the compromise between accuracy and computational cost in their calculations. Machine learning methods offer a way around this trade-off: models are trained on highly accurate calculations of small molecules and then applied to larger systems. In this study, we are developing a method to train neural network potentials against high-level wavefunction theory on targeted systems of interest, yielding potentials that can describe bond breaking. We combine density functional theory (DFT) calculations with higher-level ab initio wavefunction calculations, such as CASPT2, to train our neural network potentials. We first train the neural network at the DFT level of theory and then, using an adaptive active-learning training scheme, retrain the potential to a CASPT2 level of accuracy. We demonstrate the process and report current progress and the performance of this neural network potential for molecular dynamics simulations.
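The transfer-learning idea described above can be illustrated with a minimal sketch. This is not the speaker's implementation: the network, the two surrogate "levels of theory" (Morse-like curves standing in for DFT and CASPT2 energies along a bond-stretch coordinate), and all hyperparameters are assumptions for illustration, and the active-learning point selection is omitted. The sketch pretrains a small neural network potential on many cheap "DFT" energies, then fine-tunes the same weights on a handful of expensive "CASPT2" energies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate energy surfaces along a 1-D bond coordinate r (not real data):
# a Morse-like "DFT" curve and a slightly shifted, rescaled "CASPT2" curve.
def e_dft(r):
    return (1.0 - np.exp(-1.5 * (r - 1.0))) ** 2

def e_caspt2(r):
    return 1.1 * (1.0 - np.exp(-1.6 * (r - 1.02))) ** 2

def init_params():
    # One-hidden-layer tanh network mapping r -> energy.
    return {"W1": rng.normal(0, 0.5, (1, 16)), "b1": np.zeros(16),
            "W2": rng.normal(0, 0.5, (16, 1)), "b2": np.zeros(1)}

def forward(p, r):
    h = np.tanh(r[:, None] @ p["W1"] + p["b1"])
    return (h @ p["W2"] + p["b2"]).ravel(), h

def train(p, r, e, epochs, lr):
    # Full-batch gradient descent on L = mean((pred - e)^2) / 2.
    for _ in range(epochs):
        pred, h = forward(p, r)
        g = (pred - e)[:, None] / len(r)        # dL/dpred
        gh = (g @ p["W2"].T) * (1.0 - h ** 2)   # backprop through tanh
        p["W2"] -= lr * (h.T @ g)
        p["b2"] -= lr * g.sum(axis=0)
        p["W1"] -= lr * (r[:, None].T @ gh)
        p["b1"] -= lr * gh.sum(axis=0)
    return p

# Step 1: pretrain on many cheap "DFT" energies.
r_dft = np.linspace(0.7, 2.5, 200)
params = train(init_params(), r_dft, e_dft(r_dft), epochs=5000, lr=0.1)
pretrained = {k: v.copy() for k, v in params.items()}

# Step 2: fine-tune the pretrained weights on a few expensive "CASPT2" energies
# (in the actual scheme these points would be chosen by active learning).
r_ft = np.linspace(0.7, 2.5, 15)
params = train(params, r_ft, e_caspt2(r_ft), epochs=3000, lr=0.1)

# Compare both models against the high-level surface on a dense test grid.
r_test = np.linspace(0.7, 2.5, 100)
rmse_pre = np.sqrt(np.mean((forward(pretrained, r_test)[0] - e_caspt2(r_test)) ** 2))
rmse_ft = np.sqrt(np.mean((forward(params, r_test)[0] - e_caspt2(r_test)) ** 2))
```

The pretrained network already captures the overall shape of the surface, so only a small high-level data set is needed to shift it toward CASPT2 accuracy, which is the economic argument for the transfer-learning approach.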

Jason Goodpaster is an assistant professor of chemistry at the University of Minnesota Twin Cities. His research focuses on the development of new quantum chemistry methods and their application to a wide variety of chemical systems, including metal-organic frameworks, inorganic catalysis, surface-enhanced Raman spectroscopy, and electrochemistry. Professor Goodpaster joined UMN in June 2016. Before that, he obtained his PhD at Caltech and performed his postdoctoral work at Lawrence Berkeley National Lab.