Neural Equivariant Interatomic Potentials

If you have a question about this talk, please contact Bingqing Cheng.

Representations of atomistic systems for machine learning must transform predictably under the geometric transformations of 3D space, in particular rotations, translations, mirror reflections, and permutations of atoms of the same species. These constraints are typically satisfied by building atomistic representations from scalar distances and angles, which leaves the representation invariant under the above transformations. Invariance, however, limits expressivity and can render representations incomplete.

To overcome this shortcoming, we recently introduced Neural Equivariant Interatomic Potentials (NequIP) [1], a graph neural network approach for learning interatomic potentials that uses an SE(3)-equivariant representation of atomic environments. While most current graph neural network interatomic potentials use invariant convolutions over scalar features, NequIP instead employs equivariant convolutions over geometric tensors (scalars, vectors, …), providing a more information-rich message-passing scheme.

In my talk, I will first motivate the choice of an equivariant representation for atomistic systems and demonstrate how it allows for the design of interatomic potentials of previously unattainable accuracy. I will discuss applications to a diverse set of molecules and materials, including small organic molecules, water in different phases, a catalytic surface reaction, glass formation of a lithium phosphate, and Li diffusion in a superionic conductor. I will then show that NequIP can predict structural and kinetic properties from molecular dynamics simulations in excellent agreement with ab initio simulations. Next, I will present the observation of a remarkable sample efficiency in equivariant interatomic potentials, which outperform existing neural network potentials with up to 1000x less training data and rival or even surpass the sample efficiency of kernel methods. Finally, I will discuss potential reasons for this high sample efficiency.
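To make the invariant-versus-equivariant distinction concrete, here is a minimal NumPy sketch (not the NequIP implementation; the function names are illustrative). It contrasts a scalar message, which is unchanged when the atomic neighbourhood is rotated, with a vector (l = 1) message, which rotates together with the input, the defining property of the geometric-tensor features NequIP passes between atoms. The actual model builds such features from spherical harmonics of relative position vectors combined via tensor products, as described in [1].

```python
# Minimal sketch (not the NequIP code) contrasting an invariant and an
# equivariant message on a toy atomic neighbourhood. All names here are
# illustrative assumptions for this example.
import numpy as np

rng = np.random.default_rng(0)

def invariant_message(r_ij, w=1.0):
    """Scalar message: depends only on the distance |r_ij|, so it is
    rotation-invariant."""
    d = np.linalg.norm(r_ij, axis=-1)
    return w * np.exp(-d)                    # shape (n_neighbours,)

def equivariant_message(r_ij, w=1.0):
    """Vector (l=1) message: a distance-dependent scalar times the unit
    direction vector, so it rotates with the input (rotation-equivariant)."""
    d = np.linalg.norm(r_ij, axis=-1, keepdims=True)
    return w * np.exp(-d) * (r_ij / d)       # shape (n_neighbours, 3)

# Toy neighbourhood: relative positions of 5 neighbours around a central atom.
r_ij = rng.normal(size=(5, 3))

# A random rotation matrix R (QR decomposition of a Gaussian matrix,
# with the sign fixed so that det(R) = +1).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R = Q * np.sign(np.linalg.det(Q))

# Invariance: scalar messages are unchanged by rotating the inputs.
assert np.allclose(invariant_message(r_ij), invariant_message(r_ij @ R.T))

# Equivariance: vector messages rotate exactly as the inputs do.
m = equivariant_message(r_ij)
m_rot = equivariant_message(r_ij @ R.T)
assert np.allclose(m_rot, m @ R.T)
```

The point of the contrast: the scalar message discards all directional information about the neighbourhood, while the vector message retains it in a form that transforms consistently under rotation, which is what makes equivariant features more information-rich than invariant ones.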

[1] S. Batzner et al., "E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials", https://arxiv.org/abs/2101.03164

This talk is part of the Machine learning in Physics, Chemistry and Materials discussion group (MLDG) series.
