University of Cambridge > Talks.cam > Machine learning in Physics, Chemistry and Materials discussion group (MLDG) > Neural Equivariant Interatomic Potentials
Neural Equivariant Interatomic Potentials
If you have a question about this talk, please contact Bingqing Cheng.

Representations of atomistic systems for machine learning must transform predictably under the geometric transformations of 3D space, in particular rotations, translations, mirror reflections, and permutations of atoms of the same species. These constraints are typically satisfied by atomistic representations built from scalar distances and angles, which leave the representation invariant under the above transformations. Invariance, however, limits expressivity and can render representations incomplete. To overcome this shortcoming, we recently introduced Neural Equivariant Interatomic Potentials (NequIP) [1], a graph neural network approach to learning interatomic potentials that uses an E(3)-equivariant representation of atomic environments. Whereas most current graph neural network interatomic potentials use invariant convolutions over scalar features, NequIP employs equivariant convolutions over geometric tensors (scalars, vectors, …), providing a more information-rich message-passing scheme. In my talk, I will first motivate the choice of an equivariant representation for atomistic systems and demonstrate how it enables the design of interatomic potentials at previously unattainable accuracy. I will discuss applications to a diverse set of molecules and materials, including small organic molecules, water in different phases, a catalytic surface reaction, glass formation of a lithium phosphate, and Li diffusion in a superionic conductor. I will then show that NequIP predicts structural and kinetic properties from molecular dynamics simulations in excellent agreement with ab initio simulations. The talk will also discuss the remarkable sample efficiency observed in equivariant interatomic potentials, which outperform existing neural network potentials with up to 1000× less training data and rival or even surpass the sample efficiency of kernel methods.
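The invariance/equivariance distinction at the heart of the abstract can be illustrated with a minimal NumPy sketch. This is not the NequIP implementation; the feature functions below are hypothetical toy examples, chosen only to show that scalar (distance-based) features are unchanged by a rotation, while a vector feature transforms with the same rotation applied to the atomic environment.

```python
import numpy as np

def random_rotation(seed=0):
    # QR decomposition of a random matrix gives an orthogonal matrix;
    # fix the sign so it is a proper rotation (det = +1).
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.linalg.det(q))

# Toy atomic environment: relative positions of neighbours around a centre atom.
neighbors = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.5, 0.0],
                      [0.0, 0.0, 2.0]])

def invariant_feature(x):
    # Scalar (l = 0) feature: sorted neighbour distances, rotation-invariant.
    return np.sort(np.linalg.norm(x, axis=1))

def equivariant_feature(x):
    # Vector (l = 1) feature: sum of unit vectors to the neighbours.
    # It rotates together with the environment rather than staying fixed.
    r = np.linalg.norm(x, axis=1, keepdims=True)
    return (x / r).sum(axis=0)

R = random_rotation()
rotated = neighbors @ R.T  # rotate every neighbour position by R

# Invariance: scalar features are identical before and after rotation.
assert np.allclose(invariant_feature(neighbors), invariant_feature(rotated))

# Equivariance: the vector feature transforms by the same rotation R.
assert np.allclose(equivariant_feature(rotated),
                   R @ equivariant_feature(neighbors))
```

The vector feature carries directional information that any purely invariant scalar feature discards, which is the sense in which equivariant message passing is "more information-rich".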
Finally, I will discuss potential reasons for the high sample efficiency of equivariant interatomic potentials.

[1] https://arxiv.org/abs/2101.03164

This talk is part of the Machine learning in Physics, Chemistry and Materials discussion group (MLDG) series.