Self-learning Monte Carlo method with equivariant Transformer

If you have a question about this talk, please contact Sven Krippendorf.

Machine learning and deep learning have revolutionized computational physics, particularly the simulation of complex systems. Equivariance is essential for simulating physical systems because it imposes a strong inductive bias on the probability distribution described by a machine learning model. However, imposing symmetry on the model can sometimes lead to poor acceptance rates in self-learning Monte Carlo (SLMC). Here, we introduce a symmetry-equivariant attention mechanism for SLMC, which can be systematically improved. We evaluate our architecture on a spin-fermion model (i.e., the double-exchange model) on a two-dimensional lattice. Our results show that the proposed method overcomes the poor acceptance rates of linear models and exhibits a scaling law similar to that of large language models, with model quality increasing monotonically with the number of layers [1]. Our work paves the way for more accurate and efficient machine-learning-assisted Monte Carlo algorithms for simulating complex physical systems.

[1] Y. Nagai and A. Tomiya, J. Phys. Soc. Jpn. 93, 114007 (2024).
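For context, the sketch below illustrates the two ingredients the abstract alludes to: the exact SLMC acceptance test, which corrects for the mismatch between the true action S and a cheap learned effective action S_eff, and one way to make a learned action exactly symmetric. This is a minimal Python/NumPy sketch, not the authors' implementation: the action values (s_old, s_new) and the scalar-valued model are hypothetical placeholders, and the group averaging shown is a simpler symmetrization trick than the equivariant attention architecture the talk describes.

import numpy as np

rng = np.random.default_rng(0)

def slmc_accept(s_old, s_new, s_eff_old, s_eff_new):
    # Standard SLMC Metropolis correction: proposals are generated by
    # cheap updates under the effective action S_eff, and accepting with
    # probability A = min(1, exp[-(S' - S) + (S_eff' - S_eff)]) restores
    # detailed balance with respect to the true distribution exp(-S).
    log_ratio = -(s_new - s_old) + (s_eff_new - s_eff_old)
    return np.log(rng.random()) < min(0.0, log_ratio)

def invariant_action(model, config):
    # Group averaging over the four rotations of a square lattice makes
    # any scalar-valued model exactly invariant under that symmetry.
    # (The talk's architecture instead builds equivariance into every
    # attention layer, which is what permits systematic improvement by
    # stacking more layers.)
    return np.mean([model(np.rot90(config, k)) for k in range(4)])

In a full SLMC simulation, the cheap effective action is evaluated many times per exact evaluation of the expensive fermionic action, which is where the speedup comes from; a high acceptance rate in the test above is therefore the figure of merit the abstract refers to.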

This talk is part of the DAMTP Data Intensive Science Seminar series.
