BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Self-learning Monte Carlo method with equivariant Transformer - Yu
 ki Nagai (University of Tokyo)
DTSTART:20250326T144500Z
DTEND:20250326T151500Z
UID:TALK229648@talks.cam.ac.uk
CONTACT:Sven Krippendorf
DESCRIPTION:Machine learning and deep learning have revolutionized computat
 ional physics\, particularly the simulation of complex systems. Equivari
 ance is essential for simulating physical systems because it imposes a str
 ong inductive bias on the probability distribution described by a machin
 e learning model. However\, imposing symmetry on the model can sometimes l
 ead to poor acceptance rates in self-learning Monte Carlo (SLMC). Here\, w
 e introduce a symmetry-equivariant attention mechanism for SLMC\, which ca
 n be systematically improved. We evaluate our architecture on a spin-ferm
 ion model (i.e.\, the double-exchange model) on a two-dimensional lattice
 . Our results show that the proposed method overcomes the poor acceptanc
 e rates of linear models and exhibits a scaling law similar to that of la
 rge language models\, with model quality increasing monotonically with th
 e number of layers [1]. Our work paves the way for the development of mor
 e accurate and efficient machine-learning-assisted Monte Carlo algorithm
 s for simulating complex physical systems.\n\n[1] Y. Nagai and A. Tomiya
 \, J. Phys. Soc. Jpn. 93\, 114007 (2024)
LOCATION:DAMTP\, MR11
END:VEVENT
END:VCALENDAR
