BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Equivariant N-centered representations for atomistic machine learn
 ing - Jigyasa Nigam\, Swiss Federal Institute of Technology Lausanne (EPFL
 )\, Switzerland
DTSTART:20220530T130000Z
DTEND:20220530T133000Z
UID:TALK173225@talks.cam.ac.uk
CONTACT:Dr Venkat Kapil
DESCRIPTION:The application of machine learning (ML) to the modeling of ma
 terials and molecules has proven to be extremely successful in acceleratin
 g the understanding\, design\, and characterization of materials. A major 
 factor in this success has been the development of representations of atom
 ic structures that reflect physics-based constraints of the targets. A cla
 ss of these structural descriptions is built by symmetrizing correlations 
 of an atom-density function that describes the associated atomic environm
 ent. These local objects are subsequently used to model corresponding ato
 mic properties\, or atomic contributions to a global observable. Howev
 er\, many quantum mechanical quantities\, such as the effective single-par
 ticle Hamiltonian projected on an atomic-orbital basis\, are associated w
 ith multiple atom centers\, rendering the atom-centered approach inadequat
 e to describe the additional degrees of freedom.\n\nWe recently proposed a
 n N-center representation [1] that extends the atom-centered framework to
  targets that are simultaneously indexed by N atoms. Devising th
 is family of multicenter representations opens avenues for new classes of 
 machine learning models that are fully equivariant and thus incorporate mo
 lecular symmetries. Additionally\, they can be used to construct integrat
 ed machine learning models\, for instance by computing N-center integrals
  that in turn assist electronic structure calculations. As these N
 -center representations provide information on multiple centers and their 
 connectivities\, they also serve as a framework to capture the ideas of 
 “message-passing” [2]\, which are widely employed in deep ML model
 s.\n\n[1] J. Nigam\, M. Willatt\, M. Ceriotti\, JCP 156\, 014115\, 2022
 \n\n[2] J. Nigam\, S. Pozdnyakov\, G. Fraux\, M. Ceriotti\, arXiv:2202.0
 1566
LOCATION:https://zoom.us/j/92447982065?pwd=RkhaYkM5VTZPZ3pYSHptUXlRSkppQT0
 9
END:VEVENT
END:VCALENDAR
