BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:An Introduction to Transformer Neural Processes - Matt Ashman and 
 Cristiana Diaconu
DTSTART:20240228T110000Z
DTEND:20240228T123000Z
UID:TALK212779@talks.cam.ac.uk
CONTACT:Isaac Reid
DESCRIPTION:Neural processes (NPs) have significantly improved since their
  inception. A principal factor in their effectiveness has been advancement
 s in the architecture of permutation-invariant set functions\, a notable
  example being transformer-based architectures. In this reading gro
 up session\, we will introduce participants to Transformer Neural Processe
 s (TNPs). We do not assume prior knowledge of NPs or transformers.\n\nUsef
 ul background reading:\n1) Transformer Neural Processes: Uncertainty-Aware
  Meta Learning Via Sequence Modeling. Nguyen and Grover (2022). https://a
 rxiv.org/abs/2207.04179\n2) Set Transformer: A Framework for Attention-bas
 ed Permutation-Invariant Neural Networks. Lee et al. (2018). https://arxiv
 .org/abs/1810.00825\n3) Latent Bottlenecked Attentive Neural Processes. Fe
 ng et al. (2022). https://arxiv.org/abs/2211.08458\n4) Attentive Neural Pr
 ocesses. Kim et al. (2019). https://arxiv.org/abs/1901.05761
LOCATION:Cambridge University Engineering Department\, CBL Seminar room BE
 4-38.
END:VEVENT
END:VCALENDAR
