BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:The Transformer (OOD) House of Cards - Petar Veličković (Google 
 DeepMind)
DTSTART:20241010T130000Z
DTEND:20241010T140000Z
UID:TALK222952@talks.cam.ac.uk
CONTACT:Sri Aitken
DESCRIPTION:The Transformer architecture has certainly been the landmark d
 eep learning model of recent years\, enabling seamless integration of inf
 ormation across many different modalities\, with surprisingly insightful
 behaviours emerging at scale. However\, despite the very challenging prob
 lems that are now within reach of Transformers\, they are seemingly unabl
 e to perform robustly on variations of comparatively much simpler problem
 s. We attribute this to shaky foundations: there are cer
 tain kinds of computations that are always going to be out of reach of Tra
 nsformers\, no matter how well we train them -- and a lot of such computat
 ions occur outside of the distribution the model was trained on. In this t
 alk\, I will outline some of these cracks in the system we've discovered\,
  as well as ideas for the way forward towards building generally intellige
 nt agents of the future.
LOCATION:Maxwell Centre
END:VEVENT
END:VCALENDAR
