BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Unsupervised cross-lingual representation learning - Sebastian Rud
 er (DeepMind)
DTSTART:20191122T120000Z
DTEND:20191122T130000Z
UID:TALK128482@talks.cam.ac.uk
CONTACT:James Thorne
DESCRIPTION:Abstract: Research in natural language processing (NLP) has se
 en many advances in recent years\, from word embeddings to pretraine
 d language models. Most of these approaches still rely on large labelled d
 atasets\, which has constrained their success to languages where such data
  is plentiful (mostly English). In this talk\, I will give an overview of 
 approaches that learn cross-lingual representations and enable us to scale
  NLP models to more of the world's 7\,000 languages. I will cover the spec
 trum of such cross-lingual representations\, from word embeddings to deep 
 pretrained models\, with a focus on unsupervised approaches and their limi
 tations. The talk will conclude with a discussion of the cutting edge of l
 earning such representations and future directions.\n\nBio: Sebastian is a
  research scientist at DeepMind\, London. He completed his PhD in Natural 
 Language Processing at the National University of Ireland while working as
  a research scientist at a Dublin-based NLP startup. Previously\, he studi
 ed Computational Linguistics at the University of Heidelberg\, Germany and
  at Trinity College\, Dublin. His main research interests are transfer and
  cross-lingual learning. He is also interested in helping make ML and NLP 
 more accessible. You can find him at his blog http://ruder.io/.\n
LOCATION:LT1\, Computer Laboratory
END:VEVENT
END:VCALENDAR
