BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Cross-lingual transfer learning with multilingual masked language 
 models - Professor Mamoru Komachi - Research Professor\, Tokyo Metropolita
 n University. Visiting academic at the Department of Computer Science and 
 Technology\, University of Cambridge
DTSTART:20231122T150500Z
DTEND:20231122T155500Z
UID:TALK204514@talks.cam.ac.uk
CONTACT:Ben Karniely
DESCRIPTION:This talk presents an exploration into Multilingual Masked Lan
 guage Models (MMLMs) as an emerging asset for cross-lingual transfer learn
 ing. The focus will be on introducing the mechanisms and applications that
  position MMLMs at the forefront of advancing multilingual capabilities in
  NLP.\n\nWe'll dissect the transformer architecture that underpins MMLMs\,
 delve into the masking mechanism\, and discuss the transfer-learning trai
 ning that enables these models to understand and generate multilingual tex
 t. The synergy between these components is critical for the model's lingui
 stic versatility.\n\nFurther\, the discussion will pivot to optimizing few
 -shot learning within the MMLM framework. By strategically annotating chal
 lenging instances\, we can amplify model performance. I'll present finding
 s on employing zero-shot learning techniques to identify such instances fo
 r cross-lingual transfer\, which could inform annotation strategies.\n\nAt
 tendees will gain a clear understanding of MMLMs\, informed by practical a
 pplications such as grammatical error correction and sentiment analysis\, 
 potentially stimulating further research in the domain.\n\nLink to join
  virtually: https://cam-ac-uk.zoom.us/j/81322468305\n\nA recording of this
  talk is available at the following link:
  https://www.cl.cam.ac.uk/seminars/wednesday/video/
LOCATION:Lecture Theatre 1\, Computer Laboratory\, William Gates Building
END:VEVENT
END:VCALENDAR
