BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Collaborative Pretraining on Evolving Pretraining and Small Manage
 able Tasks - Leshem Choshen (IBM AI research\, Hebrew University of Jerusa
 lem)
DTSTART:20231020T110000Z
DTEND:20231020T120000Z
UID:TALK207076@talks.cam.ac.uk
CONTACT:Richard Diehl Martinez
DESCRIPTION:Pretraining is monolithic. In this talk\, I will discuss a
  collaborative approach to pretraining\, by iterative model merging
  (originally fusing). We will then discuss making evaluation reliable and
  efficient\, so that anyone can evaluate. We may also touch on the BabyLM
  challenge of pretraining models on a human-feasible amount of data (if
  interested in more\, contact me\; BabyLM will be CoNLL's shared task
  next year as well).\n\nLeshem Choshen is a postdoctoral researcher at
  MIT-IBM\, aiming to collaboratively pretrain through model recycling\,
  efficient evaluation\, and efficient pretraining research (e.g.\,
  BabyLM). He received the postdoctoral Rothschild and Fulbright
  fellowships as well as the IAAI and Blavatnik best Ph.D. awards. With
  broad NLP and ML interests\, he has also worked on reinforcement
  learning\, evaluation\, and understanding how neural networks learn. In
  parallel\, he participated in Project Debater\, creating a machine that
  could hold a formal debate\, culminating in a Nature cover and a live
  debate.\n\nHe is also a dancer and runs tei.ma\, a food and science
  blog (NisuiVeTeima on Instagram\, Facebook\, and TikTok).
LOCATION:https://cam-ac-uk.zoom.us/j/86071371348?pwd=OVlqdDhZNHlGbzV5RUZrS
 zM1cUlhUT09#success
END:VEVENT
END:VCALENDAR
