BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Adapters in Transformers. A New Paradigm for Transfer Learning…?
  - Jonas Pfeiffer\, Technical University of Darmstadt
DTSTART:20210211T110000Z
DTEND:20210211T120000Z
UID:TALK157300@talks.cam.ac.uk
CONTACT:Marinela Parovic
DESCRIPTION:Adapters have recently been introduced as an alternative trans
 fer learning strategy. Instead of fine-tuning all weights of a pre-trained
  transformer-based model\, small neural network components are introduced 
 at every layer. The pre-trained parameters are frozen and only the newly i
 ntroduced adapter weights are fine-tuned\, encapsulating the downstream ta
 sk information in designated parts of the model. In this talk we will prov
 ide an introduction to adapter training in natural language processing. We
  will go into detail on how the encapsulated knowledge can be leveraged fo
 r compositional transfer learning\, as well as cross-lingual transfer. We 
 will briefly touch on the efficiency of adapters in terms of trainable par
 ameters as well as (wall-clock) training time. Finally\, we will provide a
 n outlook on recent alternative adapter approaches and training strategies
 .
LOCATION:https://cam-ac-uk.zoom.us/j/97599459216?pwd=QTRsOWZCOXRTREVnbTJBd
 XVpOXFvdz09
END:VEVENT
END:VCALENDAR
