Adapters in Transformers. A New Paradigm for Transfer Learning…?
- 👤 Speaker: Jonas Pfeiffer, Technical University of Darmstadt
- 📅 Date & Time: Thursday 11 February 2021, 11:00 - 12:00
- 📍 Venue: https://cam-ac-uk.zoom.us/j/97599459216?pwd=QTRsOWZCOXRTREVnbTJBdXVpOXFvdz09
Abstract
Adapters have recently been introduced as an alternative transfer learning strategy. Instead of fine-tuning all weights of a pre-trained transformer-based model, small neural network components are introduced at every layer. The pre-trained parameters are frozen, and only the newly introduced adapter weights are fine-tuned, encapsulating the downstream-task information in designated parts of the model. In this talk we will provide an introduction to adapter training in natural language processing. We will go into detail on how the encapsulated knowledge can be leveraged for compositional transfer learning, as well as cross-lingual transfer. We will briefly touch on the efficiency of adapters in terms of trainable parameters as well as (wall-clock) training time. Finally, we will provide an outlook on recent alternative adapter approaches and training strategies.
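For readers unfamiliar with the architecture the abstract describes, here is a minimal NumPy sketch of one bottleneck adapter: a down-projection, a nonlinearity, an up-projection, and a residual connection, inserted at a transformer layer while the pre-trained weights stay frozen. All names and sizes below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter applied to a hidden state h:
    down-project (d -> r), ReLU, up-project (r -> d), add residual."""
    z = np.maximum(0.0, h @ W_down)  # ReLU bottleneck activation
    return h + z @ W_up              # residual keeps the frozen model's signal

# Hypothetical sizes: hidden dimension d = 8, bottleneck dimension r = 2.
rng = np.random.default_rng(0)
d, r = 8, 2
h = rng.standard_normal((1, d))           # one token's hidden state
W_down = 0.1 * rng.standard_normal((d, r))
W_up = np.zeros((r, d))                   # near-identity init: adapter starts as a no-op

out = adapter(h, W_down, W_up)
# With W_up initialized to zeros, the adapter is exactly the identity at the start
assert np.allclose(out, h)
```

Because only `W_down` and `W_up` would be trained (here 2·d·r = 32 parameters versus the frozen model's weights), this illustrates the parameter-efficiency argument the abstract mentions.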
Series: This talk is part of the Language Technology Lab Seminars series.
Included in Lists
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- Guy Emerson's list
- Interested Talks
- Language Sciences for Graduate Students
- Language Technology Lab Seminars
- ndk22's list
- ob366-ai4er
- rp587
- Simon Baker's List
- Trust & Technology Initiative - interesting events
- yk449
Note: Ex-directory lists are not shown.