How to Pay Attention: Learning to Transfer Knowledge between Sentences and Tokens
- Speaker: Marek Rei (University of Cambridge)
- Date & Time: Thursday 09 May 2019, 11:00-12:00
- Venue: Board room, Faculty of English, 9 West Rd (Sidgwick Site)
Abstract
Self-attention architectures allow models to dynamically decide which areas of the input should receive more focus. During the construction of text representations, attention weights also provide a way of quantifying the importance of different input areas. In this talk, we investigate how attention mechanisms can be turned into sequence labelers, opening up some new and interesting applications. These networks learn to predict labels for individual tokens based only on sentence-level supervision, even without having seen any examples of sequence labeling. In addition, optimizing on the token level explicitly teaches the model where it should be focusing, leading to improvements in text classification. We will also discuss experiments with learning directly from the human cognitive signal, guiding the models to internally behave more like their users. The resulting architectures for text classification and sequence labeling are more accurate and more interpretable, and they make decisions in more predictable ways.
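The announcement contains no code, but the core idea in the abstract, reusing a model's attention scores as token-level predictions while training only on sentence-level labels, can be illustrated with a short sketch. The PyTorch example below is a minimal illustration under my own assumptions; the class name `AttentionLabeller`, the bidirectional LSTM encoder, and all hyperparameters are illustrative and not necessarily the architecture presented in the talk.

```python
import torch
import torch.nn as nn

class AttentionLabeller(nn.Module):
    """Sentence classifier whose attention scores double as token-level labels."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.score = nn.Linear(2 * hidden_dim, 1)     # unnormalised per-token score e_i
        self.classify = nn.Linear(2 * hidden_dim, 1)  # sentence-level prediction head

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))          # (batch, seq, 2*hidden)
        e = self.score(h).squeeze(-1)                    # (batch, seq) raw attention scores
        token_probs = torch.sigmoid(e)                   # read off as token "labels"
        alpha = torch.softmax(e, dim=-1)                 # attention weights
        sent_vec = (alpha.unsqueeze(-1) * h).sum(dim=1)  # attention-weighted pooling
        sent_prob = torch.sigmoid(self.classify(sent_vec)).squeeze(-1)
        return sent_prob, token_probs

# Training uses only sentence-level labels; token_probs are inspected at test time.
model = AttentionLabeller(vocab_size=10000)
x = torch.randint(0, 10000, (2, 7))   # toy batch: 2 sentences, 7 tokens each
y = torch.tensor([1.0, 0.0])          # sentence-level supervision only
sent_prob, token_probs = model(x)
loss = nn.functional.binary_cross_entropy(sent_prob, y)
loss.backward()
```

Because the same scores drive both the sentence prediction and the per-token outputs, supervising either level shapes the other, which is the knowledge transfer between sentences and tokens referred to in the title.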
Series: This talk is part of the Language Technology Lab Seminars series.
Included in Lists
- bld31
- Board room, Faculty of English, 9 West Rd (Sidgwick Site)
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- Guy Emerson's list
- Interested Talks
- Language Sciences for Graduate Students
- Language Technology Lab Seminars
- ndk22's list
- ob366-ai4er
- rp587
- Simon Baker's List
- Trust & Technology Initiative - interesting events
- yk449