Attention in sequence-to-sequence models
- 👤 Speaker: Jacky Kung
- 📅 Date & Time: Wednesday 20 October 2021, 19:00 - 19:30
- 📍 Venue: Online, via MS Teams
Abstract
Sequence-to-sequence models map sequential data from one domain to another, and are at the core of machine translation (MT) technologies such as Google Translate. This talk walks through the major milestones of neural machine translation. It also introduces the attention mechanism, which underpins virtually all state-of-the-art machine translation techniques in use today.
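As a taste of the topic, here is a minimal sketch of the attention idea in NumPy (an illustrative toy, not code from the talk): a decoder query is compared against the encoder states, the similarity scores are normalised with a softmax, and the result is a weighted average (the "context vector") of the encoder states.

```python
import numpy as np

def attention(query, keys, values):
    # Scores: dot-product similarity between the query and each encoder state.
    scores = keys @ query
    # Softmax (shifted by the max for numerical stability) turns scores
    # into non-negative weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted average of the encoder states.
    return weights @ values

# Toy example: three encoder states of dimension 2.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = keys
query = np.array([1.0, 0.0])  # the query aligns most with the 1st and 3rd states
context = attention(query, keys, values)
```

Because the query is most similar to the first and third encoder states, the context vector leans toward them; this soft, differentiable selection is what lets the decoder "attend" to the relevant parts of the source sentence.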
Series: This talk is part of the Churchill CompSci Talks series.