BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Attention in sequence-to-sequence models - Jacky Kung
DTSTART:20211020T180000Z
DTEND:20211020T183000Z
UID:TALK164278@talks.cam.ac.uk
CONTACT:Matthew Ireland
DESCRIPTION:Sequence-to-sequence models map sequential data from one dom
 ain to another\, and are at the core of machine translation (MT) techno
 logies such as Google Translate. In this talk\, we trace the major mile
 stones of neural machine translation and introduce the attention mechan
 ism\, which forms the basis of virtually all state-of-the-art machine t
 ranslation techniques today.
LOCATION:Online\, via MS Teams
END:VEVENT
END:VCALENDAR
