BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Transformer: the 3rd generation neural network acoustic models for
  ASR and its application at Facebook - Yongqiang Wang\, Facebook
DTSTART:20201130T150000Z
DTEND:20201130T160000Z
UID:TALK154384@talks.cam.ac.uk
CONTACT:Dr Kate Knill
DESCRIPTION:Since the introduction of deep learning to automatic speech
  recognition (ASR)\, neural network architectures have evolved rapidly
  from feed-forward networks to recurrent networks. Recently\, in the
  natural language processing area\, transformer-based sequence modeling
  has demonstrated strong results over recurrent network-based models\, in
  terms of both modeling accuracy and inference speed. However\, it is
  non-trivial to adopt the transformer architecture in speech recognition
  due to unique ASR requirements such as streaming processing. In this
  talk\, we show how the transformer architecture can be modified to fit
  different latency requirements for a range of speech applications.
  Specifically\, we augment the attention module in the transformer with a
  set of memory slots\, resulting in an efficient memory transformer\,
  Emformer. We compare Emformer with LSTM-based acoustic models under both
  low-latency and medium-latency scenarios\, on the widely used LibriSpeech
  benchmark and a series of industrial-scale tasks whose training data
  ranges from 9K hours to 2.2M hours. We show that on the medium-latency
  tasks\, Emformer provides 10-20% error reduction and a 2-3x inference
  speed-up\; on the low-latency task\, Emformer achieves a similar word
  error rate reduction at the cost of slightly increased real-time factors
  (RTF). By presenting these results\, we hope to convince the audience
  that the transformer can become the third generation of neural acoustic
  models for both traditional hybrid and end-to-end ASR systems.
LOCATION:Zoom: https://zoom.us/j/94591123432?pwd=bUJObFZ3UnFYLy9pWENDcS9aY
 UZqUT09
END:VEVENT
END:VCALENDAR
