BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Deep Structured Prediction for Handwriting Recognition - Juan Muri
 llo Fuentes
DTSTART:20171102T133000Z
DTEND:20171102T150000Z
UID:TALK94213@talks.cam.ac.uk
CONTACT:Alessandro Davide Ialongo
DESCRIPTION:Work in collaboration with P. M. Olmos and J.C.A. Jaramillo\n\n
 *Abstract:*\n\nStructured prediction or structured (output) learning is 
 an umbrella term for supervised machine learning techniques that involve 
 predicting structured objects\, rather than scalar discrete or real 
 values. Application domains include bioinformatics\, natural language 
 processing\, speech recognition\, and computer vision. While 
 convolutional networks are quite useful for this task\, recurrent 
 neural networks (RNNs) are preferred when long-term dependencies in the 
 input must be captured. However\, the problem of vanishing/exploding 
 gradients prevents the use of simple RNNs\, so long short-term memory 
 (LSTM) networks are widely adopted instead. In this talk we focus on 
 the handwritten text recognition problem. We review LSTMs as the 
 state-of-the-art solution among deep learning architectures. Then\, to 
 cope with the problem of labelling letters in an image\, we explain 
 connectionist temporal classification (CTC)\, a useful tool whose cost 
 function can be incorporated into the network to allow for gradient 
 computation. We discuss some results obtained in an attempt to put this 
 theory to work with TensorFlow.\n\n*Reading list:*\n\nNo prior reading 
 is needed beyond general concepts of deep neural networks.\n\n
 *Useful references:*\n\n- I. Goodfellow\, Y. Bengio\, A. Courville\, 
 "Deep Learning"\, MIT Press\, 2016\, chapters 6 to 9 for concepts of 
 deep learning\, chapter 10 in particular for recurrent networks (but 
 see below for LSTMs).\n\n- M. Görner\, "TensorFlow and Deep Learning 
 without a PhD"\, https://codelabs.developers.google.com/codelabs/
 cloud-tensorflow-mnist/#0\, a quick review of the main concepts of 
 deep learning\, with examples.\n\n- Z. C. Lipton\, J. Berkowitz\, C. 
 Elkan\, "A Critical Review of Recurrent Neural Networks for Sequence 
 Learning"\, 2015\, https://arxiv.org/abs/1506.00019\n\n- K. Cho\, 
 "Natural Language Understanding with Distributed Representation"\, 
 2016\, for a detailed explanation of LSTMs\n\n- C. Olah\, "Understanding 
 LSTM Networks"\, 2015\, http://colah.github.io/posts/2015-08-
 Understanding-LSTMs/\, for a good explanation of LSTMs\n\n- A. Graves\, 
 "Supervised Sequence Labelling with Recurrent Neural Networks"\, Ph.D. 
 thesis\, 2008\, for an explanation of CTC
LOCATION:Engineering Department\, CBL Seminar Room 4-38
END:VEVENT
END:VCALENDAR
