BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Deep Learning - Professor Geoffrey Hinton FRS (U. Toronto and Goog
 le)
DTSTART:20150625T100000Z
DTEND:20150625T110000Z
UID:TALK59916@talks.cam.ac.uk
CONTACT:Zoubin Ghahramani
DESCRIPTION:I will describe an efficient\, unsupervised learning procedure
  for a simple type of two-layer neural network called a Restricted Boltzma
 nn Machine.  I will then show how this algorithm can be used recursively t
 o learn multiple layers of features without requiring any supervision. Aft
 er this unsupervised "pre-training"\, the features in all layers can be fi
 ne-tuned to be better at discriminating between classes by using the stand
 ard backpropagation procedure from the 1980s. Unsupervised pre-training gr
 eatly improves generalization to new data\, especially when the number of 
 labelled examples is small.  Ten years ago\, the pre-training approach ini
 tiated a revival of research on deep\, feedforward neural networks. I will
  describe some of the major successes of deep networks for speech recognit
 ion\, object recognition and machine translation and I will speculate abou
 t where this research is headed. The fact that backpropagation learning is
  now the method of choice for a wide variety of really difficult tasks mea
 ns that neuroscientists may need to reconsider their well-worn arguments a
 bout why it cannot possibly be occurring in cortex. I shall conclude by un
 dermining two of the commonest objections to the idea that cortex is actua
 lly backpropagating error derivatives through a hierarchy of cortical area
 s and I shall show that spike-time dependent plasticity is a signature of 
 backpropagation.\n
LOCATION:Cambridge University Engineering Department\, Lecture Theatre 0
END:VEVENT
END:VCALENDAR
