BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Scalable Deep Gaussian Processes - James Hensman (University of Sh
 effield)
DTSTART:20140701T100000Z
DTEND:20140701T110000Z
UID:TALK52849@talks.cam.ac.uk
CONTACT:Dr Jes Frellsen
DESCRIPTION:Gaussian process (GP) models are useful machine learning
  tools\, but are fundamentally shallow models\, and the ability to learn
  features is restricted to adaptations of the covariance function. To
  achieve deep learning with Gaussian processes\, they need to be stacked
  such that the output of the first GP becomes the input to the next
  [Damianou and Lawrence 2013]. This leads to a massive latent variable
  model\, where we need to infer the hidden values corresponding to the
  outputs (inputs) of each GP layer. In this work\, I'll present a novel
  variational technique that collapses these latent variables by
  maintaining some of the prior model structure. The result is a
  variational bound on the model's marginal likelihood\, which can be
  optimised with respect to a series of inducing inputs and inducing
  outputs for each GP layer. The form of this variational bound
  corresponds to a regularised neural network\, where the Gaussian
  messages are fed forward and backpropagated. This new variational
  bound can be optimised in a scalable fashion by parallelisation or
  stochastic methods. I'll present an investigation of this bound and
  some preliminary results.\n\nDamianou\, A. and Lawrence\, N.D.\, "Deep
  Gaussian Processes." Proceedings of the Sixteenth International
  Conference on Artificial Intelligence and Statistics\, 2013.
LOCATION:Engineering Department\, CBL Room BE-438.
END:VEVENT
END:VCALENDAR
