BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Efficient Inference and Learning with Intractable Posteriors? Yes
 \, Please. - Diederik P. Kingma (University of Amsterdam)
DTSTART:20151014T100000Z
DTEND:20151014T110000Z
UID:TALK60264@talks.cam.ac.uk
CONTACT:Dr Jes Frellsen
DESCRIPTION:We discuss a number of recent advances in Stochastic Gradient 
 Variational Inference (SGVI).\n\n* Blending ideas from variational inferen
 ce\, deep learning and stochastic optimization\, we derive an algorithm fo
 r efficient gradient-based inference and learning with intractable posteri
 ors.\n\n* Applied to deep latent-variable models with neural networks as c
 omponents\, this results in the Variational Auto-Encoder (VAE)\, a princip
 led Bayesian auto-encoder. We show that VAEs can be useful for semi-superv
 ised learning and analogical reasoning.\n\n* Further improvements are real
 ized through a new variational bound with auxiliary variables. Markov 
 Chain 
 Monte Carlo (MCMC) can be cast as variational inference with auxiliary var
 iables\; this interpretation allows principled optimization of MCMC parame
 ters to greatly improve MCMC efficiency.\n\n* When applying SGVI to global
  parameters\, we show how an order-of-magnitude variance reduction can 
 be achieved through local reparameterization while retaining parallelizabi
 lity. Gaussian Dropout can be cast as a special case of such SGVI with a s
 cale-free prior. This variational interpretation of dropout allows for sim
 ple optimization of dropout rates.\n
LOCATION:Engineering Department\, CBL Room BE-438
END:VEVENT
END:VCALENDAR
