BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Convergent and Scalable Algorithms for Expectation Propagation App
 roximate Bayesian Inference - Matthias Seeger\, EPFL
DTSTART:20120925T140000Z
DTEND:20120925T150000Z
UID:TALK40067@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:Abstract:\n\nThe expectation propagation (or adaptive TAP) rel
 axation stands out among variational relaxations of Bayesian inference wh
 en it comes to generality and accuracy of results. It is widely used in m
 achine learning today.\nApplied to large-scale continuous-variable models
  for inverse problems in imaging and computer vision\, commonly used solv
 ers lack convergence proofs and are too slow to be useful. In this talk\,
  we describe a novel EP algorithm that is both provably convergent and sc
 alable to large\, densely connected models\, drawing a connection between
  the double loop algorithm of Opper and Winther (JMLR 2005) and earlier w
 ork by the author on scalable algorithms for simpler relaxations. Even fo
 r problems of moderate size (such as Gaussian process classification with
  a few thousand training points)\, the new algorithm converges at least a
 n order of magnitude faster than the standard (sequential) EP algorithm.
 \n\nPartly joint work with Hannes Nickisch.\n\n\nSpeaker information:\nPr
 ofessor Matthias Seeger runs the Laboratory for Probabilistic Machine Lea
 rning (LAPMAL) at EPFL\, http://lapmal.epfl.ch/
LOCATION:Small lecture theatre\, Microsoft Research Ltd\, 7 J J Thomson Av
 enue (Off Madingley Road)\, Cambridge
END:VEVENT
END:VCALENDAR
