BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Variational methods continued - David Knowles\, Machine learning g
 roup\, University of Cambridge
DTSTART:20100303T163000Z
DTEND:20100303T173000Z
UID:TALK23576@talks.cam.ac.uk
CONTACT:Richard Samworth
DESCRIPTION:I will continue Silvia's discussion of variational methods\, a
 nd try to\nanswer some of the questions raised during her talk. We saw how
  mean\nfield inference can give a lower bound on the log partition functio
 n\nZ. I will describe a general message passing framework based on alpha\n
 divergences\, which has mean field and expectation propagation (EP\, a\nge
 neralisation of belief propagation for continuous random variables)\nas sp
 ecial cases\, as well as tree-reweighted belief propagation\, which\ncan g
 ive an upper bound on Z. EP will be shown to greatly outperform\nboth Gibb
 s sampling and mean field on certain problems. If there is\nenough tim
 e\, I will present some work on choosing the optimal tree to\napproximat
 e a loopy graph or even the right distribution over trees.\n
LOCATION:MR5\, CMS
END:VEVENT
END:VCALENDAR
