BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Divergence measures and message passing - CANCELLED - David Knowle
 s (University of Cambridge)
DTSTART:20090205T140000Z
DTEND:20090205T153000Z
UID:TALK15406@talks.cam.ac.uk
CONTACT:Shakir Mohamed
DESCRIPTION:Unfortunately I've picked up a horrible fluey thing so I am ca
 ncelling. Apologies to all.\n\nThis paper by Tom Minka presents a unifying
  view of message-passing algorithms\, as methods to approximate a complex 
 Bayesian network by a simpler network with minimum information divergence.
  In this view\, the difference between mean-field methods and belief propa
 gation is not the amount of structure they model\, but only the measure of
 loss they minimize: 'exclusive' versus 'inclusive' Kullback-Leibler diver
 gence. Both these divergence measures can be viewed as examples of alpha-d
 ivergence for specific values of alpha. In each case\, message-passing ari
 ses by minimizing a localized version of the divergence\, local to each fa
 ctor. By examining these divergence measures\, we can intuit the types of 
 solution they prefer (symmetry-breaking\, for example) and their suitabili
 ty for different tasks. Furthermore\, by considering a wider variety of di
 vergence measures (such as alpha-divergences)\, we can achieve different c
 omplexity and performance goals.\n\nftp://ftp.research.microsoft.com/pub/t
 r/TR-2005-173.pdf
LOCATION:Engineering Department\, CBL Room 438
END:VEVENT
END:VCALENDAR
