BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Approximate Inference for the Loss-Calibrated Bayesian - Simon Lac
 oste-Julien (Cambridge)
DTSTART:20110218T160000Z
DTEND:20110218T170000Z
UID:TALK28560@talks.cam.ac.uk
CONTACT:Richard Nickl
DESCRIPTION:Bayesian decision theory provides a well-defined theoretical f
 ramework\nfor rational decision making under uncertainty. However\, even i
 f we\nassume that our subjective beliefs about the world have been\nwell-s
 pecified\, we usually need to resort to approximations in order\nto use th
 em in practice. Despite the central role of the loss in the\ndecision theo
 ry formulation\, most prevalent Bayesian approximation\nmethods focus on a
 pproximating the posterior over parameters with no\nconsideration of the l
 oss. In this talk\, our main point is to bring\nback into focus the need t
 o *calibrate* the approximation methods to the\n*loss* under consideration
 . This philosophy has already been widely\napplied in the frequentist statis
 tics / discriminative machine\nlearning literature\, as for example with t
 he use of surrogate loss\nfunctions\, but surprisingly not in Bayesian sta
 tistics. We provide\nexamples showing the limitation of disregarding the l
 oss in standard\napproximate inference schemes and outline several interes
 ting research\ndirections arising from this new perspective. As a first\nl
 oss-calibrated attempt\, we propose an EM-like algorithm on the\nBayesian 
 posterior risk and show how it can improve a standard\napproach to Gaussia
 n process classification when the losses are\nasymmetric.\n\nhttp://mlg.en
 g.cam.ac.uk/slacoste/\n
LOCATION:MR12\, CMS\, Wilberforce Road\, Cambridge\, CB3 0WB
END:VEVENT
END:VCALENDAR
