BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Approximate Inference for the Loss-Calibrated Bayesian - Simon Lac
 oste-Julien (University of Cambridge)
DTSTART:20100726T100000Z
DTEND:20100726T110000Z
UID:TALK25414@talks.cam.ac.uk
CONTACT:Emli-Mari Nel
DESCRIPTION:Bayesian decision theory provides a well-defined theoretical f
 ramework for rational decision making under uncertainty. However\, even if
  we assume that our subjective beliefs about the world have been well-spec
 ified\, we usually need to resort to approximations in order to use them i
 n practice. Despite the central role of the loss in the decision theory fo
 rmulation\, most prevalent approximation methods seem to focus on approxim
 ating the posterior over parameters without consideration of the loss. In 
 this talk\, our main point is to bring back in focus the need to *calibrat
 e* the approximation methods to the *loss* under consideration. This philo
 sophy has already been widely applied in the frequentist statistics / disc
 riminative machine learning literature\, as for example with the use of su
 rrogate loss functions. In contrast\, the “loss-calibrated” approximat
 ion approach seems to have been mainly limited in Bayesian statistics to s
 imple settings and losses such as regression with quadratic loss or hypoth
 esis testing with 0-1 loss. We provide examples showing the limitation of 
 disregarding the loss in standard approximate inference schemes and explor
 e loss-calibrated alternatives.
LOCATION:TCM Seminar Room\, Cavendish Laboratory\, Department of Physics
END:VEVENT
END:VCALENDAR
