BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Novel MCMC and SMC schemes for Poisson-Kingman Bayesian Nonparamet
 ric mixture models - Maria Lomeli (University of Cambridge)
DTSTART:20160602T093000Z
DTEND:20160602T103000Z
UID:TALK66448@talks.cam.ac.uk
CONTACT:Louise Segar
DESCRIPTION:According to Ghahramani\, models that have a nonparametric co
 mponent offer more flexibility\, which could lead to better predictive per
 formance. This is because their capacity to learn does not saturate\; hen
 ce their predictions should continue to improve as we get more and more d
 ata. Furthermore\, uncertainty about predictions can be fully accounted f
 or thanks to the Bayesian paradigm. However\, a major impediment to the w
 idespread use of Bayesian nonparametric models is the problem of inferenc
 e. Over the years\, many Markov chain Monte Carlo (MCMC) methods have bee
 n proposed\, but they have the shortcoming of not being general purpos
 e: they usually rely on a tailored representation of the underlying proce
 ss. This is an active research area because dealing with the infinite-dim
 ensional component forbids the direct use of standard simulation-based me
 thods. Existing methods require a finite-dimensional representation\, an
 d there are two main sampling approaches to facilitate simulation: rando
 m truncation and marginalization. These two schemes are known in the lite
 rature as conditional and marginal samplers.\n\nIn this talk\, I will rev
 iew existing inference schemes and introduce a novel MCMC scheme for post
 erior sampling in Bayesian nonparametric mixture models with priors tha
 t belong to the general Poisson-Kingman class. This general scheme relie
 s on a characterization derived from the size-biased sampling generative p
 rocess for Poisson-Kingman priors. It leads to a new\, compact way of re
 presenting the infinite-dimensional component of the model such that\, wh
 ile explicitly representing this component\, it has lower memory and stor
 age requirements than previous MCMC schemes. I will present comparative s
 imulation results demonstrating the efficacy of the proposed MCMC algorit
 hm against existing marginal and conditional MCMC samplers for the σ-Stab
 le Poisson-Kingman subclass. Surprisingly\, the size-biased sampling char
 acterization can also be used to build a Sequential Monte Carlo (SMC) sam
 pler\, which allows us to perform inference in a sequential scenario. I w
 ill briefly introduce our SMC scheme and present its computational perfor
 mance.\n\nIn the spirit of probabilistic programming\, we view our contr
 ibutions as a step towards wider usage of flexible Bayesian nonparametri
 c models\, as they allow automated inference in probabilistic programs bu
 ilt out of a wide variety of Bayesian nonparametric building blocks.\n\nM
 ain references\n\nLomeli\, M.\, Favaro\, S.\, Teh\, Y. W.\, 2015\, "A h
 ybrid sampler for Poisson-Kingman mixture models"\, Neural Information P
 rocessing Systems.\n\nLomeli\, M.\, Favaro\, S.\, Jacob\, P. E. and Teh
 \, Y. W.\, "An SMC sampler for Gibbs-type infinite and finite mixture mo
 dels"\, in preparation.
LOCATION:Engineering Department\, CBL Room BE-438
END:VEVENT
END:VCALENDAR
