BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Nonparametric Bayesian time series models: infinite HMMs and beyon
 d - Ghahramani\, Z (Cambridge)
DTSTART:20080620T151000Z
DTEND:20080620T161000Z
UID:TALK12480@talks.cam.ac.uk
CONTACT:Mustapha Amrani
DESCRIPTION:Hidden Markov models (HMMs) are one of the most widely used st
 atistical models for time series. Traditionally\, HMMs have a known struct
 ure with a fixed number of states and are trained using maximum likelihood
  techniques. The infinite HMM (iHMM) allows a potentially unbounded number
  of hidden states\, letting the model use as many states as it needs for t
 he data (Beal\, Ghahramani and Rasmussen 2002). Teh\, Jordan\, Beal and Bl
 ei (2006) showed that a form of the iHMM could be derived from the Hierarc
 hical Dirichlet Process\, and described a Gibbs sampling algorithm based o
 n this for the iHMM. I will talk about recent work we have done on infinit
 e HMMs. In particular: we now have a much more efficient inference algori
 thm based on dynamic programming\, called 'Beam Sampling'\, which should m
 ake it possible to apply iHMMs to larger problems. We have also developed 
 a factorial version of the iHMM which makes it possible to have an unbound
 ed number of binary state variables\, and can be thought of as a time-seri
 es generalization of the Indian buffet process.\n\nJoint work with Jurgen 
 van Gael (Cambridge)\, Yunus Saatci (Cambridge) and Yee Whye Teh (Gatsby U
 nit\, UCL).\n
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
