BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Bayesian Computation Without Tears: Probabilistic Programming and 
 Universal Stochastic Inference - Vikash Mansinghka\, MIT
DTSTART:20111121T110000Z
DTEND:20111121T120000Z
UID:TALK34618@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:Latent variable modeling and Bayesian inference are appealing
  in theory: they provide a unified mathematical framework for solving a
  wide range of machine learning problems. In practice\, however\, they
  are often difficult to apply effectively. Accurate inference in even
  simple models can seem computationally intractable\, while more
  realistic models are difficult even to write down precisely.\n\nIn this
  talk\, I will introduce new probabilistic programming technology that
  alleviates many of these difficulties. Unlike graphical models\, which
  marry statistics with graph theory\, probabilistic programming marries
  Bayesian inference with universal computation. Probabilistic
  programming can make it easier to build useful\, fast machine learning
  software that goes significantly beyond graphical models in
  flexibility and power. I will illustrate probabilistic programming
  using page-long probabilistic programs that break simple CAPTCHAs by
  running randomized CAPTCHA generators backwards and that interpret
  noisy time-series data from clinical medicine.\n\nI will also present
  CrossCat\, a black-box\, parameter-free\, fully Bayesian machine
  learning method based on an optimized engine for one probabilistic
  program that learns simple but flexible probabilistic programs from
  data. CrossCat estimates the full joint distribution underlying
  high-dimensional datasets\, including the noisy\, incomplete tables
  that come from modern database systems. It can also efficiently
  simulate from any of its finite-dimensional conditional distributions
  and accurately solves problems of prediction\, imputation\, feature
  selection and classification.\n\nThroughout\, I will highlight the ways
  probabilistic programming points the way to a new model of
  computation\, based on universal inference over distributions rather
  than universal calculation of functions\, and exposes the mathematical
  and algorithmic structure needed to engineer efficient\, distributed
  machine learning systems. I will include a brief discussion of
  natively probabilistic hardware that carries these principles down to
  the physical level.\n\nI will also touch on the directions this model
  opens up for research in computational complexity\, including steps
  towards an explanation of the unreasonable effectiveness of simple\,
  randomized algorithms on apparently intractable problems\, and in
  programming languages and artificial intelligence.\n
LOCATION:Small lecture theatre\, Microsoft Research Ltd\, 7 J J Thomson Av
 enue (Off Madingley Road)\, Cambridge
END:VEVENT
END:VCALENDAR
