Accelerated Bayesian inference using deep learning
- Speaker: Adam Moss (Nottingham)
- Date & Time: Wednesday 12 June 2019, 11:00-11:30
- Venue: Battcock Tea area
Abstract
I introduce a novel Bayesian inference tool that uses a neural network to parameterise efficient Markov Chain Monte Carlo (MCMC) proposals. The target distribution is first transformed into a diagonal, unit-variance Gaussian by a series of non-linear, invertible, non-volume-preserving flows. Neural networks are extremely expressive and can transform complex targets into a simple latent representation from which one can sample efficiently. Using this method, I develop a nested MCMC sampler and find excellent performance on highly curved and multi-modal analytic likelihoods. I also demonstrate it on Planck 2015 data, showing accurate parameter constraints, and calculate the evidence for simple one-parameter extensions to LCDM in a ~20-dimensional parameter space.
Series This talk is part of the Cavendish Astrophysics Coffee talks series.
Included in Lists
- All Cavendish Laboratory Seminars
- All Talks (aka the CURE list)
- Battcock Tea area
- Cavendish Astrophysics Seminars
- Centre for Health Leadership and Enterprise
- Combined External Astrophysics Talks DAMTP
- Cosmology, Astrophysics and General Relativity
- Featured lists
- ME Seminar
- Neurons, Fake News, DNA and your iPhone: The Mathematics of Information
- School of Physical Sciences
- Thin Film Magnetic Talks