A tutorial on diffusion models
- 👤 Speakers: Emile and Sasha (CBL)
- 📅 Date & Time: Wednesday 09 November 2022, 11:00 - 12:30
- 📍 Venue: Cambridge University Engineering Department, CBL Seminar room BE4-38. Zoom link available upon request.
Abstract
Score-based generative models (SGMs) are a powerful class of generative models with remarkable empirical performance. Although SGMs gained widespread popularity by performing astonishingly well on text-to-image generation tasks (DALL-E, Stable Diffusion), it has recently been shown that diffusion-based models can reach state-of-the-art quality in many other generative-modeling domains, including computer vision, chemistry, NLP, and climate modeling. Our talk is designed as a tutorial; no prior knowledge of diffusion models is required. We will cover i) the basic techniques SGMs rely on (Langevin dynamics and score matching); ii) discrete-time and then iii) continuous-time diffusion models; and iv) the relationship between the variational and score-based perspectives.
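To give a flavour of the first topic, here is a minimal sketch of unadjusted Langevin dynamics, one of the basic techniques mentioned above. It assumes the score (the gradient of the log-density) is known in closed form; for illustration we use a standard Gaussian target, whose score is simply -x. In the tutorial's setting this score would instead be learned by score matching. All function names here are illustrative, not from any of the listed papers.

```python
import numpy as np

def score_std_normal(x):
    # Score of a standard Gaussian target: grad_x log N(x; 0, 1) = -x.
    return -x

def langevin_sample(score_fn, n_samples=2000, n_steps=1000, step=0.01, rng=None):
    # Unadjusted Langevin dynamics:
    #   x_{k+1} = x_k + (step / 2) * score(x_k) + sqrt(step) * z_k,  z_k ~ N(0, 1)
    # For small step sizes and many steps, iterates approximately sample
    # from the target density whose score is score_fn.
    rng = np.random.default_rng(0) if rng is None else rng
    x = 5.0 * rng.normal(size=n_samples)  # deliberately poor initialisation
    for _ in range(n_steps):
        z = rng.normal(size=x.shape)
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * z
    return x

samples = langevin_sample(score_std_normal)
print(samples.mean(), samples.std())  # both should be close to (0, 1)
```

Despite starting far from the target, the chain relaxes toward the standard Gaussian; SGMs replace the closed-form score with a neural network trained by (denoising) score matching.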
No required reading, but we suggest looking at this beforehand for deeper engagement :) https://arxiv.org/abs/2011.13456
Please find below a non-exhaustive list of relevant papers (which we will mention or cover):
• A. Hyvärinen. Estimation of Non-Normalized Statistical Models by Score Matching. 2005
• J. Sohl-Dickstein, E. Weiss, N. Maheswaranathan, and S. Ganguli. Deep unsupervised learning using nonequilibrium thermodynamics. 2015
• P. Vincent. A connection between score matching and denoising autoencoders. 2011
• Y. Song and S. Ermon. Generative modeling by estimating gradients of the data distribution. 2019
• J. Ho, A. Jain, and P. Abbeel. Denoising diffusion probabilistic models. 2020
• V. De Bortoli, J. Thornton, J. Heng, and A. Doucet. Diffusion Schrödinger bridge with applications to score-based generative modeling. 2021
• Y. Song, J. Sohl-Dickstein, D. P. Kingma, A. Kumar, S. Ermon, and B. Poole. Score-Based Generative Modeling through Stochastic Differential Equations. 2021
• C.-W. Huang, J. H. Lim, and A. C. Courville. A variational perspective on diffusion-based generative models and score matching. 2021
• Y. Song, C. Durkan, I. Murray, and S. Ermon. Maximum likelihood training of score-based diffusion models. 2021
Series: This talk is part of the Machine Learning Reading Group @ CUED series.