BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:On diffusion-based generative models and their error bounds: The l
 og-concave case with full convergence estimates - Ying Zhang (Hong Kong Un
 iversity of Science and Technology (Guangzhou))
DTSTART:20240718T090000Z
DTEND:20240718T100000Z
UID:TALK218941@talks.cam.ac.uk
DESCRIPTION:Diffusion-based generative models are a recent class of genera
 tive models showing state-of-the-art performance in many data generatio
 n tasks. These models use a forward process to progressively corrupt sam
 ples from a target data distribution with noise and then learn to revers
 e this process to generate new samples. In this talk\, we provide full t
 heoretical guarantees for the convergence behaviour of such models. We d
 emonstrate the power of our approach via a motivating example\, sampling f
 rom a Gaussian distribution with unknown mean. In this case\, explicit es
 timates are provided for the associated optimization problem\, i.e. scor
 e approximation\, and these are combined with the corresponding sampling e
 stimates. As a result\, we obtain the best known estimates for the Wasse
 rstein distance of order two between the data distribution and our sampl
 ing algorithm. Beyond the motivating example\, we present our results fo
 r sampling from strongly log-concave distributions using an $L^2$-accura
 te score estimation assumption\, formed under an expectation with respec
 t to a stochastic optimizer and our novel auxiliary process that uses on
 ly known information.
LOCATION:External
END:VEVENT
END:VCALENDAR
