Towards a non-asymptotic understanding of diffusion-based generative models

  • Speaker: Yuting Wei (University of Pennsylvania)
  • Date: Thursday 04 July 2024, 10:30-12:00
  • Venue: External

DML - Diffusions in machine learning: Foundations, generative models and non-convex optimisation

Diffusion models, which convert noise into new data instances by learning to reverse a Markov diffusion process, have become a cornerstone of contemporary generative modeling. While their practical power is now widely recognized, the theoretical underpinnings remain far from mature. In this talk, I will introduce a suite of non-asymptotic theory towards understanding the data generation process of diffusion models in discrete time, assuming access to reliable estimates of the (Stein) score functions. For a popular deterministic sampler (based on the probability flow ODE), we establish a convergence rate proportional to $1/T$ (with $T$ the total number of steps), improving upon past results; for another mainstream stochastic sampler (i.e., a type of denoising diffusion probabilistic model (DDPM)), we derive a convergence rate proportional to $1/\sqrt{T}$, matching the state-of-the-art theory. We will also discuss novel training-free algorithms to accelerate these samplers. We design two accelerated variants, improving the convergence rates to $1/T^2$ for the ODE-based sampler and $1/T$ for the DDPM-type sampler, which might be of independent theoretical and empirical interest.
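
For context, here is a sketch of a standard (variance-preserving) continuous-time setup used in this literature, not necessarily the exact parametrization analyzed in the talk: the forward process gradually turns data into noise, and samples are generated by reversing it either stochastically or deterministically, with both reversals driven by the score function.

$$\mathrm{d}X_t = -\tfrac{1}{2}\beta(t)\,X_t\,\mathrm{d}t + \sqrt{\beta(t)}\,\mathrm{d}W_t, \qquad t \in [0,1] \quad \text{(forward noising process)}$$

$$\mathrm{d}Y_t = \tfrac{1}{2}\beta(1-t)\bigl[\,Y_t + 2\,\nabla\log p_{1-t}(Y_t)\,\bigr]\,\mathrm{d}t + \sqrt{\beta(1-t)}\,\mathrm{d}\bar{W}_t \quad \text{(stochastic reversal; DDPM-type)}$$

$$\frac{\mathrm{d}Y_t}{\mathrm{d}t} = \tfrac{1}{2}\beta(1-t)\bigl[\,Y_t + \nabla\log p_{1-t}(Y_t)\,\bigr] \quad \text{(deterministic reversal; probability flow ODE)}$$

Here $p_t$ denotes the marginal distribution of $X_t$, $\nabla\log p_t$ its (Stein) score, and $\beta(t)>0$ a noise schedule. The discrete-time samplers discussed in the talk can be viewed as $T$-step discretizations of these two reversals run with estimated scores; the quoted rates bound how fast the distribution of the final iterate approaches the data distribution as $T$ grows.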

This talk is part of the Isaac Newton Institute Seminar Series.
