BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Rates of convergence to stationarity with multiplicative noise: fr
 om stochastic reflection to denoising diffusions in generative modelling. 
 - Miha Bresar (University of Warwick)
DTSTART:20240808T133000Z
DTEND:20240808T140000Z
UID:TALK219679@talks.cam.ac.uk
DESCRIPTION:We characterise the rate of convergence of three classes of er
 godic diffusions with multiplicative noise: (1) Reflected processes in gen
 eralised parabolic domains\; (2) tempered Langevin diffusions\; (3) forwar
 d processes in denoising diffusions of generative modelling.\nIn this talk
 we show that these disparate-looking models share the same underlying str
 ucture\, which determines their respective rates of convergence to station
 arity.\n(1) Varying multiplicative noise in reflecting diffusions leads to
  wildly different convergence rates to stationarity.\n(2) Tempered Langevi
 n diffusions with fixed invariant measure exhibit significant improvements
  in the rates of convergence to stationarity with the addition of unbounde
 d multiplicative noise.\n(3) We describe denoising diffusion probabilistic
  models (DDPMs) representing a recent advancement in generative AI\, used 
 in platforms such as ChatGPT. DDPMs are contingent on the convergence to s
 tationarity of an underlying forward diffusion\, initialised at a high-di
 mensional multi-modal data distribution. We establish a cut-off phenomenon 
 for the convergence of the Ornstein-Uhlenbeck forward process used in the 
 practical applications of DDPMs and compare it with a large class of diffu
 sions with multiplicative noise. Our results prove that in this case\, unl
 ike (1) and (2)\, the OU process (with additive noise) is hard to beat.
LOCATION:External
END:VEVENT
END:VCALENDAR
