BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Rates of convergence to stationarity with multipli
cative noise: from stochastic reflection to denois
ing diffusions in generative modelling. - Miha Bre
sar (University of Warwick)
DTSTART;TZID=Europe/London:20240808T143000
DTEND;TZID=Europe/London:20240808T150000
UID:TALK219679AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/219679
DESCRIPTION:We characterise the rate of convergence of three c
lasses of ergodic diffusions with multiplicative n
oise: (1) Reflected processes in generalised parab
olic domains\; (2) tempered Langevin diffusions\;
(3) forward processes in denoising diffusions of g
enerative modelling.\nIn this talk we show that th
ese disparate-looking models share the same underl
ying structure\, which determines their respective
rates of convergence to stationarity.\n(1) Varyin
g multiplicative noise in reflecting diffusions le
ads to wildly different convergence rates to stati
onarity.\n(2) Tempered Langevin diffusions with fi
xed invariant measure exhibit significant improvem
ents in the rates of convergence to stationarity w
ith the addition of unbounded multiplicative noise
.\n(3) We describe denoising diffusion probabilist
ic models (DDPMs) representing a recent advancemen
t in generative AI\, used in platforms such as Cha
tGPT. DDPMs are contingent on the convergence to s
tationarity of an underlying forward diffusion\, i
nitialised at a high-dimensional multi-modal data
distribution. We establish a cut-off phenomenon fo
r the convergence of the Ornstein-Uhlenbeck forwar
d process used in the practical applications of DD
PMs and compare it with a large class of diffusion
s with multiplicative noise. Our results prove tha
t in this case\, unlike (1) and (2)\, the OU proce
ss (with additive noise) is hard to beat.
LOCATION:External
CONTACT:
END:VEVENT
END:VCALENDAR