
On Independent Samples along the Langevin Dynamics and Algorithm


If you have a question about this talk, please contact Dr Sergio Bacallado.

Sampling from a probability distribution is a fundamental algorithmic task, and one way to perform it is to run a Markov chain. The mixing time of a Markov chain characterizes how long we must run the chain until its law converges to the stationary distribution. In this talk, we discuss the “independence time”: how long we must run a Markov chain until the initial and final random variables are approximately independent, in the sense that they have small mutual information. We study this question for two natural Markov chains: the Langevin dynamics in continuous time and the Unadjusted Langevin Algorithm in discrete time. When the target distribution is strongly log-concave, we prove that the mutual information between the initial and final random variables decays exponentially fast along both Markov chains. These convergence rates are tight, and they yield an estimate of the independence time comparable to the mixing-time guarantees for these Markov chains. Our proofs use the strong data processing inequality and the regularity properties of the Langevin dynamics. Based on joint work with Jiaming Liang and Siddharth Mitra.
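To make the setting concrete, here is a minimal sketch (not from the talk) of the Unadjusted Langevin Algorithm, x_{k+1} = x_k - h ∇f(x_k) + √(2h) ξ_k, run on a strongly log-concave target. The target f(x) = x²/2 (a standard Gaussian), the step size h, and the iteration count are all illustrative assumptions. With many chains started from a random initial point, the empirical correlation between the initial and final samples serves as a crude proxy for the decay of dependence that the mutual-information bounds quantify.

```python
import numpy as np

# Illustrative ULA sketch: target pi(x) ∝ exp(-f(x)) with f(x) = x^2 / 2,
# i.e. a standard Gaussian, which is 1-strongly log-concave.
rng = np.random.default_rng(0)

def grad_f(x):
    # Gradient of the potential f(x) = x^2 / 2.
    return x

h = 0.01        # step size (illustrative)
n_steps = 5000  # number of ULA iterations (illustrative)

# 10,000 independent chains started from a random point x0 ~ N(5, 1).
x0 = 5.0 + rng.standard_normal(10_000)
x = x0.copy()

for _ in range(n_steps):
    # ULA update: x <- x - h * grad f(x) + sqrt(2h) * Gaussian noise.
    x = x - h * grad_f(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

# After many steps the samples are close to N(0, 1) (up to O(h) bias),
# and the correlation with the initial point has decayed to near zero.
corr = np.corrcoef(x0, x)[0, 1]
print(np.mean(x), np.var(x), corr)
```

For jointly Gaussian variables the mutual information is a monotone function of the correlation, so for this particular target the vanishing correlation directly reflects the exponential decay of mutual information discussed in the abstract.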

This talk is part of the Statistics series.

