
Entropy contraction of the Gibbs sampler under log-concavity


If you have a question about this talk, please contact Qingyuan Zhao.

In this talk I will present recent work (https://arxiv.org/abs/2410.00858) on the non-asymptotic analysis of the Gibbs sampler, which is a canonical and popular Markov chain Monte Carlo algorithm for sampling. In particular, under the assumption that the probability measure π of interest is strongly log-concave, we show that the random scan Gibbs sampler contracts in relative entropy and provide a sharp characterization of the associated contraction rate. The result implies that, under appropriate conditions, the number of full evaluations of π required for the Gibbs sampler to converge is independent of the dimension. If time permits, I will also discuss connections and applications of the above results to the problem of zero-order parallel sampling.
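To make the algorithm in the abstract concrete, here is a minimal, illustrative sketch (not code from the paper) of a random scan Gibbs sampler on a simple strongly log-concave target: a bivariate Gaussian with correlation `rho`, whose coordinate-wise conditionals are available in closed form. The function name, initialization, and step counts are all hypothetical choices for the demonstration.

```python
import math
import random

def random_scan_gibbs(rho, n_steps, seed=0):
    """Random scan Gibbs sampler for a bivariate N(0, Sigma) target,
    Sigma = [[1, rho], [rho, 1]], a strongly log-concave measure.
    Each conditional is Gaussian: x_i | x_j ~ N(rho * x_j, 1 - rho^2)."""
    rng = random.Random(seed)
    x = [3.0, -3.0]                    # start deliberately far from the mode
    sd = math.sqrt(1.0 - rho * rho)    # conditional standard deviation
    samples = []
    for _ in range(n_steps):
        i = rng.randrange(2)           # random scan: pick a coordinate uniformly
        j = 1 - i
        # Exact draw from the full conditional of coordinate i given the rest
        x[i] = rho * x[j] + sd * rng.gauss(0.0, 1.0)
        samples.append(tuple(x))
    return samples

rho = 0.5
samples = random_scan_gibbs(rho, 200_000)
burn = samples[50_000:]                # discard burn-in
m0 = sum(s[0] for s in burn) / len(burn)
c01 = sum(s[0] * s[1] for s in burn) / len(burn)
print(round(m0, 2), round(c01, 2))     # mean ≈ 0, covariance ≈ rho
```

Note that each step updates a single coordinate by sampling its exact conditional given the others; the talk's result concerns how fast the law of such a chain contracts toward the target in relative entropy when the target is strongly log-concave.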

Based on joint work with Filippo Ascolani and Hugo Lavenant.

This talk is part of the Statistics series.


© 2006-2024 Talks.cam, University of Cambridge.