Entropy contraction of the Gibbs sampler under log-concavity
If you have a question about this talk, please contact Qingyuan Zhao.

In this talk I will present recent work (https://arxiv.org/abs/2410.00858) on the non-asymptotic analysis of the Gibbs sampler, a canonical and popular Markov chain Monte Carlo algorithm for sampling. In particular, under the assumption that the probability measure π of interest is strongly log-concave, we show that the random scan Gibbs sampler contracts in relative entropy, and we provide a sharp characterization of the associated contraction rate. The result implies that, under appropriate conditions, the number of full evaluations of π required for the Gibbs sampler to converge is independent of the dimension. If time permits, I will also discuss connections and applications of the above results to the problem of zero-order parallel sampling. Based on joint work with Filippo Ascolani and Hugo Lavenant.

This talk is part of the Statistics series.
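To make the object of study concrete, here is a minimal sketch of a random scan Gibbs sampler on a simple strongly log-concave target: a bivariate Gaussian with correlation rho, whose coordinate-wise conditionals are available in closed form. The target, function name, and parameters are illustrative choices, not taken from the paper; the sketch only shows the generic random-scan structure (pick a coordinate uniformly at random, resample it from its exact conditional).

```python
import numpy as np

def random_scan_gibbs(rho, n_steps, seed=0):
    """Random-scan Gibbs sampler for a bivariate Gaussian target with
    zero means, unit variances, and correlation rho (strongly log-concave
    for |rho| < 1). Each conditional is Gaussian:
        x_i | x_j  ~  N(rho * x_j, 1 - rho**2).
    Illustrative sketch only; names and target are assumptions."""
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    samples = np.empty((n_steps, 2))
    cond_sd = np.sqrt(1.0 - rho ** 2)
    for t in range(n_steps):
        i = rng.integers(2)   # choose a coordinate uniformly at random
        j = 1 - i
        # resample coordinate i from its exact conditional given x_j
        x[i] = rho * x[j] + cond_sd * rng.standard_normal()
        samples[t] = x.copy()
    return samples

chain = random_scan_gibbs(rho=0.5, n_steps=50_000, seed=1)
```

After discarding an initial burn-in, the empirical correlation of the chain should be close to the target's rho, illustrating the convergence whose rate the talk quantifies in relative entropy.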