

## Sampling with kernelized Wasserstein gradient flows

- Anna Korba (ENSAE (École Nationale de la Statistique et de l'Administration))
- Friday 29 April 2022, 09:00-10:00
- Seminar Room 1, Newton Institute.
FKTW03 - Frontiers in kinetic equations for plasmas and collective behaviour

Sampling from a probability distribution whose density is known only up to a normalisation constant is a fundamental problem in statistics and machine learning. Recently, several algorithms based on interacting particle systems have been proposed for this task as alternatives to Markov chain Monte Carlo methods or variational inference. These particle systems can be designed by adopting an optimisation point of view on the sampling problem: an optimisation objective is chosen (typically one measuring the dissimilarity to the target distribution), and its Wasserstein gradient flow is approximated by an interacting particle system. The stationary states of these particle systems define an empirical measure approximating the target distribution. In this talk I will present recent work on such algorithms, in particular Stein Variational Gradient Descent [1] and Kernel Stein Discrepancy Descent [2], two algorithms based on Wasserstein gradient flows and reproducing kernels. I will discuss recent results showing that these particle systems can provide a good approximation of the target distribution, as well as current issues and open questions on both the empirical and theoretical side.

[1] A Non-Asymptotic Analysis for Stein Variational Gradient Descent. Korba, A., Salim, A., Arbel, M., Luise, G., Gretton, A. Neural Information Processing Systems (NeurIPS), 2020.

[2] Kernel Stein Discrepancy Descent. Korba, A., Aubin-Frankowski, P.C., Majewski, S., Ablin, P. International Conference on Machine Learning (ICML), 2021.

This talk is part of the Isaac Newton Institute Seminar Series.
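To make the "interacting particle system" idea concrete, the following is a minimal sketch of the Stein Variational Gradient Descent update with an RBF kernel. This is an illustration only, not the implementation from the cited papers: the standard-Gaussian target, the fixed kernel bandwidth `h`, the step size, and the particle count are all arbitrary choices made for the example.

```python
import numpy as np

def rbf_kernel(x, h):
    """Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / (2h)) for particles x (n, d)."""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * h))

def svgd_step(x, grad_log_p, h=1.0, step=0.05):
    """One SVGD update: each particle moves along a kernel-weighted average of
    the target's score (attraction) plus a kernel-gradient term (repulsion)."""
    n = x.shape[0]
    K = rbf_kernel(x, h)                 # symmetric, so K[j, i] = K[i, j]
    grads = grad_log_p(x)                # score evaluated at every particle, (n, d)
    attract = K @ grads                  # sum_j k(x_j, x_i) * grad log p(x_j)
    # sum_j grad_{x_j} k(x_j, x_i) = sum_j (x_i - x_j) / h * k(x_j, x_i)
    repulse = (K.sum(axis=1)[:, None] * x - K @ x) / h
    return x + step * (attract + repulse) / n

# Toy run: target N(0, 1) in 1D, so grad log p(x) = -x.
# Particles start concentrated far from the target and should spread out around 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=0.5, size=(200, 1))
for _ in range(1000):
    x = svgd_step(x, lambda z: -z)
```

After the loop, the empirical mean and variance of the particles should be close to the target's 0 and 1, which is exactly the sense in which the stationary empirical measure approximates the target distribution.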