
How close are these distributions? A brief introduction to statistical distances and divergences.

If you have a question about this talk, please contact Elre Oldewage.

The question of whether two probability distributions are ‘close’ to each other arises in many contexts in statistics and machine learning, including hypothesis testing, approximate Bayesian inference and generative modelling. However, what it means for two probability distributions to be ‘close’ is somewhat subtle. In this reading group, we will give an overview of some of the different ways of measuring distance between probability distributions, as well as the relationships between these notions of distance.
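
To make the subtlety concrete, here is a minimal numerical sketch (not part of the talk; it assumes NumPy and two hand-picked discrete distributions) comparing the same pair of distributions under two different notions of distance:

    import numpy as np

    # Two discrete distributions on the same three-point support.
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])

    # Kullback-Leibler divergence KL(p || q) = sum_i p_i * log(p_i / q_i).
    # Not a metric: asymmetric, and it fails the triangle inequality.
    kl_pq = np.sum(p * np.log(p / q))  # ~0.0253
    kl_qp = np.sum(q * np.log(q / p))  # ~0.0258 (differs from KL(p || q))

    # Total variation distance TV(p, q) = (1/2) * sum_i |p_i - q_i|.
    # A genuine metric, bounded in [0, 1].
    tv = 0.5 * np.sum(np.abs(p - q))   # 0.1

    print(kl_pq, kl_qp, tv)

The two divergences assign different numbers to the same pair, and KL is not even symmetric; which notion is appropriate depends on the application.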

Required Reading: None.

References and Further (Optional) Reading:

  1. On Integral Probability Metrics, φ-Divergences and Binary Classification. Bharath K. Sriperumbudur, Kenji Fukumizu, Arthur Gretton, Bernhard Schölkopf and Gert R. G. Lanckriet. 2009.
  2. A Kernel Two-Sample Test. Arthur Gretton, Karsten M. Borgwardt, Malte J. Rasch, Bernhard Schölkopf and Alexander Smola. 2012. (A short numerical sketch of the MMD follows this list.)
  3. Lecture Notes on Information Theory. Yury Polyanskiy and Yihong Wu. http://www.stat.yale.edu/~yw562/teaching/itlectures.pdf. (Particularly Chapters 6 and 7, on f-divergences.)
  4. Optimal Transport: Old and New. Cédric Villani. 2009. Chapter 6.
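
As a taster for reference 2, the following is a minimal sketch (assumptions: NumPy, a Gaussian RBF kernel with a hand-picked bandwidth, and the simple biased V-statistic estimator rather than the paper's unbiased one) of estimating the squared maximum mean discrepancy (MMD) between two samples:

    import numpy as np

    def mmd2_biased(x, y, sigma=1.0):
        # Biased (V-statistic) estimate of squared MMD with an RBF kernel;
        # an illustrative sketch, not the paper's reference implementation.
        def rbf(a, b):
            # Pairwise squared Euclidean distances, then the Gaussian kernel.
            d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
            return np.exp(-d2 / (2.0 * sigma**2))
        return rbf(x, x).mean() + rbf(y, y).mean() - 2.0 * rbf(x, y).mean()

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(200, 1))  # sample from P
    y = rng.normal(0.5, 1.0, size=(200, 1))  # sample from Q
    print(mmd2_biased(x, y))  # grows as P and Q move apart

The kernel two-sample test of reference 2 thresholds an estimate like this to decide whether the two samples came from the same distribution.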

This talk is part of the Machine Learning Reading Group @ CUED series.
