
Learning with nonparametric dependence and divergence estimation


If you have a question about this talk, please contact Zoubin Ghahramani.

Estimation of dependence and divergence is among the fundamental problems of statistics and machine learning. While information theory provides standard measures for these quantities (e.g. Shannon mutual information, Kullback-Leibler divergence), it is still unknown how to estimate them most efficiently. One could plug in density estimators, but in high-dimensional domains these are known to suffer from the curse of dimensionality. It is therefore of great importance to know which functionals of densities can be estimated efficiently in a direct way, without first estimating the densities themselves. Using tools from Euclidean random graph optimization, copula transformations, and reproducing kernel Hilbert spaces, we will discuss consistent dependence and divergence estimators that avoid density estimation. These estimators allow us to generalize classification, regression, anomaly detection, low-dimensional embedding, and other machine learning algorithms to the space of sets and distributions. We demonstrate the power of our methods by beating the best published results on several computer vision and independent component analysis benchmarks. We also show how our perspective on learning from distributions enables new analyses in astronomy and in fluid dynamics simulations.
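To illustrate the idea of estimating a divergence directly from samples without density estimation, here is a minimal sketch of a k-nearest-neighbour estimator of the Kullback-Leibler divergence KL(P||Q), in the style of the Wang-Kulkarni-Verdú estimator. This is an assumption about the general family of direct estimators the talk refers to, not the speaker's specific method; the function name and choice of `scipy.spatial.cKDTree` are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """k-NN estimate of KL(P||Q) from samples x ~ P (n, d) and y ~ Q (m, d).

    Uses only nearest-neighbour distances, so no density estimate is
    ever formed: rho is the distance from each x_i to its k-th nearest
    neighbour among the other x's, nu the distance to its k-th nearest
    neighbour among the y's.
    """
    n, d = x.shape
    m, _ = y.shape
    xtree = cKDTree(x)
    ytree = cKDTree(y)
    # query k+1 neighbours within x so we can skip x_i itself
    rho = xtree.query(x, k=k + 1)[0][:, -1]
    nu = ytree.query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

For two unit-variance Gaussians whose means differ by 1 in one dimension, the true KL divergence is 0.5, and with a few thousand samples the estimate lands close to that value while never touching a density estimator.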

This talk is part of the Machine Learning @ CUED series.



© 2006-2020 Talks.cam, University of Cambridge.