
Kernel Mean Embeddings


If you have a question about this talk, please contact jg801.

Kernel mean embeddings are a technique for representing probability distributions as elements of a high-dimensional Hilbert space. This allows otherwise difficult operations, such as taking expectations, to be reformulated as inner products in that space. This talk introduces reproducing kernel Hilbert spaces (RKHSs), the theoretical underpinning on which kernel mean embeddings are constructed, and aims to provide intuition about the relationship between kernels, feature maps, and RKHS function spaces. The talk continues with a discussion of kernel mean embeddings and their common applications, in particular kernel two-sample hypothesis testing. We also discuss embedding multivariable conditional distributions, which enables the application of the kernel Bayes' rule.
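The kernel two-sample test mentioned above is based on the maximum mean discrepancy (MMD): the RKHS distance between the mean embeddings of two distributions, estimated from samples purely via kernel evaluations. The following is a minimal sketch (assuming an RBF kernel with a fixed, hand-picked bandwidth, and the simple biased estimator; a practical test would also calibrate a rejection threshold, e.g. by permutation):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gram matrix of k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    sq_dists = ((X**2).sum(1)[:, None]
                + (Y**2).sum(1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    # Biased estimate of ||mu_P - mu_Q||^2 in the RKHS:
    #   E k(x, x') + E k(y, y') - 2 E k(x, y),
    # with expectations replaced by sample means.
    return (rbf_kernel(X, X, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean()
            - 2.0 * rbf_kernel(X, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))  # samples from P
Y = rng.normal(0.5, 1.0, size=(200, 2))  # samples from Q (shifted mean)
Z = rng.normal(0.0, 1.0, size=(200, 2))  # fresh samples from P

print(mmd2_biased(X, Y))  # comparatively large: distributions differ
print(mmd2_biased(X, Z))  # near zero: same distribution
```

Because the biased estimator is a squared RKHS norm of an empirical mean difference, it is always non-negative, and it is larger when the two samples come from different distributions.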

This talk is part of the Machine Learning Reading Group @ CUED series.



 
