
Bregman Divergences and Machine Learning


If you have a question about this talk, please contact Shakir Mohamed.

In this RCC session we will look at Bregman divergences. I will define the Bregman divergence, describe its properties, and examine its relationship with exponential-family distributions. I will also look at algorithms that use Bregman divergences for clustering and other applications.
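As a concrete illustration of the definition (a minimal sketch, not from the talk itself): the Bregman divergence generated by a strictly convex function φ is D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩. Choosing φ(x) = ‖x‖² recovers the squared Euclidean distance, and choosing the negative entropy φ(x) = Σᵢ xᵢ log xᵢ recovers the KL divergence (for points on the probability simplex).

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Generator phi(x) = ||x||^2 yields the squared Euclidean distance.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2.0 * x

# Generator phi(x) = sum_i x_i log x_i (negative entropy) yields the KL
# divergence when x and y both lie on the probability simplex.
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.1, 0.4, 0.5])

d_euclid = bregman(sq_norm, sq_norm_grad, x, y)      # equals ||x - y||^2
d_kl = bregman(neg_entropy, neg_entropy_grad, x, y)  # equals KL(x || y) here
```

Note that a Bregman divergence is generally asymmetric and does not satisfy the triangle inequality, so it is a divergence rather than a metric.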

The discussion is mainly based on the paper: “Clustering with Bregman Divergences”, JMLR 6, 2005. http://jmlr.csail.mit.edu/papers/volume6/banerjee05b/banerjee05b.pdf

Other related papers that I will reference include: “Matrix Nearness Problems with Bregman Divergences” http://authors.library.caltech.edu/9428/

“Learning Continuous Latent Variable Models with Bregman Divergences” http://www.springerlink.com/index/EHVDW71HTL8E05B3.pdf

“A Generalization of Principal Components to the Exponential Family” http://books.nips.cc/papers/files/nips14/AA27.pdf
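The hard-clustering algorithm from the Banerjee et al. paper above can be sketched as follows (an illustrative sketch, with hypothetical function and variable names): its key observation is that for any Bregman divergence, the cluster representative minimizing the expected divergence is the plain arithmetic mean, so Lloyd-style k-means carries over unchanged apart from the assignment step.

```python
import numpy as np

def bregman_kmeans(X, init_centroids, divergence, n_iter=20):
    """Hard Bregman clustering: assign each point to the nearest centroid
    under the given Bregman divergence; the centroid update is always the
    arithmetic mean, whichever divergence is used (Banerjee et al., 2005)."""
    centroids = np.array(init_centroids, dtype=float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: nearest centroid under the chosen divergence.
        d = np.array([[divergence(x, c) for c in centroids] for x in X])
        labels = d.argmin(axis=1)
        # Update step: the mean minimizes the expected Bregman divergence.
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# With phi(x) = ||x||^2 the divergence is the squared Euclidean distance,
# so this reduces to ordinary k-means.
sq_dist = lambda x, y: np.sum((x - y) ** 2)

# Toy data: two well-separated blobs, initialized with one point from each.
X = np.vstack([np.zeros((5, 2)), 10.0 * np.ones((5, 2))])
labels, centroids = bregman_kmeans(X, init_centroids=X[[0, -1]],
                                   divergence=sq_dist)
```

Swapping `sq_dist` for another Bregman divergence (e.g. the KL divergence on the simplex) changes only the assignment step, which is the point of the paper.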

This talk is part of the Machine Learning Reading Group @ CUED series.


© 2006-2024 Talks.cam, University of Cambridge.