Machine Learning Reading Group @ CUED, University of Cambridge

Bregman Divergences and Machine Learning


If you have a question about this talk, please contact Shakir Mohamed.

In this RCC we will look at Bregman divergences. I will describe the Bregman divergence and its properties, and examine its relationship with exponential family distributions. I will also look at algorithms that use Bregman divergences for clustering and other applications.
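
As a warm-up for the talk, the Bregman divergence generated by a strictly convex, differentiable function φ is D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩. A minimal sketch in plain Python (function names are mine, for illustration only) shows how familiar distances fall out as special cases:

```python
import math

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>,
    for a strictly convex, differentiable phi."""
    inner = sum(g * (xi - yi) for g, xi, yi in zip(grad_phi(y), x, y))
    return phi(x) - phi(y) - inner

# phi(x) = ||x||^2 generates the squared Euclidean distance.
def sq_norm(v):
    return sum(vi * vi for vi in v)

def grad_sq_norm(v):
    return [2.0 * vi for vi in v]

# phi(x) = sum_i x_i log x_i (negative entropy) generates the KL
# divergence between probability vectors.
def neg_entropy(v):
    return sum(vi * math.log(vi) for vi in v)

def grad_neg_entropy(v):
    return [math.log(vi) + 1.0 for vi in v]

x = [0.2, 0.3, 0.5]
y = [0.4, 0.4, 0.2]

d_euclid = bregman_divergence(sq_norm, grad_sq_norm, x, y)      # equals ||x - y||^2
d_kl = bregman_divergence(neg_entropy, grad_neg_entropy, x, y)  # equals KL(x || y)
```

Note that D_φ is nonnegative and zero iff x = y, but in general it is not symmetric and does not satisfy the triangle inequality, so it is a divergence rather than a metric.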

The discussion is mainly based on the paper: “Clustering with Bregman Divergences” (Banerjee, Merugu, Dhillon & Ghosh), JMLR 6, 2005.
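
One striking result of that paper is that in Bregman hard clustering the optimal cluster representative is the arithmetic mean of the cluster, regardless of which Bregman divergence is used. A Lloyd-style sketch (illustrative only; the names are mine, not the paper's):

```python
import math

def kl(x, y):
    # Bregman divergence generated by negative entropy: KL(x || y).
    return sum(a * math.log(a / b) for a, b in zip(x, y))

def bregman_kmeans(points, centroids, divergence, iters=20):
    """Hard clustering with an arbitrary Bregman divergence.
    The mean update is optimal for any such divergence
    (Banerjee et al., JMLR 2005)."""
    for _ in range(iters):
        # Assignment step: nearest centroid under the given divergence.
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda k: divergence(p, centroids[k]))
            clusters[j].append(p)
        # Update step: arithmetic mean, whatever the divergence is.
        centroids = [
            [sum(c) / len(cl) for c in zip(*cl)] if cl else mu
            for cl, mu in zip(clusters, centroids)
        ]
    return centroids, clusters

# Toy usage: cluster probability vectors under KL divergence.
pts = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
cents, cls = bregman_kmeans(pts, [[0.6, 0.4], [0.4, 0.6]], kl, iters=5)
```

With squared Euclidean distance as the divergence this reduces to standard k-means; the same skeleton covers, e.g., KL or Itakura-Saito clustering just by swapping the divergence.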

Other related papers that I will reference include: “Matrix Nearness Problems with Bregman Divergences”

“Learning Continuous Latent Variable Models with Bregman Divergences”

“A Generalization of Principal Components to the Exponential Family”

This talk is part of the Machine Learning Reading Group @ CUED series.



© 2006-2023, University of Cambridge.