
Divergence measures and message passing - CANCELLED


If you have a question about this talk, please contact Shakir Mohamed.

Cancelled

Unfortunately I’ve picked up a horrible fluey thing, so I am cancelling. Apologies to all.

This paper by Tom Minka presents a unifying view of message-passing algorithms, as methods to approximate a complex Bayesian network by a simpler network with minimum information divergence. In this view, the difference between mean-field methods and belief propagation is not the amount of structure they model, but only the measure of loss they minimize: ‘exclusive’ versus ‘inclusive’ Kullback-Leibler divergence. Both these divergence measures can be viewed as examples of alpha-divergence for specific values of alpha. In each case, message-passing arises by minimizing a localized version of the divergence, local to each factor. By examining these divergence measures, we can intuit the types of solution they prefer (symmetry-breaking, for example) and their suitability for different tasks. Furthermore, by considering a wider variety of divergence measures (such as alpha-divergences), we can achieve different complexity and performance goals.

ftp://ftp.research.microsoft.com/pub/tr/TR-2005-173.pdf
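For reference, a minimal sketch of the alpha-divergence family mentioned in the abstract, in the parametrisation I recall the technical report using (treat the exact constants as an assumption and check the paper for the precise conventions):

  D_\alpha(p \,\|\, q) = \frac{\int_x \big( \alpha\, p(x) + (1-\alpha)\, q(x) - p(x)^{\alpha} q(x)^{1-\alpha} \big)\, dx}{\alpha(1-\alpha)}

The two KL divergences arise as limiting cases:

  \lim_{\alpha \to 0} D_\alpha(p \,\|\, q) = \mathrm{KL}(q \,\|\, p)   (the ‘exclusive’ KL minimised by mean-field methods)
  \lim_{\alpha \to 1} D_\alpha(p \,\|\, q) = \mathrm{KL}(p \,\|\, q)   (the ‘inclusive’ KL minimised by belief propagation)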

This talk is part of the Machine Learning Reading Group @ CUED series.
