On KL divergence and beyond

If you have a question about this talk, please contact Edoardo Maria Ponti.

Abstract: Many machine learning tasks require fitting a model to observed data, which is most often done via divergence minimisation. In this talk I will start from the basics and discuss the celebrated Kullback-Leibler (KL) divergence and its applications in machine learning. I will then discuss potential issues with the KL divergence and motivate other divergence measures. I will show how the f-divergence, a rich family of divergences that includes KL, is applied to machine learning tasks, in particular approximate inference. If time permits, I will briefly touch on divergences/discrepancies that are not based on density ratios, and discuss relevant applications.
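
For reference, the two quantities the abstract refers to have the following standard definitions (added here for context; not taken from the speaker's slides):

$$
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
\qquad
D_f(p \,\|\, q) = \int q(x) \, f\!\left(\frac{p(x)}{q(x)}\right) dx,
$$

where $f$ is convex with $f(1) = 0$. The KL divergence is the member of the f-divergence family obtained by choosing $f(t) = t \log t$; both depend on the data only through the density ratio $p(x)/q(x)$, which is the property the final part of the talk moves beyond.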

This talk is part of the Language Technology Lab Seminars series.
