On KL divergence and beyond
If you have a question about this talk, please contact Edoardo Maria Ponti.
Abstract: Many machine learning tasks require fitting a model to observed data, which is mostly done via divergence minimisation. In this talk I will start from the basics and discuss the celebrated Kullback-Leibler (KL) divergence and its applications in machine learning. Then I will discuss potential issues of the KL divergence and motivate other divergence measures. I will show how the f-divergence, a rich family of divergences that includes KL, is applied to machine learning tasks, in particular approximate inference. If time permits, I will briefly touch on divergences/discrepancies that are not density-ratio based, and discuss relevant applications.
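As background for the abstract above, a minimal sketch of the standard definitions it refers to (not drawn from the talk itself): for densities p and q, the KL divergence and the broader f-divergence family, with f convex and f(1) = 0, are commonly written as

\[
\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
\qquad
D_f(p \,\|\, q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,
\]

where the KL divergence is recovered by the choice \( f(t) = t \log t \). Both are defined via the density ratio \( p(x)/q(x) \), which is the property the final part of the abstract contrasts with non-density-ratio-based discrepancies.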
This talk is part of the Language Technology Lab Seminars series.