On KL divergence and beyond
- Speaker: Yingzhen Li, Microsoft Research Cambridge (Website)
- Date & Time: Thursday 01 November 2018, 11:00-12:00
- Venue: Boardroom, Faculty of English, West Road
Abstract
Many machine learning tasks require fitting a model to observed data, which is mostly done via divergence minimisation. In this talk I will start from the basics and discuss the celebrated Kullback-Leibler (KL) divergence and its applications in machine learning. Then I will discuss potential issues with the KL divergence and motivate other divergence measures. I will show how the f-divergence, a rich family of divergences that includes KL, is applied to machine learning tasks, in particular to approximate inference. If time permits, I will briefly touch on divergences/discrepancies that are not density-ratio based, and discuss relevant applications.
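For quick reference (using one common convention, not spelled out in the abstract itself), the KL divergence and the f-divergence family it belongs to are standardly defined as:

```latex
% KL divergence between densities p and q
\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx

% f-divergence, for convex f with f(1) = 0;
% choosing f(t) = t \log t recovers the KL divergence
D_f(p \,\|\, q) = \int q(x) \, f\!\left(\frac{p(x)}{q(x)}\right) dx
```

Minimising the KL divergence from the data distribution to the model over the model parameters is equivalent to maximum-likelihood fitting, which is the usual sense of "fitting a model via divergence minimisation".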
Series: This talk is part of the Language Technology Lab Seminars series.
Included in Lists
- bld31
- Boardroom, Faculty of English, West Road
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- Guy Emerson's list
- Interested Talks
- Language Sciences for Graduate Students
- Language Technology Lab Seminars
- ndk22's list
- ob366-ai4er
- rp587
- Simon Baker's List
- Trust & Technology Initiative - interesting events
- yk449