BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:On KL divergence and beyond - Yingzhen Li\, Microsoft Research Cam
 bridge
DTSTART:20181101T110000Z
DTEND:20181101T120000Z
UID:TALK114088@talks.cam.ac.uk
CONTACT:Edoardo Maria Ponti
DESCRIPTION:*Abstract:* Many machine learning tasks require fitting a mode
 l to observed data\, which is mostly done via divergence minimisation. In 
 this talk I will start from the basics and discuss the celebrated Kullback
 -Leibler (KL) divergence and its applications in machine learning. Then I 
 will discuss potential issues of KL divergence and motivate other diverge
 nce measures. I will show how f-divergence\, a rich family of divergences 
 that includes KL\, is applied to machine learning tasks\, in particular fo
 r approximate inference. If time permits\, I will briefly touch on diverge
 nces/discrepancies that are not density-ratio based\, and discuss releva
 nt applications.
LOCATION:Boardroom\, Faculty of English\, West Road
END:VEVENT
END:VCALENDAR
