BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Language Technology Lab Seminars
SUMMARY:On KL divergence and beyond - Yingzhen Li\, Microsoft
  Research Cambridge
DTSTART;TZID=Europe/London:20181101T110000
DTEND;TZID=Europe/London:20181101T120000
UID:TALK114088@talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/114088
DESCRIPTION:*Abstract:* Many machine learning tasks require fi
 tting a model to observed data\, which is mostly d
 one via divergence minimisation. In this talk I wi
 ll start from the basics and discuss the celebrate
 d Kullback-Leibler (KL) divergence and its applica
 tions in machine learning. Then I will discuss pot
 ential issues of KL divergence and motivate other
  divergence measures. I will show how f-divergences
 \, a rich family of divergences that includes KL\,
  are applied to machine learning tasks\, in particu
 lar for approximate inference. If time permits\, I
 will briefly touch on divergences/discrepancies
  that are not density-ratio based\, and discuss rel
 evant applications.
LOCATION:Boardroom\, Faculty of English\, West Road
CONTACT:Edoardo Maria Ponti
END:VEVENT
END:VCALENDAR
