BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Statistical and geometrical properties of regularized kernel Kullb
 ack-Leibler divergence - Anna Korba (ENSAE (École Nationale de la Statist
 ique et de l'Administration Économique))
DTSTART:20250508T143000Z
DTEND:20250508T153000Z
UID:TALK230491@talks.cam.ac.uk
DESCRIPTION:In this work\, we study the statistical and geometrical proper
 ties of the Kullback-Leibler divergence with kernel covariance operators (
 KKL) introduced by [Bach\, 2022\, Information Theory with Kernel Methods].
  Unlike the classical Kullback-Leibler (KL) divergence that involves densi
 ty ratios\, the KKL compares probability distributions through covarianc
 e operators (embeddings) in a reproducing kernel Hilbert space (RKHS)\, an
 d computes the Kullback-Leibler quantum divergence between them. This nove
 l divergence hence shares parallel but distinct aspects with both the stan
 dard Kullback-Leibler divergence between probability distributions and ker
 nel embedding metrics such as the maximum mean discrepancy. A limitation o
 f the original KKL divergence is that it is not defined for distribution
 s with disjoint supports. To address this problem\, we propose a regulari
 sed variant that guarantees the divergence is well defined for all distri
 butions. We derive bounds that quantify the deviation of the regularise
 d KKL from the original one\, as well as concentration bounds. In additio
 n\, we provide a closed-form expression for the regularised KKL\, applica
 ble in particular when the distributions consist of finite sets of point
 s\, which makes it implementable. Furthermore\, we derive a Wasserstein g
 radient descent scheme for the KKL divergence in the case of discrete dis
 tributions\, and study its properties empirically for transporting a se
 t of points to a target distribution.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
