
A Unified Framework for Change of Measure Inequalities: Applications to Generalization, Memorization and Privacy


  • Speaker: Dr. Yanxiao Liu, Imperial College London
  • Time: Wednesday 13 May 2026, 14:00-15:00
  • Venue: MR5, CMS Pavilion A

If you have a question about this talk, please contact Ramji Venkataramanan.

We propose a novel class of change of measure inequalities via a unified framework based on the data processing inequality for f-divergences, which is surprisingly elementary yet powerful enough to yield tighter inequalities. We provide change of measure inequalities in terms of a broad family of information measures, including f-divergences (with Kullback-Leibler divergence and $\chi^2$-divergence as special cases), Rényi divergence, and $\alpha$-mutual information (with maximal leakage as a special case). A key advantage of our framework is its flexibility: it readily adapts to a range of settings, including generalization error analysis, the conditional mutual information framework, PAC-Bayesian theory, differential privacy mechanisms, and the data memorization problem, with simplified analyses.
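As background, the classical KL change-of-measure (Donsker-Varadhan) inequality, $\mathbb{E}_P[f] \le D_{\mathrm{KL}}(P\|Q) + \log \mathbb{E}_Q[e^f]$, is the special case the abstract mentions. The sketch below checks it numerically on a small hand-picked discrete example (the distributions P, Q and test function f are arbitrary illustrative values, not from the talk); the tighter f-divergence variants the talk proposes are not reproduced here.

```python
import math

# Hypothetical distributions P, Q on a 3-letter alphabet and a test function f.
P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]
f = [1.0, -0.5, 0.25]

# Left-hand side: expectation of f under P.
lhs = sum(p * fi for p, fi in zip(P, f))

# KL divergence D(P || Q) = sum_x P(x) log(P(x)/Q(x)).
kl = sum(p * math.log(p / q) for p, q in zip(P, Q))

# Right-hand side: KL(P||Q) + log E_Q[exp(f)].
rhs = kl + math.log(sum(q * math.exp(fi) for q, fi in zip(Q, f)))

# Donsker-Varadhan guarantees lhs <= rhs for any f with E_Q[e^f] finite.
assert lhs <= rhs
print(f"E_P[f] = {lhs:.4f} <= {rhs:.4f} = KL(P||Q) + log E_Q[e^f]")
```

Here the bound is not tight (roughly 0.40 versus 0.43); the talk's framework concerns systematically deriving sharper inequalities of this type from the data processing inequality.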


This talk is part of the Information Theory Seminar series.

