
Symmetrized KL information: Channel Capacity and Learning Theory


  • Speaker: Dr Gholamali Aminian, Alan Turing Institute
  • Time: Wednesday 06 November 2024, 14:00-15:00
  • Venue: MR5, CMS Pavilion A

If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

This talk explores applications of symmetrized KL information across information theory and machine learning. We begin by introducing this information measure and its key properties. We then demonstrate its utility in deriving an upper bound on the capacity of the Poisson channel. Moving to learning theory, we show how symmetrized KL information provides novel insights into generalization error analysis. In particular, we present an exact characterization of the generalization error of the Gibbs algorithm (a.k.a. the Gibbs posterior) in supervised learning using this measure, followed by a novel upper bound on that generalization error. The talk concludes with an application to one-hidden-layer neural networks, where we leverage the symmetrized KL divergence to establish generalization bounds in the mean-field regime.
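For reference (these standard definitions are not part of the abstract): the symmetrized KL divergence between two distributions P and Q is the sum of the KL divergences in both directions,

  $D_{\mathrm{SKL}}(P \,\|\, Q) = D(P \,\|\, Q) + D(Q \,\|\, P)$,

and the symmetrized KL information between random variables X and Y applies this to the joint distribution and the product of the marginals,

  $I_{\mathrm{SKL}}(X; Y) = D(P_{XY} \,\|\, P_X \otimes P_Y) + D(P_X \otimes P_Y \,\|\, P_{XY}) = I(X; Y) + L(X; Y)$,

where $I(X;Y)$ is the mutual information and $L(X;Y)$ is the lautum information.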

This talk is part of the Information Theory Seminar series.
