Symmetrized KL information: Channel Capacity and Learning Theory
If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

This talk explores applications of symmetrized KL information across information theory and machine learning. We begin by introducing this information measure and its key properties. We then demonstrate its utility in deriving an upper bound on the Poisson channel capacity. Moving to learning theory, we show how symmetrized KL information provides novel insights into generalization error analysis. In particular, we present an exact characterization of the Gibbs algorithm (a.k.a. the Gibbs posterior) in supervised learning using this measure, followed by a novel upper bound on its generalization error. The talk concludes with an application to one-hidden-layer neural networks, where we leverage the symmetrized KL divergence to establish generalization bounds in the mean-field regime.

This talk is part of the Information Theory Seminar series.
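For reference, a brief sketch of the quantities the abstract refers to, using standard definitions from the literature on this topic; the notation here (lautum information L(X;Y), prior \pi(w), empirical loss L_E(w,s), inverse temperature \gamma) is assumed for illustration and is not taken from the talk itself:

  D_{SKL}(P \,\|\, Q) = D(P \,\|\, Q) + D(Q \,\|\, P)
  I_{SKL}(X;Y) = D_{SKL}(P_{X,Y} \,\|\, P_X \otimes P_Y) = I(X;Y) + L(X;Y)
  P^{\gamma}_{W\mid S}(w \mid s) \propto \pi(w)\, e^{-\gamma L_E(w,s)}
  \overline{\mathrm{gen}}(P^{\gamma}_{W\mid S}, P_S) = I_{SKL}(S;W)/\gamma

The first two lines define the symmetrized KL divergence and the symmetrized KL information (the sum of mutual and lautum information); the last two sketch the Gibbs posterior and one published form of the exact characterization of its expected generalization error, assuming the conventional (\gamma, \pi, L_E) parameterization.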