
A Tutorial on Algorithmic Information Theory in Modern ML


If you have a question about this talk, please contact Xianda Sun.

This tutorial explores how ideas from algorithmic information theory connect to modern machine learning through three recent papers. We begin with Solomonoff induction—the theoretically optimal but uncomputable predictor—and show how neural networks can approximate it by training on Universal Turing Machine data (Grau-Moya et al., 2024). We then establish the formal foundations by examining Kolmogorov complexity and its connections to compression and randomness in images, exploring how the Solomonoff prior helps us understand what makes images “realistic” and guides the design of better generative models and anomaly detectors (Theis, 2024). Finally, we demonstrate these principles at scale, deriving non-vacuous generalization bounds for large language models with billions of parameters through compression-based analysis using the SubLoRA technique (Lotfi et al., 2024). No prior background in algorithmic information theory required—we’ll build intuition from first principles while connecting to familiar ML concepts throughout.
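
As background for the abstract above, a brief sketch of the two central quantities (notation assumed for this note, not taken from the talk itself): the Kolmogorov complexity K(x) of a string x is the length of the shortest program that makes a fixed universal prefix machine U output x, and the algorithmic (Solomonoff-style) prior weights each string by the total probability that a randomly chosen program produces it:

  K(x) = \min \{\, |p| : U(p) = x \,\}
  m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|} \;\approx\; 2^{-K(x)}

Up to a machine-dependent constant (Levin's coding theorem), strings with short descriptions receive exponentially more prior mass, which is the sense in which "compressible" and "plausible" coincide under this prior.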

Papers:
  1. Learning Universal Predictors (Grau-Moya et al., 2024) – https://arxiv.org/abs/2401.14953
  2. What Makes an Image Realistic? (Theis, 2024) – https://arxiv.org/abs/2403.04493
  3. Non-Vacuous Generalization Bounds for Large Language Models (Lotfi et al., 2024) – https://arxiv.org/abs/2312.17173

This talk is part of the Machine Learning Reading Group @ CUED series.
