
The Pervasive Role of Composing Transformations in Machine Learning


  • Speaker: Anders Karlsson (Université de Genève)
  • Time: Tuesday 22 July 2025, 14:50-15:40
  • Venue: External


OGGW05 - Geometric and combinatorial methods in the foundations of computer science and artificial intelligence

From the layer maps of neural networks to training procedures and reinforcement learning, compositions of transformations permeate modern AI. These compositional products often involve randomly selected maps, as in weight initialization, stochastic gradient descent (SGD), and dropout. In reinforcement learning, Bellman-type operators with randomness are iterated to update reward structures and strategies. I will discuss the mathematics and geometry underlying the composition of random transformations. In particular, I will explain a general limit law established in joint work with Gouëzel. Moreover, I will discuss a possible cut-off phenomenon related to the depth of neural networks and the influence of iteration order. Motivated by these observations, and in collaboration with Avelin, Dherin, Gonzalvo, Mazzawi, and Munn, we propose backward variants of SGD that improve stability and convergence while maintaining generalisation. 
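As a minimal illustration (not taken from the talk) of why iteration order can matter for compositions of random maps, the sketch below composes random affine contractions of the real line in both orders. All names and parameters here are hypothetical choices for demonstration; the classical observation it reproduces is that "backward" partial compositions f_1 ∘ ... ∘ f_k of random contractions stabilise to a limit point, while "forward" partial compositions f_k ∘ ... ∘ f_1 keep fluctuating.

    # Hypothetical sketch: forward vs backward compositions of random contractions.
    import random

    def random_map():
        """Return a random contraction x -> a*x + b with |a| < 1."""
        a = random.uniform(0.2, 0.8)
        b = random.uniform(-1.0, 1.0)
        return lambda x, a=a, b=b: a * x + b

    random.seed(0)
    n = 300
    maps = [random_map() for _ in range(n)]
    x0 = 0.0

    forward_vals, backward_vals = [], []
    x = x0
    for k in range(1, n + 1):
        # Forward partial composition (f_k ∘ ... ∘ f_1)(x0): extend the previous orbit point.
        x = maps[k - 1](x)
        forward_vals.append(x)
        # Backward partial composition (f_1 ∘ ... ∘ f_k)(x0): apply the newest map innermost.
        y = x0
        for f in reversed(maps[:k]):
            y = f(y)
        backward_vals.append(y)

    # Backward values settle down; forward values keep moving.
    print("last forward values :", [round(v, 4) for v in forward_vals[-5:]])
    print("last backward values:", [round(v, 4) for v in backward_vals[-5:]])

Running this, the last few backward values agree to many decimal places, whereas the forward values still differ noticeably, which is one elementary way the order of composition influences the behaviour of iterated random maps.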

This talk is part of the Isaac Newton Institute Seminar Series.
