
On ResNet type neural network architectures and their stability properties


  • Brynjulf Owren (Norwegian University of Science and Technology)
  • Tuesday 02 November 2021, 09:00-10:00
  • Seminar Room 2, Newton Institute.


MDL - Mathematics of deep learning

The study of neural networks as numerical approximations to continuous optimal control problems has gained popularity since this viewpoint was introduced by E (2017) and developed in several subsequent papers. It paves the way for analysing the networks as dynamical systems and, in particular, for addressing their stability properties, which was the topic of a paper by Haber and Ruthotto (2017).
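
A minimal illustrative sketch of this viewpoint (not taken from the talk; the step size, dimensions and tanh activation are assumptions) is a ResNet residual block read as one explicit-Euler step of the ODE dx/dt = f(x, theta(t)):

    import numpy as np

    def residual_block(x, W, b, h=0.1):
        # One layer x_{k+1} = x_k + h * tanh(W x_k + b),
        # i.e. an explicit Euler step of dx/dt = tanh(W x + b).
        return x + h * np.tanh(W @ x + b)

    # Stacking L such blocks approximates integrating the ODE up to time T = L * h.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(4)
    W = 0.1 * rng.standard_normal((4, 4))
    b = np.zeros(4)
    for _ in range(10):
        x = residual_block(x, W, b)
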
In this talk we shall first introduce the continuous optimal control approach, following the presentation of Benning et al. (2019), and briefly discuss what advantages this viewpoint can offer. Next we will present some stability results inspired by the literature on the numerical solution of ordinary differential equations; these involve, in particular, continuous non-expansive models and their numerical approximations, see Celledoni et al. (2021). Finally, we will talk briefly about some ongoing work using switching systems to analyse and control the stability of a mixed-type neural network architecture.
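
As an illustration of the non-expansivity idea, in the spirit of the models discussed in Celledoni et al. (2021) (again a sketch under stated assumptions, not code from the speaker), a layer of the form x -> x - h * A^T tanh(A x + b) is one explicit-Euler step of the gradient flow of a convex potential; for a sufficiently small step size the discrete map does not increase the distance between two trajectories:

    import numpy as np

    def nonexpansive_block(x, A, b, h):
        # Explicit Euler step of dx/dt = -A^T tanh(A x + b), a gradient flow
        # of the convex potential V(x) = sum(log cosh(A x + b)).
        return x - h * A.T @ np.tanh(A @ x + b)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 4))
    b = rng.standard_normal(6)
    h = 1.0 / np.linalg.norm(A, 2) ** 2   # step bounded via the Lipschitz constant of the vector field
    x = rng.standard_normal(4)
    y = x + 0.1 * rng.standard_normal(4)  # nearby initial condition
    for _ in range(50):
        x, y = nonexpansive_block(x, A, b, h), nonexpansive_block(y, A, b, h)
        # np.linalg.norm(x - y) is non-increasing along the two discrete trajectories
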

  • E, W. (2017). A proposal on machine learning via dynamical systems. Commun. Math. Stat. 5(1), 1–11
  • Haber, E. & Ruthotto, L. (2017). Stable architectures for deep neural networks. Inverse Probl. 34(1)
  • Benning, M., Celledoni, E., Ehrhardt, M. J., Owren, B. & Schönlieb, C.-B. (2019). Deep learning as optimal control problems: models and numerical methods. J. Comput. Dyn. 6(2), 171–198
  • Celledoni, E., Ehrhardt, M. J., Etmann, C., McLachlan, R. I., Owren, B., Schönlieb, C.-B. & Sherry, F. (2021). Structure preserving deep learning. Eur. J. Appl. Math. 32(5), 888–936. doi:10.1017/S0956792521000139

This talk is part of the Isaac Newton Institute Seminar Series.
