Machine Learning and Dynamical Systems Meet in Reproducing Kernel Hilbert Spaces with Insights from Algorithmic Information Theory

If you have a question about this talk, please contact Matthew Colbrook.

Since its inception in the 19th century, through the efforts of Poincaré and Lyapunov, the theory of dynamical systems has addressed the qualitative behavior of systems as understood from models. From this perspective, modeling dynamical processes in applications demands a detailed understanding of the processes to be analyzed. This understanding leads to a model, which approximates observed reality and is often expressed as a system of differential or difference equations: ordinary or partial, deterministic or stochastic, and possibly underdetermined (control systems). While these models are very precise for many processes, developing them is notably difficult for some of the most challenging applications of dynamical systems, such as climate dynamics, brain dynamics, biological systems, and financial markets. The field of machine learning, on the other hand, is concerned with algorithms whose performance on specific tasks improves as more data become available. Applications of machine learning methods include computer vision, stock market analysis, speech recognition, recommender systems, and sentiment analysis in social media. The machine learning approach is invaluable in settings where no explicit model is formulated but measurement data are available, as is the case in many systems of interest, and the development of data-driven technologies is increasingly important in many applications. The intersection of dynamical systems and machine learning is largely unexplored, and the objective of this talk is to show that working in reproducing kernel Hilbert spaces offers tools for a data-based theory of nonlinear dynamical systems.

In the first part of the talk, we introduce simple methods for learning surrogate models of complex systems. We present variants of the method of Kernel Flows as simple approaches for learning the kernels that appear in the emulators we use in our work. First, we discuss parametric and nonparametric Kernel Flows for learning chaotic dynamical systems. We also explore learning dynamical systems from irregularly sampled time series and from partial observations. We introduce the methods of Sparse Kernel Flows and Hausdorff-metric-based Kernel Flows (HMKFs) and apply them to learn 132 chaotic dynamical systems. We draw parallels between Minimum Description Length (MDL) and Regularization in Machine Learning (RML), showing that the method of Sparse Kernel Flows offers a natural approach to kernel learning. By considering code lengths and complexities rooted in Algorithmic Information Theory (AIT), we demonstrate that data-adaptive kernel learning can be achieved through the MDL principle, bypassing the need for cross-validation as a statistical method. Finally, we extend the method of Kernel Mode Decomposition to design kernels with a view to detecting critical transitions in certain fast-slow random dynamical systems.
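To make the first step concrete, here is a minimal sketch of the parametric Kernel Flows idea in the spirit of Owhadi and Yoo: the kernel parameter is tuned by minimizing the relative loss rho incurred when the interpolant built on a random half of the data replaces the interpolant built on all of it. The toy data (the logistic map), the single Gaussian bandwidth parameter, and the finite-difference descent are illustrative assumptions for this sketch, not the speaker's implementation.

    import numpy as np

    def gaussian_kernel(X, Y, sigma):
        # Pairwise Gaussian (RBF) kernel matrix between the rows of X and Y.
        d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-d2 / (2 * sigma**2))

    def rho(sigma, X, y, idx, reg=1e-8):
        # Kernel Flows loss: 1 minus the ratio of the squared RKHS norms of the
        # interpolant on a random half of the data and on all of the data.
        K = gaussian_kernel(X, X, sigma) + reg * np.eye(len(y))
        Ks = K[np.ix_(idx, idx)]
        return 1.0 - (y[idx] @ np.linalg.solve(Ks, y[idx])) / (y @ np.linalg.solve(K, y))

    # Training pairs (x_n, x_{n+1}) from a chaotic scalar map (logistic map, r = 3.9).
    x = np.empty(200); x[0] = 0.4
    for n in range(199):
        x[n + 1] = 3.9 * x[n] * (1 - x[n])
    X, y = x[:-1, None], x[1:]

    # Stochastic finite-difference descent on the bandwidth sigma.
    rng = np.random.default_rng(0)
    sigma = 1.0
    for step in range(200):
        idx = rng.choice(len(y), len(y) // 2, replace=False)  # fresh random half
        h = 0.05 * sigma
        grad = (rho(sigma + h, X, y, idx) - rho(sigma - h, X, y, idx)) / (2 * h)
        sigma = max(sigma - 0.1 * grad, 1e-3)
    print("learned bandwidth:", sigma)

Once the kernel is learned, the emulator is the standard kernel interpolant x_{n+1} ≈ K(x_n, X) K(X, X)^{-1} y, iterated to produce forecasts.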

Then, we introduce a data-based approach to estimating key quantities that arise in the study of nonlinear autonomous, control, and random dynamical systems. Our approach hinges on the observation that much of the existing linear theory may be readily extended to nonlinear systems, with a reasonable expectation of success, once the nonlinear system has been mapped into a high- or infinite-dimensional reproducing kernel Hilbert space. We develop computable, nonparametric estimators that approximate controllability and observability energies for nonlinear systems, and we apply this approach to the model reduction of nonlinear control systems. We also show that the controllability energy estimator provides a key means of approximating the invariant measure of an ergodic, stochastically forced nonlinear system. Finally, we show how kernel methods can be used to approximate center manifolds, propose a data-based version of the center manifold theorem, and construct Lyapunov functions for nonlinear ODEs.
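For orientation, the linear theory being extended here is classical: for a stable linear system x' = Ax + Bu with output y = Cx, the controllability energy (the minimum input energy needed to reach a state x0 from rest) is E_c(x0) = x0^T Wc^{-1} x0, and the observability energy is E_o(x0) = x0^T Wo x0, where the Gramians Wc and Wo solve Lyapunov equations. The sketch below computes these closed-form quantities with SciPy; the system matrices are arbitrary illustrative choices, and the talk's kernel-based estimators replace these Gramians with nonparametric, data-driven analogues for nonlinear systems.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Stable linear control system  x' = A x + B u,  y = C x.
    A = np.array([[-1.0, 2.0], [0.0, -3.0]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])

    # Gramians from the Lyapunov equations
    #   A Wc + Wc A^T + B B^T = 0   and   A^T Wo + Wo A + C^T C = 0.
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

    def controllability_energy(x0):
        # Minimum input energy needed to steer the state from 0 to x0.
        return x0 @ np.linalg.solve(Wc, x0)

    def observability_energy(x0):
        # Output energy released as the system relaxes from x0 with u = 0.
        return x0 @ Wo @ x0

    x0 = np.array([1.0, 1.0])
    print(controllability_energy(x0), observability_energy(x0))

    # Balanced truncation ranks states by the Hankel singular values
    # sqrt(eig(Wc @ Wo)); states with small values are discarded.
    print(np.sqrt(np.linalg.eigvals(Wc @ Wo).real))

In the nonlinear setting of the talk, the analogous energies are estimated directly from trajectory data after mapping the system into the RKHS.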

This talk is part of the Applied and Computational Analysis series.
