BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Applied and Computational Analysis
SUMMARY:Machine Learning and Dynamical Systems Meet in Reproducing
 Kernel Hilbert Spaces with Insights from Algorithmic Information
 Theory - Boumediene Hamzi
DTSTART;TZID=Europe/London:20240425T150000
DTEND;TZID=Europe/London:20240425T160000
UID:TALK215533AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/215533
DESCRIPTION:Since its inception in the 19th century\, through the
 efforts of Poincaré and Lyapunov\, the theory of dynamical systems has
 addressed the qualitative behavior of systems as understood from
 models. From this perspective\, modeling dynamical processes in
 applications demands a detailed understanding of the processes to be
 analyzed. This understanding leads to a model\, which approximates
 observed reality and is often expressed by a system of
 ordinary/partial\, underdetermined (control)\, deterministic/stochastic
 differential or difference equations. While these models are very
 precise for many processes\, for some of the most challenging
 applications of dynamical systems\, such as climate dynamics\, brain
 dynamics\, biological systems\, or financial markets\, developing such
 models is notably difficult. On the other hand\, the field of machine
 learning is concerned with algorithms designed to accomplish specific
 tasks\, whose performance improves with more data input. Applications
 of machine learning methods include computer vision\, stock market
 analysis\, speech recognition\, recommender systems\, and sentiment
 analysis in social media. The machine learning approach is invaluable
 in settings where no explicit model is formulated\, but measurement
 data are available. This is often the case in many systems of
 interest\, and the development of data-driven technologies is
 increasingly important in many applications. The intersection of the
 fields of dynamical systems and machine learning is largely
 unexplored\, and the objective of this talk is to show that working in
 reproducing kernel Hilbert spaces offers tools for a data-based theory
 of nonlinear dynamical systems.\n\nIn the first part of the talk\, we
 introduce simple methods to learn surrogate models for complex
 systems. We present variants of the method of Kernel Flows as simple
 approaches for learning the kernels that appear in the emulators we
 use in our work. First\, we will discuss the method of parametric and
 nonparametric kernel flows for learning chaotic dynamical systems. We
 will also explore learning dynamical systems from irregularly sampled
 time series and from partial observations. We will introduce the
 methods of Sparse Kernel Flows and Hausdorff-metric-based Kernel
 Flows (HMKFs) and apply them to learn 132 chaotic dynamical systems.
 We draw parallels between Minimum Description Length (MDL) and
 Regularization in Machine Learning (RML)\, showcasing that the method
 of Sparse Kernel Flows offers a natural approach to kernel learning.
 By considering code lengths and complexities rooted in Algorithmic
 Information Theory (AIT)\, we demonstrate that data-adaptive kernel
 learning can be achieved through the MDL principle\, bypassing the
 need for cross-validation as a statistical method. Finally\, we extend
 the method of Kernel Mode Decomposition to design kernels in view of
 detecting critical transitions in some fast-slow random dynamical
 systems.\n\nThen\, we introduce a data-based approach to estimating
 key quantities which arise in the study of nonlinear autonomous\,
 control\, and random dynamical systems. Our approach hinges on the
 observation that much of the existing linear theory may be readily
 extended to nonlinear systems\, with a reasonable expectation of
 success\, once the nonlinear system has been mapped into a high- or
 infinite-dimensional Reproducing Kernel Hilbert Space. We develop
 computable\, nonparametric estimators approximating controllability
 and observability energies for nonlinear systems. We apply this
 approach to the problem of model reduction of nonlinear control
 systems. It is also shown that the controllability energy estimator
 provides a key means for approximating the invariant measure of an
 ergodic\, stochastically forced nonlinear system. Finally\, we show
 how kernel methods can be used to approximate center manifolds\,
 propose a data-based version of the center manifold theorem\, and
 construct Lyapunov functions for nonlinear ODEs.\n
LOCATION:Centre for Mathematical Sciences\, MR2
CONTACT:Matthew Colbrook
END:VEVENT
END:VCALENDAR