Statistical guarantees for neural operator surrogates
- Speaker: Sven Wang (EPFL)
- Date & Time: Friday 14 November 2025, 14:00 - 15:00
- Venue: MR12, Centre for Mathematical Sciences
Abstract
In recent years, “operator learning” methodologies for constructing data-driven surrogates for non-linear operators have gained widespread attention. We present statistical convergence results for the learning of such non-linear mappings between infinite-dimensional spaces, e.g. arising from PDEs, given noisy input-output pairs. We provide convergence results for least-squares-type empirical risk minimizers over general classes, in terms of their approximation properties and metric entropy bounds. This generalizes classical results from finite-dimensional nonparametric regression to an infinite-dimensional setting.
Assuming the ground-truth operator $G_0$ to be holomorphic, we prove algebraic (in the sample size $n$) convergence rates in this setting, thereby overcoming the curse of dimensionality. To illustrate the wide applicability, as a prototypical example we discuss the learning of the non-linear solution operator to a parametric elliptic partial differential equation, with an encoder-decoder based neural operator architecture.
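The least-squares-type empirical risk minimizer described in the abstract can be sketched in standard notation (the symbols $\mathcal{G}$, $(X_i, Y_i)$, and $\mathcal{H}$ are generic placeholders, not taken from the talk):

```latex
\hat{G}_n \in \operatorname*{arg\,min}_{G \in \mathcal{G}}
  \frac{1}{n} \sum_{i=1}^{n} \bigl\| Y_i - G(X_i) \bigr\|_{\mathcal{H}}^2
```

Here $(X_i, Y_i)_{i=1}^n$ are the noisy input-output pairs, $\mathcal{G}$ is the hypothesis class (e.g. a class of encoder-decoder neural operators), and $\mathcal{H}$ denotes the norm on the output space; convergence rates for $\hat{G}_n$ are then governed by the approximation properties and metric entropy of $\mathcal{G}$.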
Series: This talk is part of the Statistics series.