
Statistical guarantees for neural operator surrogates


If you have a question about this talk, please contact Qingyuan Zhao.

In recent years, “operator learning” methodologies for constructing data-driven surrogates of non-linear operators have gained widespread attention. We present statistical convergence results for the learning of such non-linear mappings between infinite-dimensional spaces, e.g. those arising from PDEs, given noisy input-output pairs. We provide convergence results for least-squares-type empirical risk minimizers over general classes, in terms of their approximation properties and metric entropy bounds. This generalizes classical results from finite-dimensional nonparametric regression to an infinite-dimensional setting.
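For concreteness, the least-squares empirical risk minimizer mentioned above can be written as follows; the hypothesis class $\mathcal{G}$, the output-space norm $\|\cdot\|_{\mathcal{Y}}$, and the noise model are standard choices assumed here, not details taken from the abstract:

$$\hat{G}_n \in \operatorname{arg\,min}_{G \in \mathcal{G}} \ \frac{1}{n} \sum_{i=1}^{n} \big\| y_i - G(x_i) \big\|_{\mathcal{Y}}^2, \qquad y_i = G_0(x_i) + \varepsilon_i,$$

where $(x_i, y_i)_{i=1}^n$ are the noisy input-output pairs, $G_0$ is the target operator, and $\varepsilon_i$ is observation noise. Convergence rates for $\hat{G}_n$ are then governed by the approximation error of $\mathcal{G}$ and its metric entropy.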

Assuming the target operator $G_0$ to be holomorphic, we prove algebraic (in the sample size $n$) convergence rates in this setting, thereby overcoming the curse of dimensionality. To illustrate the wide applicability, we discuss, as a prototypical example, the learning of the non-linear solution operator of a parametric elliptic partial differential equation with an encoder-decoder based neural operator architecture.
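The encoder-decoder architecture mentioned above can be sketched structurally as the composition $\hat{G} = \mathcal{D} \circ \varphi \circ \mathcal{E}$: an encoder compresses a discretised input function to finitely many coefficients, a finite-dimensional non-linear map acts on those coefficients, and a decoder reconstructs the output function. The minimal sketch below uses a cosine basis for the encoder/decoder and random placeholder weights standing in for trained parameters; all dimensions and choices are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretise functions on a grid of m points; k latent coefficients; batch of n inputs.
m, k, n = 64, 8, 16

# Encoder E: linear projection of a discretised function onto k cosine modes.
grid = np.linspace(0.0, 1.0, m)
basis = np.array([np.cos(np.pi * j * grid) for j in range(k)])  # shape (k, m)

def encode(u):
    """Map discretised functions u (n, m) to latent coefficients (n, k)."""
    return u @ basis.T / m

# phi: a small fully connected non-linear map on the latent coefficients
# (random weights here; in practice these would be learned from data).
W1 = rng.normal(size=(k, 32))
W2 = rng.normal(size=(32, k))

def phi(z):
    """Non-linear map on latent space, (n, k) -> (n, k)."""
    return np.tanh(z @ W1) @ W2

def decode(z):
    """Reconstruct output functions on the grid, (n, k) -> (n, m)."""
    return z @ basis

def surrogate(u):
    """Encoder-decoder neural operator: decode(phi(encode(u)))."""
    return decode(phi(encode(u)))

u = rng.normal(size=(n, m))   # a batch of discretised input functions
v = surrogate(u)
print(v.shape)                # (16, 64): output functions on the same grid
```

The point of the sketch is the factorisation through a finite-dimensional latent space: the infinite-dimensional learning problem is reduced to learning the map $\varphi$ between coefficient spaces, which is where holomorphy of $G_0$ enables dimension-robust rates.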

This talk is part of the Statistics series.

