Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d)$

MDLW03 - Deep learning and partial differential equations

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d \to \mathbb{R}$ in $L^2(\mathbb{R}^d)$, where the dimension $d$ may be infinite and where $L^2$ is taken with respect to a Gaussian measure. We consider $\mathrm{ReLU}$ and $\mathrm{ReLU}^k$ activations for integer $k\geq 2$. In the infinite-dimensional case, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to \mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $(\mathbb{R}^{\mathbb{N}}, \mathcal{B}(\mathbb{R}^{\mathbb{N}}))$, we prove dimension-independent DNN expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}}, \gamma_\infty)$. The DNN expression rates are not subject to the curse of dimensionality (CoD) and depend on the summability of the Wiener-Hermite expansion coefficients of $f$. Sufficient conditions are quantified holomorphy of (an analytic continuation of) the map $f$ on a product of strips in the complex domain. As an application, we prove DNN expression rate bounds for deep $\mathrm{ReLU}$-NNs approximating response surfaces of elliptic PDEs with log-Gaussian random field inputs. (Joint work with Jakob Zech, University of Heidelberg, Germany.)
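
For context (standard definitions, not results of the talk; the notation $\rho_k$, $\mathcal{F}$, $h_n$ is introduced here for illustration): the $\mathrm{ReLU}^k$ activation is $\rho_k(x) = \max\{0,x\}^k$ for integer $k \geq 1$, with $k = 1$ recovering the usual $\mathrm{ReLU}$. The Wiener-Hermite expansion referred to above can be written, in standard notation, as $f = \sum_{\nu \in \mathcal{F}} f_\nu H_\nu$ with tensorized Hermite polynomials $H_\nu(y) = \prod_{j \geq 1} h_{\nu_j}(y_j)$ and coefficients $f_\nu = \int_{\mathbb{R}^{\mathbb{N}}} f\, H_\nu \,\mathrm{d}\gamma_\infty$, where $\mathcal{F}$ denotes the set of finitely supported multi-indices and $h_n$ the probabilists' Hermite polynomials normalized in $L^2(\mathbb{R},\gamma_1)$; the summability condition in the abstract then refers to $\ell^p$-summability of the coefficient sequence $(f_\nu)_{\nu\in\mathcal{F}}$.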

This talk is part of the Isaac Newton Institute Seminar Series.
