Deep Learning in High Dimension: Neural Network Approximation of Analytic Maps of Gaussians.

MDL - Mathematics of deep learning

For artificial deep neural networks with ReLU activation, we prove new expression rate bounds for parametric, analytic functions where the parameter dimension may be infinite. Approximation rates are in mean square on the unbounded parameter range with respect to product Gaussian measure. The approximation rate bounds are free from the curse of dimensionality and are determined by the summability of the Wiener-Hermite polynomial chaos (PC) expansion coefficients. Sufficient conditions for summability are quantified holomorphy on products of strips in the complex domain. Applications comprise DNN expression rate bounds for response surfaces of elliptic PDEs with log-Gaussian random field inputs, and for the posterior densities of the corresponding Bayesian inverse problems. Variants of the proofs which are constructive are outlined.

(Joint work with Jakob Zech, University of Heidelberg, Germany, and with Dinh Dung and Nguyen Van Kien, Hanoi, Vietnam.)

References: https://math.ethz.ch/sam/research/reports.html?id=982
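The role of coefficient summability can be illustrated in one dimension. The sketch below (not part of the talk; the test function and helper names are my own) computes Wiener-Hermite coefficients c_n = E[f(X) He_n(X)] / n! of an analytic map under the standard Gaussian measure via Gauss-Hermite quadrature, and shows their factorial decay, which is the kind of summability the rate bounds rely on.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H  # probabilists' Hermite polynomials He_n

# Gauss-Hermite quadrature for the probabilists' weight exp(-x^2/2);
# dividing the weights by sqrt(2*pi) turns quadrature sums into
# expectations under the standard Gaussian measure.
nodes, weights = H.hermegauss(60)
weights = weights / np.sqrt(2.0 * np.pi)

def wiener_hermite_coeff(f, n):
    """n-th Wiener-Hermite coefficient c_n = E[f(X) He_n(X)] / n!
    (He_n has Gaussian norm-squared n!, hence the division)."""
    He_n = H.hermeval(nodes, [0.0] * n + [1.0])  # coefficients pick out He_n
    return float(np.sum(weights * f(nodes) * He_n)) / math.factorial(n)

# For the analytic map f(x) = exp(x), the generating function
# exp(tx - t^2/2) = sum_n He_n(x) t^n / n! gives the exact values
# c_n = exp(1/2) / n!: factorial decay, hence an l^1-summable sequence.
coeffs = [wiener_hermite_coeff(np.exp, n) for n in range(12)]
```

The rapid decay of `coeffs` is what makes truncated Hermite expansions (and, per the talk, their DNN emulations) accurate with few terms; analyticity of `f` on a strip in the complex domain is the mechanism behind it.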

This talk is part of the Isaac Newton Institute Seminar Series.


