

## Deep Learning in High Dimension: Neural Network Approximation of Analytic Maps of Gaussians

- Christoph Schwab (ETH Zürich)
- Wednesday 01 December 2021, 17:00-18:30
- Seminar Room 2, Newton Institute.
MDL - Mathematics of deep learning

For artificial deep neural networks with ReLU activation, we prove new expression rate bounds for parametric, analytic functions where the parameter dimension may be infinite. Approximation rates are in mean square on the unbounded parameter range with respect to product Gaussian measure. The approximation rate bounds are free from the curse of dimensionality (CoD) and are determined by the summability of the Wiener-Hermite polynomial chaos (PC) expansion coefficients. Sufficient conditions for summability are quantified holomorphy on products of strips in the complex domain.

Applications comprise DNN expression rate bounds for response surfaces of elliptic PDEs with log-Gaussian random field inputs, and for the posterior densities of the corresponding Bayesian inverse problems. Constructive variants of the proofs are outlined.

(Joint work with Jakob Zech, University of Heidelberg, Germany, and with Dinh Dung and Nguyen Van Kien, Hanoi, Vietnam.)

References: https://math.ethz.ch/sam/research/reports.html?id=982

This talk is part of the Isaac Newton Institute Seminar Series.

## This talk is included in these lists:

- All CMS events
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 2, Newton Institute
- bld31
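As a small numerical illustration (not part of the talk) of the summability mechanism in the abstract: for the one-dimensional analytic map f(x) = exp(x) of a standard Gaussian input, the Wiener-Hermite (polynomial chaos) coefficients are c_n = e^{1/2}/n!, so the coefficient sequence is factorially summable and the mean-square truncation error under the Gaussian measure decays accordingly. The helper name `hermite_coeff` and the truncation order `N` below are illustrative choices, not notation from the talk.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Illustrative sketch: truncated Wiener-Hermite expansion of the
# analytic map f(x) = exp(x) of a standard Gaussian X ~ N(0,1).
f = np.exp
N = 8                                   # truncation order (assumption)

# Probabilists' Gauss-Hermite rule; its weight function is exp(-x^2/2),
# so dividing by sqrt(2*pi) normalizes to the N(0,1) measure.
nodes, weights = hermegauss(60)
weights = weights / sqrt(2.0 * pi)

def hermite_coeff(n):
    """c_n = E[f(X) He_n(X)] / n!  (orthogonality: E[He_n(X)^2] = n!)."""
    basis = np.zeros(n + 1)
    basis[n] = 1.0                      # coefficient vector selecting He_n
    return float(np.sum(weights * f(nodes) * hermeval(nodes, basis))) / factorial(n)

coeffs = np.array([hermite_coeff(n) for n in range(N + 1)])

# Mean-square error of the order-N truncation under N(0,1),
# estimated by Monte Carlo.
rng = np.random.default_rng(0)
X = rng.standard_normal(200_000)
mse = float(np.mean((f(X) - hermeval(X, coeffs)) ** 2))
print(f"order {N}: mean-square truncation error ~ {mse:.2e}")
```

Because f is entire, the coefficients decay like 1/n! and a low truncation order already gives a small mean-square error; this is the one-dimensional analogue of the coefficient-summability condition that drives the CoD-free expression rates described above.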