
Compositional Features and Feedforward Neural Networks for High Dimensional Problems



MDLW03 - Deep learning and partial differential equations

Deep learning has had many impressive empirical successes in science and industry. On the other hand, the lack of theoretical understanding of the field has been a significant barrier to the adoption of the technology. In this talk, I will discuss some compositional features of high dimensional problems and their mathematical properties, which shed light on the question of why deep learning works for high dimensional problems. It is widely observed in science and engineering that complicated, high dimensional input-output relations can be represented as compositions of functions with low input dimensions. These compositional structures can be effectively represented using layered directed acyclic graphs (layered DAGs). Based on the layered DAG formulation, an algebraic framework and an approximation theory are developed for compositional functions, including neural networks. The theory leads to proofs of several complexity and approximation error bounds for deep neural networks applied to problems of regression and dynamical systems.
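To make the idea of a compositional structure concrete, here is a minimal illustrative sketch (not taken from the talk; the functions `h1`, `h2`, and `g` are hypothetical): a function of four variables is built as a two-layer DAG in which every node takes at most two inputs, so the high input dimension never appears at any single node.

```python
# Illustrative sketch: a 4-dimensional function expressed as a
# layered DAG of functions with at most 2 inputs each.
# All component functions here are hypothetical examples.
import math

# Layer 1: low-input-dimension component functions
def h1(x1, x2):
    return math.sin(x1) + x2

def h2(x3, x4):
    return x3 * x4

# Layer 2: combines the outputs of layer 1
def g(u, v):
    return math.exp(-u * u) + v

# The composed high dimensional function f: R^4 -> R
def f(x1, x2, x3, x4):
    return g(h1(x1, x2), h2(x3, x4))

# Each node only ever sees 2 inputs, yet f depends on all 4 variables.
print(f(0.0, 1.0, 2.0, 3.0))
```

In the theory discussed in the abstract, it is this layered structure, rather than the raw input dimension, that governs how well a deep network can approximate the overall function.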

This talk is part of the Isaac Newton Institute Seminar Series.


