
Deep and reliable – Uncertainty quantification using Empirical Bayesian deep neural networks.


If you have a question about this talk, please contact Randolf Altmeyer.

Deep learning is a popular tool for making inferences, as it performs well in many practical applications. Because data scientists rely on deep learning, we should provide theoretical guarantees on the quality of the constructed estimator. For some applications, guarantees on the estimation error alone are not enough: practitioners also need to quantify uncertainty, which can be done by constructing confidence sets. Researchers have only recently begun to give theoretical guarantees on the accuracy of deep learning, and the construction of confidence statements remains an open problem. In this talk, I will first go over the general ideas and concepts, discussing some earlier proposed methods for uncertainty quantification, before diving into my contribution: a new Bayesian methodology, Empirical Bayesian deep neural networks (EBDNN). EBDNN is the first such methodology with theoretical guarantees: the uncertainty quantification it produces is valid from a frequentist point of view. Moreover, EBDNN is much faster to compute than alternative methods proposed for uncertainty quantification. Joint work with Botond Szabó.
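To make the notion of frequentist-valid uncertainty quantification concrete, here is a minimal sketch of one of the generic alternatives the abstract alludes to: a bootstrap/ensemble approach that turns the spread of refitted models into pointwise confidence bands. This is an illustration only, not the EBDNN method; the toy sine-regression data and the polynomial stand-in for a neural network are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: y = sin(x) + noise (hypothetical setup).
x = np.linspace(0, 3, 200)
y = np.sin(x) + rng.normal(0, 0.1, size=x.shape)

# Refit a simple model (a cubic polynomial, standing in for a deep network)
# on bootstrap resamples; the spread of predictions gives an uncertainty band.
preds = []
for _ in range(200):
    idx = rng.integers(0, len(x), size=len(x))
    coeffs = np.polyfit(x[idx], y[idx], deg=3)
    preds.append(np.polyval(coeffs, x))
preds = np.array(preds)

# Pointwise 95% band from the empirical quantiles of the ensemble.
lo, hi = np.quantile(preds, [0.025, 0.975], axis=0)

# Fraction of points where the band covers the noiseless truth.
coverage = np.mean((np.sin(x) >= lo) & (np.sin(x) <= hi))
print(f"empirical pointwise coverage of sin(x): {coverage:.2f}")
```

Note that such bands capture the sampling variability of the refitted estimator but not its bias, which is exactly why establishing frequentist coverage guarantees for deep-learning-based uncertainty quantification, as EBDNN does, is nontrivial.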

Join Zoom Meeting

Meeting ID: 994 1285 3967 Passcode: 026816

This talk is part of the CCIMI Seminars series.




© 2006-2024, University of Cambridge.