An Introduction to PAC-Bayes
If you have a question about this talk, please contact Elre Oldewage.
PAC-Bayes is a frequentist framework for obtaining generalisation error bounds. It has been used to derive learning algorithms, provide explanations for generalisation in deep learning, and form connections between Bayesian and frequentist inference. This reading group will cover a broad introduction to PAC bounds, the proof ideas in PAC-Bayes, and a discussion of some recent applications.
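As a rough preview of the kind of statement involved (this sketch is not part of the original announcement; the symbols L, L_S, P, Q, n and delta are introduced here purely for illustration), a standard McAllester-style PAC-Bayes bound reads: for a prior P over hypotheses fixed before seeing the data, with probability at least 1 - \delta over an i.i.d. sample S of size n, simultaneously for every posterior Q,

  E_{h \sim Q}[L(h)] \le E_{h \sim Q}[\hat{L}_S(h)] + \sqrt{ \frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{n}/\delta)}{2n} },

where L(h) is the true risk of hypothesis h, \hat{L}_S(h) its empirical risk on the sample S, and KL(Q || P) the Kullback-Leibler divergence between posterior and prior. Bounds of this form underlie both the nonvacuous deep-learning bounds and the Bayesian connections in the suggested reading below.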
Suggested reading:
- Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data: https://arxiv.org/abs/1703.11008
- PAC-Bayesian Theory Meets Bayesian Inference: https://arxiv.org/abs/1605.08636
- Learning under Model Misspecification: Applications to Variational and Ensemble Methods: https://arxiv.org/abs/1912.08335
This talk is part of the Machine Learning Reading Group @ CUED series.