
## PAC-Bayes

- Alex Matthews; Nikola Mrksic
- Thursday 27 November 2014, 15:00-16:30
- Engineering Department, CBL Room 438
If you have a question about this talk, please contact mv310.

The talk will be in two parts: an introduction to PAC learning, followed by an introduction to PAC-Bayes.

Probably Approximately Correct (PAC) learning is a framework for the mathematical analysis of machine learning, proposed in 1984 by Leslie Valiant. The framework brings concepts from computational complexity theory into machine learning: the learner is expected to find efficient hypotheses (polynomial time and space) using a learning procedure that is itself polynomial. PAC learning gave rise to the field of computational learning theory, whose primary goal is to compare the power of different learning models. This part of the talk will introduce the basic concepts and present some of the results obtained using PAC learning and VC theory. Since it is tutorial in nature, no reading is required.

PAC-Bayes is a PAC-like framework in which generalization error bounds are derived using a reference distribution chosen before seeing the data. The bounds are often very tight relative to other types of PAC bound. There are intimate connections to the Bayesian view of learning, though the two theories are not identical. This part of the talk will give a tutorial on the basic concepts of PAC-Bayes before discussing the now-classic application to Gaussian process classification. It is relatively self-contained, apart from some knowledge of Gaussian processes, which will be assumed. An idea of the content of the talk can be gained by looking at Seeger's work: http://www.jmlr.org/papers/volume3/seeger02a/seeger02a.pdf, but a detailed understanding of this paper is certainly not essential to learn from the talk.

This talk is part of the Machine Learning Reading Group @ CUED series.
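As a rough pointer to the kind of result the second part covers (not taken from this announcement itself), one standard statement of a PAC-Bayes generalization bound, in the Langford–Seeger form related to the linked Seeger paper, can be sketched as follows. Here $P$ is the prior fixed before seeing the $m$ training examples, $Q$ ranges over posteriors, $\hat{R}_S(Q)$ and $R(Q)$ are the empirical and true Gibbs risks, and $\mathrm{kl}(q\,\|\,p)$ is the KL divergence between Bernoulli distributions with parameters $q$ and $p$:

```latex
% One common form of the PAC-Bayes theorem (constants and the
% logarithmic term vary slightly between statements in the literature).
% The bound holds simultaneously for all posteriors Q.
\Pr_{S \sim \mathcal{D}^m}\!\left[
  \forall Q :\;
  \mathrm{kl}\!\left(\hat{R}_S(Q) \,\middle\|\, R(Q)\right)
  \;\le\;
  \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m}
\right] \;\ge\; 1 - \delta
```

The key structural point, echoed in the abstract above, is that the prior $P$ must be chosen before seeing the data, while the bound then holds uniformly over all data-dependent posteriors $Q$; the tightness of the bound is governed by the complexity term $\mathrm{KL}(Q\,\|\,P)$.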