On L2 posterior contraction rates in Bayesian nonparametric regression models

RCLW04 - Early Career Pioneers in Uncertainty Quantification and AI for Science

The nonparametric regression model with normal errors has been extensively studied from both the frequentist and the Bayesian viewpoint. A central result in Bayesian nonparametrics states that, under assumptions on the prior, on the data-generating distribution (assuming a true frequentist model), and on a semi-metric d(.,.) on the space of regression functions satisfying the so-called testing condition, the posterior contracts around the true distribution with respect to d(.,.), and the rate of contraction can be estimated. In the regression setting, d(.,.) is often taken to be the Hellinger distance or the empirical L2 norm, i.e., the L2 norm with respect to the empirical distribution of the design. Extending contraction rates to the integrated L2 norm, however, usually requires more work, and has previously been achieved, for instance, under smoothness or boundedness assumptions that need not hold in general. In this work we show that, for priors based on truncated random basis expansions and in the random design setting, a high-probability two-sided inequality between the empirical L2 norm and the integrated L2 norm holds on appropriate spaces of low-frequency functions, under mild assumptions on the underlying basis (which can be, for instance, a Fourier, wavelet or Laplace eigenfunction basis). This allows us to deduce an integrated L2 contraction rate directly from an empirical L2 one, without further assumptions on the true regression function. Time allowing, we will also discuss extensions to Gaussian process priors and to semi-supervised learning on graphs.
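The norm comparison at the heart of the abstract can be checked numerically. The sketch below (an illustration only, not the authors' proof, and assuming a uniform random design on [0,1] with the orthonormal cosine basis sqrt(2)cos(pi k x)) draws a random low-frequency function and compares its empirical L2 norm, computed from design points, with its integrated L2 norm, which by orthonormality equals the Euclidean norm of the coefficient vector:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, theta):
    """Truncated expansion sum_k theta_k * sqrt(2)*cos(pi*k*x) on [0,1].
    The basis is orthonormal in L2([0,1]), so the integrated L2 norm
    of f is exactly the Euclidean norm of theta."""
    k = np.arange(1, len(theta) + 1)
    return np.sqrt(2) * np.cos(np.pi * np.outer(x, k)) @ theta

K, n = 10, 20000                       # truncation level (low frequencies) and sample size
theta = rng.standard_normal(K)         # random coefficients of the basis expansion
x = rng.uniform(0.0, 1.0, size=n)      # random (uniform) design points

empirical_l2 = np.sqrt(np.mean(f(x, theta) ** 2))   # ||f||_n, empirical L2 norm
integrated_l2 = np.linalg.norm(theta)               # ||f||_{L2}, by orthonormality

print(empirical_l2 / integrated_l2)    # close to 1 when n is large relative to K
```

When n is large compared to the truncation level K, the ratio concentrates near 1, which is the high-probability two-sided comparison the abstract exploits; for K comparable to n the two norms can differ substantially, which is why restricting to low-frequency spaces matters.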

This talk is part of the Isaac Newton Institute Seminar Series.
