
Bayesian semiparametrics with Gaussian process priors


If you have a question about this talk, please contact Richard Nickl.

In this talk we will first give a brief introduction to the Bayesian nonparametric approach. Given a model parametrized by an unknown function or high-dimensional parameter, one puts an a priori probability distribution on it and studies the posterior distribution, that is, the conditional distribution of the parameter given the data, obtained from the prior via Bayes' rule. We will then turn to Bayesian estimation in semiparametric models.
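In symbols (a standard formulation, not notation taken from the talk itself): given i.i.d. observations \(X_1,\dots,X_n\) with densities \(p_f\) indexed by the unknown parameter \(f\), a prior \(\Pi\) on \(f\) yields the posterior

\[
\Pi\bigl(f \in B \mid X_1,\dots,X_n\bigr)
= \frac{\int_B \prod_{i=1}^n p_f(X_i)\,\Pi(df)}{\int \prod_{i=1}^n p_f(X_i)\,\Pi(df)},
\]

i.e. the prior reweighted by the likelihood and renormalized.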

A semiparametric model typically consists of a finite-dimensional parametric part of interest and a nonparametric part, the nuisance parameter. Putting a prior distribution on both, we are interested in the behavior of the marginal posterior of the parameter of interest. A desirable convergence result is the so-called Bernstein-von Mises theorem, which asserts that this marginal posterior converges asymptotically to a normal distribution centered at an efficient estimator.
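A common way to state this (standard notation, assumed here rather than quoted from the talk): writing \(\theta\) for the parameter of interest, the Bernstein-von Mises theorem asserts

\[
\bigl\| \Pi\bigl(\theta \in \cdot \mid X_1,\dots,X_n\bigr)
- N\bigl(\hat\theta_n,\; n^{-1}\tilde I_{\theta_0}^{-1}\bigr) \bigr\|_{TV}
\;\longrightarrow\; 0
\]

in probability under the true parameter, where \(\hat\theta_n\) is an efficient estimator and \(\tilde I_{\theta_0}\) is the efficient Fisher information accounting for the presence of the nuisance parameter. A practical consequence is that Bayesian credible sets for \(\theta\) are then asymptotically valid frequentist confidence sets.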

We will discuss a set of sufficient conditions for obtaining this result, focusing mostly on the case where the prior on the nonparametric part is a Gaussian process. An important role is played by how well the model can be approximated within the reproducing kernel Hilbert space (RKHS) of the prior. We illustrate the result on a few examples, including the Cox proportional hazards model and a problem of curve alignment. In particular, we will see that not all reasonable-looking nonparametric priors lead to good semiparametric properties.
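As a minimal illustration of the kind of prior discussed here (a sketch in Python with NumPy, not code from the talk), one can draw sample paths from a mean-zero Gaussian process prior with a squared-exponential kernel; the RKHS of this kernel, which governs the approximation property mentioned above, consists of very smooth functions.

```python
import numpy as np

def rbf_kernel(x, y, length_scale=0.2):
    # Squared-exponential (RBF) covariance kernel k(x, y).
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / length_scale**2)

def sample_gp_prior(grid, n_samples=3, seed=0, jitter=1e-9):
    # Draw sample paths f ~ GP(0, k), evaluated on a finite grid,
    # via a Cholesky factor of the (jittered) covariance matrix.
    rng = np.random.default_rng(seed)
    K = rbf_kernel(grid, grid) + jitter * np.eye(len(grid))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(grid), n_samples))

grid = np.linspace(0.0, 1.0, 50)
paths = sample_gp_prior(grid)  # shape (50, 3): three prior draws
```

The length scale controls the smoothness of the draws; choosing it well (or putting a hyperprior on it) is part of what separates priors with good semiparametric behavior from those without.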

This talk is part of the Statistics series.



