A new MCMC hybrid scheme for Poisson-Kingman Bayesian Nonparametric mixture models

According to Ghahramani, models with a nonparametric component offer flexibility that can lead to better predictive performance: their capacity to learn does not saturate, so their predictions should continue to improve as more data arrive. Furthermore, the Bayesian paradigm lets us fully account for our uncertainty about predictions. However, a major impediment to the widespread use of Bayesian nonparametric models is the problem of inference. Over the years, many Markov chain Monte Carlo (MCMC) methods have been proposed for this task, usually relying on a tailored representation of the underlying process. This remains an active research area, since the infinite-dimensional component forbids the direct use of standard simulation-based methods for posterior inference. Existing methods usually require a finite-dimensional representation, and there are two main sampling approaches that facilitate simulation in Bayesian nonparametric mixture models: random truncation and marginalization. These two schemes are known in the literature as conditional and marginal samplers, respectively.
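To make the marginalization idea concrete, here is a minimal sketch of sampling a partition from the Chinese restaurant process, the marginal law of the Dirichlet process (a special case of the Poisson-Kingman class) with the infinite-dimensional random measure integrated out. The function name and interface are illustrative, not from the talk:

```python
import random

def crp_assignments(n, alpha, seed=0):
    """Draw cluster assignments for n points from a Chinese restaurant
    process with concentration alpha. This is the marginal partition
    distribution of a Dirichlet process mixture: the random measure
    itself is never represented, only the induced clustering."""
    rng = random.Random(seed)
    assignments = []  # cluster index of each point
    counts = []       # current cluster sizes
    for i in range(n):
        # Point i joins existing cluster k w.p. counts[k] / (i + alpha)
        # and starts a new cluster w.p. alpha / (i + alpha).
        u = rng.random() * (i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if u < acc:
                assignments.append(k)
                counts[k] += 1
                break
        else:
            assignments.append(len(counts))
            counts.append(1)
    return assignments, counts

parts, sizes = crp_assignments(100, alpha=1.0)
print(len(sizes), sum(sizes))  # number of clusters; sizes sum to 100
```

A marginal Gibbs sampler for a mixture model resamples each point's assignment from exactly these prior weights multiplied by the data likelihood; a conditional sampler would instead truncate and explicitly instantiate the random weights.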

In this talk, I will review existing schemes and introduce a novel MCMC scheme for posterior sampling in Bayesian nonparametric mixture models with priors in the general Poisson-Kingman class. The scheme relies on a new, compact way of representing the infinite-dimensional component of the model: while this infinite component is represented explicitly, the scheme has lower memory and storage requirements than previous MCMC schemes. Furthermore, in the flavour of probabilistic programming, we view our contribution as a step towards wider usage of flexible Bayesian nonparametric models, as it allows automated inference in probabilistic programs built out of a wide variety of Bayesian nonparametric building blocks. I will present comparative simulation results demonstrating the efficacy of the proposed MCMC algorithm against existing marginal and conditional MCMC samplers for the σ-stable Poisson-Kingman subclass.

Joint work with Yee Whye Teh and Stefano Favaro.

This talk is part of the Machine Learning @ CUED series.
