
Novel MCMC and SMC schemes for Poisson-Kingman Bayesian Nonparametric mixture models


If you have a question about this talk, please contact Louise Segar.

According to Ghahramani, models with a nonparametric component offer greater flexibility that can lead to better predictive performance: their capacity to learn does not saturate, so their predictions should continue to improve as more data arrive. Furthermore, the Bayesian paradigm allows uncertainty about predictions to be fully accounted for. However, a major impediment to the widespread use of Bayesian nonparametric models is the problem of inference. Over the years, many Markov chain Monte Carlo (MCMC) methods have been proposed, but they have the shortcoming of not being general purpose: they usually rely on a tailored representation of the underlying process. This remains an active research area because the infinite-dimensional component forbids the direct use of standard simulation-based methods. Existing methods require a finite-dimensional representation, and there are two main sampling approaches to facilitate simulation: random truncation and marginalization. These two schemes are known in the literature as conditional and marginal samplers, respectively.

In this talk, I will review existing inference schemes and introduce a novel MCMC scheme for posterior sampling in Bayesian nonparametric mixture models with priors that belong to the general Poisson-Kingman class. The scheme relies on a characterization derived from the size-biased sampling generative process for Poisson-Kingman priors. It leads to a new, compact way of representing the infinite-dimensional component of the model: while this component is represented explicitly, the scheme has smaller memory and storage requirements than previous MCMC schemes. I will present comparative simulation results demonstrating the efficacy of the proposed MCMC algorithm against existing marginal and conditional MCMC samplers for the σ-stable Poisson-Kingman subclass. Surprisingly, the size-biased sampling characterization can also be used to build a Sequential Monte Carlo (SMC) sampler that performs inference in a sequential scenario. I will briefly introduce our SMC scheme and present its computational performance.
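To give a flavour of the size-biased representation underlying the talk, here is a minimal sketch (not the hybrid sampler from the paper) of drawing weights in size-biased order for the Pitman-Yor process, a well-known member of the σ-stable Poisson-Kingman class. The function name and parameterisation are our own for illustration; it uses the standard two-parameter stick-breaking construction, where the k-th stick proportion is Beta(1 − σ, θ + kσ).

```python
import numpy as np

def py_stick_breaking(sigma, theta, n_sticks, seed=None):
    """Draw the first `n_sticks` weights of a Pitman-Yor(sigma, theta)
    random measure in size-biased order via stick-breaking.

    Illustrative sketch only: the talk's hybrid sampler uses a more
    general size-biased characterization of Poisson-Kingman priors.
    """
    rng = np.random.default_rng(seed)
    weights = np.empty(n_sticks)
    remaining = 1.0  # probability mass not yet allocated
    for k in range(n_sticks):
        # k-th stick proportion: Beta(1 - sigma, theta + (k + 1) * sigma)
        v = rng.beta(1.0 - sigma, theta + (k + 1) * sigma)
        weights[k] = v * remaining
        remaining *= 1.0 - v
    return weights

w = py_stick_breaking(sigma=0.5, theta=1.0, n_sticks=20, seed=0)
print(w.sum())  # strictly below 1: the unrepresented tail keeps the rest
```

Representing only the finitely many size-biased weights actually needed (plus the leftover mass) is what lets such schemes avoid a fixed truncation of the infinite-dimensional component.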

In the spirit of probabilistic programming, we view our contributions as a step towards wider use of flexible Bayesian nonparametric models: they allow automated inference in probabilistic programs built out of a wide variety of Bayesian nonparametric building blocks.

Main references

Lomeli, M., Favaro, S. and Teh, Y. W., 2015, “A hybrid sampler for Poisson-Kingman mixture models”, Neural Information Processing Systems.

Lomeli, M., Favaro, S., Jacob, P. E. and Teh, Y. W., “An SMC sampler for Gibbs-type infinite and finite mixture models”, in preparation.

This talk is part of the Machine Learning @ CUED series.

