A new MCMC hybrid scheme for Poisson-Kingman Bayesian Nonparametric mixture models
If you have a question about this talk, please contact dsu21.

According to Ghahramani, models with a nonparametric component give us more flexibility, which can lead to better predictive performance: their capacity to learn does not saturate, so their predictions should continue to improve as we get more and more data. Furthermore, the Bayesian paradigm lets us fully account for our uncertainty about predictions. However, a major impediment to the widespread use of Bayesian nonparametric models is the problem of inference. Over the years, many Markov chain Monte Carlo (MCMC) methods have been proposed to perform inference, usually relying on a tailored representation of the underlying process. This remains an active research area, since the infinite-dimensional component forbids the direct use of standard simulation-based methods for posterior inference. Existing methods usually require a finite-dimensional representation, and there are two main sampling approaches that facilitate simulation in Bayesian nonparametric mixture models: random truncation and marginalization. These two schemes are known in the literature as conditional and marginal samplers, respectively.

In this talk, I will review existing schemes and introduce a novel MCMC scheme for posterior sampling in Bayesian nonparametric mixture models with priors belonging to the general Poisson-Kingman class. The scheme relies on a new, compact way of representing the infinite-dimensional component of the model: while representing this infinite component explicitly, it has lower memory and storage requirements than previous MCMC schemes. Furthermore, in the flavour of probabilistic programming, we view our contribution as a step towards wider usage of flexible Bayesian nonparametric models, as it allows automated inference in probabilistic programs built out of a wide variety of Bayesian nonparametric building blocks.
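To make the conditional/marginal distinction concrete, here is a minimal sketch (not the scheme presented in the talk) of the two standard representations for the simplest Poisson-Kingman member, the Dirichlet process: random truncation explicitly instantiates a finite number of weights of the random measure via stick-breaking, while marginalization integrates the weights out and works with cluster assignments through the Chinese restaurant process. The function names and the truncation level K are illustrative choices, not notation from the talk.

```python
import numpy as np

# --- Conditional approach (random truncation): represent the random measure ---
def stick_breaking_weights(alpha, K, rng):
    """Truncated stick-breaking weights for a DP(alpha).

    Draws v_k ~ Beta(1, alpha) and sets w_k = v_k * prod_{j<k} (1 - v_j),
    keeping only K atoms of the infinite random measure.
    """
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # fold the leftover stick mass into the last atom
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return w

# --- Marginal approach: integrate the weights out (Chinese restaurant process) ---
def crp_assignments(alpha, n, rng):
    """Sample cluster labels for n points from the DP's marginal (the CRP).

    Only cluster sizes are stored; the infinite weight sequence never
    appears, which is the defining trait of marginal samplers.
    """
    counts, labels = [], []
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()  # sit at an occupied table or open a new one
        z = rng.choice(len(probs), p=probs)
        if z == len(counts):
            counts.append(1)
        else:
            counts[z] += 1
        labels.append(z)
    return labels

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=2.0, K=50, rng=rng)
labels = crp_assignments(alpha=2.0, n=100, rng=rng)
```

The trade-off the abstract alludes to is visible here: the conditional sketch must store (a truncation of) the weights themselves, whereas the marginal sketch stores only occupancy counts but loses direct access to the random measure.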
I will present comparative simulation results demonstrating the efficacy of the proposed MCMC algorithm against existing marginal and conditional MCMC samplers for the σ-Stable Poisson-Kingman subclass. This is joint work with Yee Whye Teh and Stefano Favaro.

This talk is part of the Machine Learning @ CUED series.