Bayesian Nonparametrics: Latent Feature and Prediction Models, and Efficient Inference
If you have a question about this talk, please contact Zoubin Ghahramani.
Nonparametric Bayesian approaches offer a flexible modeling paradigm in which model complexity is not fixed a priori but can instead grow adaptively with the data. The Indian Buffet Process (IBP) is an example of such a model: a set of observations is assumed to be generated from a small set of latent features, and the number of latent features need not be known in advance. In this talk, I will describe some of my recent work on IBP-based models, in particular: (1) a variant of the IBP that removes the assumption of independent latent features and allows the features to be related via a hierarchy; (2) a nonparametric Bayesian multitask learning model that uses a combination of the Dirichlet Process mixture model and the IBP as the prior distribution on the weight vectors of multiple tasks; and (3) an efficient, search-based inference method for finding an approximate MAP estimate of the latent feature assignment matrix in IBP-based models.
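To make the IBP prior concrete, here is a minimal sketch (not code from the talk) of the standard IBP generative process under its restaurant metaphor: customer i samples each previously taken dish k with probability m_k / i, where m_k is the number of earlier customers who took dish k, and then takes a Poisson(alpha / i) number of new dishes. The function name and interface below are illustrative, not from any particular library.

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng=None):
    """Draw one binary latent-feature matrix Z from an IBP(alpha) prior.

    Z[i, k] = 1 means observation i possesses latent feature k; the
    number of columns (features) is not fixed in advance.
    """
    rng = np.random.default_rng(rng)
    dish_counts = []   # m_k: how many customers have taken dish k so far
    assignments = []   # per-customer lists of chosen dish indices
    for i in range(1, num_customers + 1):
        chosen = []
        # Take each existing dish k with probability m_k / i.
        for k, m_k in enumerate(dish_counts):
            if rng.random() < m_k / i:
                chosen.append(k)
        # Take a Poisson(alpha / i) number of brand-new dishes.
        for _ in range(rng.poisson(alpha / i)):
            chosen.append(len(dish_counts))
            dish_counts.append(0)
        for k in chosen:
            dish_counts[k] += 1
        assignments.append(chosen)
    # Assemble the binary assignment matrix Z.
    Z = np.zeros((num_customers, len(dish_counts)), dtype=int)
    for i, chosen in enumerate(assignments):
        Z[i, chosen] = 1
    return Z

Z = sample_ibp(num_customers=10, alpha=2.0, rng=0)
```

The number of columns of Z varies from draw to draw (its expectation grows as alpha times the harmonic number of the sample size), which is exactly the sense in which the model's complexity adapts to the data.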
This talk is part of the Machine Learning @ CUED series.