
Inference for infinite mixture models and Gaussian Process mixtures of experts using simple approximate MAP Inference


If you have a question about this talk, please contact sara.wade.

The Dirichlet process mixture (DPM) is a ubiquitous and flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so computationally intensive techniques such as Gibbs sampling are required. As a result, DPM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. We develop a simplified yet statistically rigorous approximate maximum a-posteriori (MAP) inference algorithm for DPMs. This algorithm is as simple as K-means clustering and, in experiments, performs as well as Gibbs sampling while requiring only a fraction of the computational effort. Finally, we demonstrate how this approach can be used to perform inference for infinite mixtures of Gaussian process experts.
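To give a flavour of the "as simple as K-means" claim, here is a minimal sketch of a related hard-assignment scheme, DP-means (Kulis & Jordan, 2012): K-means with a penalty that opens a new cluster whenever no existing centre is close enough, so the number of clusters is inferred from the data rather than fixed. This is an illustrative stand-in, not the speakers' exact algorithm; the penalty value `lam` and the toy data are assumptions made for the example.

```python
# Illustrative DP-means-style sketch (Kulis & Jordan, 2012), in the spirit of
# K-means-like MAP approximations to DPM inference. Not the talk's algorithm.
import numpy as np

def dp_means(X, lam, max_iter=100):
    """K-means-like clustering that opens a new cluster when the squared
    distance to the nearest centre exceeds `lam`; K is inferred."""
    centres = [X.mean(axis=0)]  # start with one cluster at the global mean
    for _ in range(max_iter):
        # Assignment step: nearest centre, or a brand-new cluster if too far.
        assign = []
        for x in X:
            d2 = [np.sum((x - c) ** 2) for c in centres]
            j = int(np.argmin(d2))
            if d2[j] > lam:          # penalty exceeded: open a new cluster
                centres.append(x.copy())
                j = len(centres) - 1
            assign.append(j)
        assign = np.array(assign)
        # Update step: recompute centres as cluster means, dropping empties.
        new_centres = [X[assign == j].mean(axis=0)
                       for j in range(len(centres)) if np.any(assign == j)]
        if len(new_centres) == len(centres) and all(
                np.allclose(a, b) for a, b in zip(centres, new_centres)):
            break                     # converged: assignments and centres fixed
        centres = new_centres
    return assign, np.array(centres)

# Toy data (assumed for illustration): two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])
labels, centres = dp_means(X, lam=1.0)
print(len(centres))  # two clusters discovered without fixing K in advance
```

Each pass costs O(NK) distance computations, like a K-means iteration, which is the source of the large speed-up over sampling-based DPM inference.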

This talk is part of the Machine Learning @ CUED series.


© 2006-2020 Talks.cam, University of Cambridge.