
Bayesian nonparametric dynamic-clustering and genetic imputation


If you have a question about this talk, please contact Zoubin Ghahramani.

I will describe new approaches to dynamic-clustering based on Bayesian nonparametric (BNP) hidden Markov models (HMMs), apply them to genotype imputation problems, and illustrate the practical benefits of BNP methods. Genetic similarity within a population is a function of chromosome position, and dynamic-clustering based on parametric HMMs is a popular way to model genetic structure. BNP priors are well suited as extensions of, or competitors to, these HMMs because many aspects of genetic processes (such as allele sampling) arise naturally from BNP models. In addition, BNP priors provide several practical benefits over parametric HMMs. First, by defining probability distributions directly on the set of partitions, BNP priors avoid label-switching problems. Second, they remove the need for costly model selection and ad hoc methods for choosing the number of latent clusters. Finally, the flexibility of BNP models often yields state-of-the-art imputation accuracy. I will conclude with directions for future work, including abstracting the auxiliary Gibbs scheme I derived for inference into probabilistic programming for BNP models.
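To illustrate the point about priors on partitions: the Chinese restaurant process (a standard BNP construction, not code from the talk) assigns probability to groupings of items directly, with no fixed cluster labels and no preset number of clusters. A minimal sketch, assuming a concentration parameter `alpha`:

```python
import random

def sample_crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from a Chinese restaurant process.

    The prior is over partitions (unlabelled groupings), which is one way
    BNP models sidestep label switching, and the number of clusters is
    determined by the draw rather than fixed in advance. Illustrative
    sketch only; names and parameters here are assumptions.
    """
    rng = random.Random(seed)
    clusters = []  # each cluster is a list of item indices
    for i in range(n):
        # item i joins an existing cluster c with probability |c| / (i + alpha),
        # or opens a new cluster with probability alpha / (i + alpha)
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        placed = False
        for c in clusters:
            acc += len(c)
            if r < acc:
                c.append(i)
                placed = True
                break
        if not placed:
            clusters.append([i])
    return clusters

print(sample_crp_partition(10, alpha=1.0))
```

Under this prior the expected number of clusters grows slowly (roughly as alpha log n), so model selection over the cluster count is replaced by inference over alpha.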

This talk is part of the Machine Learning @ CUED series.

