
Nonparametric Bayesian Learning of Switching Dynamical Systems


If you have a question about this talk, please contact Zoubin Ghahramani.

The hierarchical Dirichlet process hidden Markov model (HDP-HMM) is a flexible, nonparametric model that allows state spaces of unknown size to be learned from data. In this talk, we demonstrate some limitations of the original HDP-HMM formulation and propose a sticky extension which allows more robust learning of smoothly varying dynamics. Using DP mixtures, this formulation also allows learning of more complex, multimodal emission distributions. Although the HDP-HMM and its sticky extension are very flexible time series models, they make the strong Markovian assumption that observations are conditionally independent given the state. This assumption is often insufficient for capturing the temporal dependencies of the observations in real data. To address this issue, we develop two extensions of the sticky HDP-HMM for learning switching dynamical processes: the switching linear dynamical system (SLDS) and the switching vector autoregressive (VAR) process.
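To give a feel for the sticky mechanism, here is a minimal Python sketch (not the authors' code) of drawing transition rows from a truncated HDP with an added self-transition bias. The truncation level `L` and the concentration parameters `alpha`, `kappa`, `gamma` are illustrative choices; `kappa` is the "sticky" weight that biases each row toward self-transition:

```python
import numpy as np

rng = np.random.default_rng(0)

L = 10                                # truncation level (illustrative)
alpha, kappa, gamma = 1.0, 5.0, 1.0   # illustrative concentrations; kappa = sticky bias

# Global state weights via truncated stick-breaking: beta ~ GEM(gamma)
v = rng.beta(1.0, gamma, size=L)
beta = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
beta /= beta.sum()                    # renormalize after truncation

# Sticky transition rows: pi_j ~ Dirichlet(alpha * beta + kappa * delta_j),
# so mode j places extra mass on staying in mode j
pi = np.empty((L, L))
for j in range(L):
    conc = alpha * beta + kappa * np.eye(L)[j]
    pi[j] = rng.dirichlet(conc)
```

With `kappa` large relative to `alpha`, the sampled transition matrix concentrates mass on its diagonal, which is what yields the more robust learning of smoothly varying dynamics described above.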

We develop a sampling algorithm that combines a truncated approximation to the Dirichlet process with an efficient joint sampling of the mode and state sequences. The utility and flexibility of our models are demonstrated on synthetic data, the NIST speaker diarization database, sequences of dancing honey bees, and the IBOVESPA stock index.
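For a fixed truncation level, the joint sampling of a mode sequence can be sketched with a standard forward filtering-backward sampling pass. This is an illustrative reconstruction of the generic blocked-sampling step, not the authors' implementation; all names and the toy inputs are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def ffbs(pi0, pi, lik):
    """Jointly sample a mode sequence z_{1:T} given an initial distribution
    pi0 (L,), a transition matrix pi (L, L), and per-step observation
    likelihoods lik (T, L)."""
    T, L = lik.shape
    # Forward pass: normalized filtered distributions alpha_t ∝ p(z_t | y_{1:t})
    alpha = np.empty((T, L))
    alpha[0] = pi0 * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ pi) * lik[t]
        alpha[t] /= alpha[t].sum()
    # Backward pass: sample z_T, then z_t | z_{t+1}, y_{1:t} in reverse
    z = np.empty(T, dtype=int)
    z[-1] = rng.choice(L, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * pi[:, z[t + 1]]
        z[t] = rng.choice(L, p=w / w.sum())
    return z
```

Sampling the whole sequence in one block, rather than one state at a time, avoids the slow mixing that coordinate-wise Gibbs updates exhibit when dynamics are strongly persistent.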

Joint work with Erik Sudderth, Michael Jordan, and Alan Willsky.

This talk is part of the Machine Learning @ CUED series.




© 2006-2024, University of Cambridge.