

## Beam Sampling for Infinite Hidden Markov Models

- Jurgen Van Gael
- Wednesday 02 April 2008, 14:00-15:00
- Engineering Department, CBL Room 438.
If you have a question about this talk, please contact Carl Edward Rasmussen.

The Infinite Hidden Markov Model (iHMM) [1,2] is an extension of the classical Hidden Markov Model, which is widely used in machine learning and bioinformatics. As a tool for modelling sequential data, Hidden Markov Models suffer from the need to specify the number of hidden states in advance. Although model selection and model averaging are widely used in this context, the Infinite Hidden Markov Model offers a nonparametric alternative. The core idea of the iHMM is to use Dirichlet Processes to define the distribution over the rows of the Markov model's transition matrix. As a result, the number of states in use can be adapted automatically during learning, or integrated over for prediction.

Until now, the Gibbs sampler was the only known inference algorithm for the iHMM. This is unfortunate, as the Gibbs sampler is known to perform poorly on strongly correlated data, which is often the case for sequential or time-series data. Moreover, it is surprising that we have powerful inference algorithms for finite HMMs (the forward-backward and Baum-Welch dynamic programming algorithms) but cannot apply these methods to the iHMM.

In this work, we propose a method called the "Beam Sampler", which combines ideas from slice sampling and dynamic programming for inference in the iHMM. We show that the beam sampler has some interesting properties: (1) it is less susceptible to strong correlations in the data than the Gibbs sampler, and (2) it handles non-conjugacy in the model more easily than the Gibbs sampler. We also show that the scope of the beam sampler idea goes beyond training the Infinite Hidden Markov Model: it can also be used to train finite HMMs efficiently.

This talk is part of the Machine Learning @ CUED series.
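The core trick the abstract describes, replacing each transition probability with a slice variable and then running forward filtering and backward sampling only over transitions above the slice, can be sketched on a small finite HMM, where the truncation argument is easiest to see. This is a minimal illustration, not the authors' implementation: all sizes and variable names, the discrete emission model, and the use of row 0 of the transition matrix as the initial distribution are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite HMM with discrete emissions (sizes and data are assumptions
# for illustration only).
K, V, T = 4, 3, 50                         # states, symbols, sequence length
trans = rng.dirichlet(np.ones(K), size=K)  # transition matrix, one row per state
emit = rng.dirichlet(np.ones(V), size=K)   # emission probabilities per state
obs = rng.integers(0, V, size=T)           # observed symbols
states = rng.integers(0, K, size=T)        # current hidden-state assignment

def beam_sweep(states, obs, trans, emit, rng):
    """One beam-sampling sweep: sample slice variables, then run
    forward filtering / backward sampling restricted to the slice."""
    T, K = len(obs), trans.shape[0]
    # 1. Slice variables u_t ~ Uniform(0, trans[s_{t-1}, s_t]); row 0 of
    #    `trans` doubles as the initial distribution in this sketch.
    prev = np.concatenate(([0], states[:-1]))
    u = rng.uniform(0, trans[prev, states])
    # 2. Forward pass: each transition probability is replaced by the
    #    indicator 1[trans[j, k] > u_t], so only finitely many states
    #    would survive even under the iHMM's infinite transition matrix.
    alpha = np.zeros((T, K))
    alpha[0] = emit[:, obs[0]] * (trans[0] > u[0])
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = emit[:, obs[t]] * (alpha[t - 1] @ (trans > u[t]))
        alpha[t] /= alpha[t].sum()
    # 3. Backward pass: sample a new trajectory, again allowing only
    #    transitions that lie above the slice.
    new = np.zeros(T, dtype=int)
    new[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * (trans[:, new[t + 1]] > u[t + 1])
        new[t] = rng.choice(K, p=w / w.sum())
    return new

new_states = beam_sweep(states, obs, trans, emit, rng)
```

The current state sequence always stays above its own slice, so the forward messages and backward weights are never all zero. In the iHMM the same sweep applies, with the slice variables additionally bounding how many of the infinitely many states must be instantiated before the forward pass.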