Human Behavior Classification with Infinite Hidden Conditional Random Fields
If you have a question about this talk, please contact Zoubin Ghahramani.
Hidden Conditional Random Fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem, provided the number of hidden states is validated appropriately. The Infinite Hidden Conditional Random Field (iHCRF) is a Hierarchical Dirichlet Process Hidden Conditional Random Field with a countably infinite number of hidden states, which rids us not only of the need to specify a priori a fixed number of hidden states available to the latent variables of the model, but also of the problem of overfitting. In this talk, we will present the model and two approaches to learning it: an effective Markov chain Monte Carlo (MCMC) sampling technique, and a novel variational approach. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs chosen via cross-validation on the difficult tasks of recognizing instances of spontaneous agreement, disagreement, and pain. We assume the audience has a basic understanding of Dirichlet Processes.
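As background for the "countably infinite number of hidden states" idea, the following is a minimal sketch of the standard stick-breaking construction of a Dirichlet process; it is not the iHCRF model itself, and the concentration parameter and truncation level are purely illustrative. It shows how an infinite set of states can be handled in practice because most of the probability mass falls on a small, data-determined number of components.

```python
import numpy as np

def stick_breaking(alpha, truncation=100, seed=None):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draws beta_k ~ Beta(1, alpha) and sets
        pi_k = beta_k * prod_{j<k} (1 - beta_j).
    With a large truncation level, nearly all mass concentrates on a few
    components, which is how a countably infinite number of hidden states
    can be represented with finite computation.
    """
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=truncation)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    return betas * remaining

# Illustrative values only: alpha controls how many states carry mass.
weights = stick_breaking(alpha=2.0, truncation=100, seed=0)
print("states with weight > 1%:", int(np.sum(weights > 0.01)))
```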
This talk is part of the Machine Learning @ CUED series.