BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Nonparametric Bayesian time series models: infini
te HMMs and beyond - Ghahramani\, Z (Cambridge)
DTSTART;TZID=Europe/London:20080620T161000
DTEND;TZID=Europe/London:20080620T171000
UID:TALK12480AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/12480
DESCRIPTION:Hidden Markov models (HMMs) are one of the most wi
dely used statistical models for time series. Trad
itionally\, HMMs have a known structure with a fix
ed number of states and are trained using maximum
likelihood techniques. The infinite HMM (iHMM) all
ows a potentially unbounded number of hidden state
s\, letting the model use as many states as it nee
ds for the data (Beal\, Ghahramani and Rasmussen 2
002). Teh\, Jordan\, Beal and Blei (2006) showed t
hat a form of the iHMM could be derived from the H
ierarchical Dirichlet Process\, and described a Gi
bbs sampling algorithm based on this for the iHMM.
I will talk about recent work we have done on inf
inite HMMs. In particular: we now have a much mor
e efficient inference algorithm based on dynamic p
rogramming\, called 'Beam Sampling'\, which should
make it possible to apply iHMMs to larger problem
s. We have also developed a factorial version of t
he iHMM which makes it possible to have an unbound
ed number of binary state variables\, and can be t
hought of as a time-series generalization of the I
ndian buffet process.\n\nJoint work with Jurgen va
n Gael (Cambridge)\, Yunus Saatci (Cambridge) and
Yee Whye Teh (Gatsby Unit\, UCL).\n
LOCATION:Seminar Room 1\, Newton Institute
CONTACT:Mustapha Amrani
END:VEVENT
END:VCALENDAR