BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Novel MCMC and SMC schemes for Poisson-Kingman Bayesian Nonpara
 metric mixture models - Maria Lomeli (University of Cambridge)
DTSTART;TZID=Europe/London:20160602T103000
DTEND;TZID=Europe/London:20160602T113000
UID:TALK66448AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/66448
DESCRIPTION:According to Ghahramani\, models that have a nonparamet
 ric component give more flexibility\, which could lead to better pre
 dictive performance: their capacity to learn does not saturate\, so thei
 r predictions should continue to improve as we get more and more dat
 a. Furthermore\, uncertainty about predictions can be fully accounte
 d for thanks to the Bayesian paradigm. However\, a major impedime
 nt to the widespread use of Bayesian nonparametric models is the pro
 blem of inference. Over the years\, many Markov chain Monte Carlo (M
 CMC) methods have been proposed\, but they have the shortcoming of n
 ot being general purpose: they usually rely on a tailored represent
 ation of the underlying process. This is an active research area be
 cause dealing with the infinite-dimensional component precludes th
 e direct use of standard simulation-based methods. Existing metho
 ds require a finite-dimensional representation\, and there are tw
 o main sampling approaches to facilitate simulation: random truncat
 ion and marginalization. These two schemes are known in the literat
 ure as conditional and marginal samplers.\n\nIn this talk\, I will revie
 w existing inference schemes and introduce a novel MCMC scheme for poster
 ior sampling in Bayesian nonparametric mixture models with priors t
 hat belong to the general Poisson-Kingman class. This general schem
 e relies on a characterization derived from the size-biased samplin
 g generative process for Poisson-Kingman priors. It leads to a ne
 w\, compact way of representing the infinite-dimensional componen
 t of the model\, with lower memory and storage requirements than pr
 evious MCMC schemes even though this component is represented expli
 citly. I will present comparative simulation results demonstratin
 g the efficacy of the proposed MCMC algorithm against existing marg
 inal and conditional MCMC samplers for the σ-Stable Poisson-Kingma
 n subclass. Surprisingly\, the size-biased sampling characterizatio
 n can also be used to build a Sequential Monte Carlo (SMC) sample
 r\, which allows inference to be performed in a sequential scenari
 o. I will briefly introduce our SMC scheme and present its computat
 ional performance.\n\nIn the spirit of probabilistic programming\, w
 e view our contributions as a step towards wider usage of flexibl
 e Bayesian nonparametric models\, as they allow automated inferenc
 e in probabilistic programs built out of a wide variety of Bayesia
 n nonparametric building blocks.\n\nMain references\n\nLomeli\, M.
 \, Favaro\, S. and Teh\, Y. W.\, 2015\, "A hybrid sampler for Poiss
 on-Kingman mixture models"\, Neural Information Processing Systems.
 \n\nLomeli\, M.\, Favaro\, S.\, Jacob\, P. E. and Teh\, Y. W.\, "An SM
 C sampler for Gibbs-type infinite and finite mixture models"\, In p
 reparation.
LOCATION:Engineering Department\, CBL Room BE-438
CONTACT:Louise Segar
END:VEVENT
END:VCALENDAR