BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Expectation Propagation\, Experimental Design for the Sparse Linear Model - Matthias Seeger (Max Planck Institute for Biological Cybernetics)
DTSTART;TZID=Europe/London:20080220T140000
DTEND;TZID=Europe/London:20080220T150000
UID:TALK10729AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/10729
DESCRIPTION:Expectation propagation (EP) is a novel variational method for approximate\nBayesian inference\, which has given promising results in terms of computational\nefficiency and accuracy in several machine learning applications. It can readily\nbe applied to inference in linear models with non-Gaussian priors\, generalised\nlinear models\, or nonparametric Gaussian process models\, among others.\nI will give an introduction to this framework. Important\naspects of EP are not well-understood theoretically.\nI will highlight some open problems.\n\nI will then show how to address sequential experimental design for a linear model\nwith non-Gaussian sparsity priors\, giving some results in two different machine\nlearning applications. These results indicate that experimental design for these\nmodels may have significantly different properties than for linear-Gaussian models\,\nwhere Bayesian inference is analytically tractable and experimental design seems best understood.\n
LOCATION:Engineering Department\, CBL Room 438
CONTACT:Zoubin Ghahramani
END:VEVENT
END:VCALENDAR