BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Short talks: Mixed Cumulative Distribution Network
s\; Nonparametric Bayesian community discovery in
social networks\; Expectation Propagation for Diri
chlet Process Mixture Models - Charles Blundell\,
Lloyd Elliott and Vinayak Rao\, Gatsby Unit\, UCL
DTSTART;TZID=Europe/London:20101105T133000
DTEND;TZID=Europe/London:20101105T150000
UID:TALK27812AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/27812
DESCRIPTION:Three short talks by PhD students from the Gatsby
Unit\, UCL.\n\nCharles Blundell: Mixed Cumulative
Distribution Networks (Ricardo\nSilva\, Charles Bl
undell and Yee Whye Teh)\n\nAcyclic directed mixed
graphs (ADMGs) are generalizations of DAGs that c
an\nsuccinctly capture much richer sets of conditi
onal independencies\, and are\nespecially useful i
n modeling the effects of latent variables implici
tly.\nUnfortunately\, there are currently no param
eterizations of general ADMGs. In\nthis work we ap
ply recent work on cumulative distribution network
s and copulas\nto propose one general construction
for ADMG models.\n\n\nLloyd Elliott: Nonparametric
Bayesian community discovery in social\nnetworks
(Lloyd Elliott and Yee Whye Teh)\n\nWe introduce a
novel prior on random graphs using a beta process
. The\natoms of the beta process represent commun
ities and the edges of the\ngraph are independent
given the latent community structure. We use\nMCM
C sampling methods to infer the community structur
e and to impute\nmissing links in large social net
work data sets. We use split-merge\nupdates to in
crease the effective sample size of the MCMC chain
and\nimprove the predictive probabilities.\n\n\nV
inayak Rao: Expectation Propagation for Dirichlet
Process Mixture Models (with\nErik Sudderth and Ye
e Whye Teh)\n\nWe explore Expectation Propagation
for approximate inference in the DP\nmixture model
. By considering three related representations of
the DP\n(based on the Polya urn and Chinese restau
rant process)\, we derive\nthree different EP appr
oximation algorithms. The simplest of these is\nth
e approximation studied in Minka and Ghahramani (2
003). While this\ndoes not represent information
about the posterior clustering\nstructure\, the ot
her two novel approaches include additional latent
\nvariables to capture this clustering structure a
nd offer richer\nposterior representations. We als
o elaborate on improvements to the\nbasic EP algor
ithms: reducing computational costs by removing lo
w\nprobability components\, and learning the hyper
parameters of the DP\nmixture model.
LOCATION:Engineering Department\, CBL Room 438
CONTACT:Sinead Williamson
END:VEVENT
END:VCALENDAR