BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning Reading Group @ CUED
SUMMARY:Learning Directed Acyclic Graphs (DAGs) With Conti
nuous Optimization - Dr Pingfan Song\, University
of Cambridge
DTSTART;TZID=Europe/London:20231108T110000
DTEND;TZID=Europe/London:20231108T123000
UID:TALK208207AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/208207
DESCRIPTION:Estimating the structure of directed acyclic graph
s (DAGs) is a challenging problem since the search
space of DAGs is combinatorial and scales super-e
xponentially with the number of nodes. Traditional
approaches rely on various local heuristics for e
nforcing the acyclicity constraint. \n\nRecent adv
ancements have introduced a fundamentally differen
t strategy that formulates DAG learning as a purel
y continuous optimisation problem over real matric
es. This is achieved by capitalising on innovative
\, differentiable acyclicity characterisation func
tions of DAGs. By eliminating the need for combina
torial constraints\, it offers efficient solutions
through standard numerical algorithms. Notably\,
this strategy exhibits several advantages\, includ
ing the detection of large cycles\, improved gradi
ent behaviour\, and faster runtime performance.\n\
nThis talk will introduce a few representative acy
clicity characterisations\, e.g. the trace of the matrix ex
ponential function proposed in the No-Tears paper
(which is based on the idea that powers of an adja
cency matrix contain information about walks and c
ycles)\, and the log-determinant (log-det) function in
troduced in the DAGMA paper (which leverages the n
ilpotency property of DAGs and the property of M-m
atrices). These works open possibilities for more
effective and efficient DAG learning.\n\nReading s
uggestions:\nZheng\, Xun\, Bryon Aragam\, Pradeep K. Rav
ikumar\, and Eric P. Xing. "DAGs with NO TEARS: Contin
uous Optimization for Structure Learning." Advances in Neu
ral Information Processing Systems 31 (2018).\nBello\, Kev
in\, Bryon Aragam\, and Pradeep Ravikumar. "DAGMA: Learn
ing DAGs via M-matrices and a Log-Determinant Acyclici
ty Characterization." Advances in Neural Information Pro
cessing Systems 35 (2022): 8226-8239.\n
LOCATION:Cambridge University Engineering Department\, CBL
Seminar room BE4-38.
CONTACT:Isaac Reid
END:VEVENT
END:VCALENDAR