Learning Directed Acyclic Graphs (DAGs) With Continuous Optimization


If you have a question about this talk, please contact Isaac Reid.

The Zoom link is available upon request (it is sent out on our mailing list, eng-mlg-rcc [at] lists.cam.ac.uk). Sign up to the mailing list via lists.cam.ac.uk to receive reminders.

Estimating the structure of directed acyclic graphs (DAGs) is a challenging problem since the search space of DAGs is combinatorial and scales super-exponentially with the number of nodes. Traditional approaches rely on various local heuristics for enforcing the acyclicity constraint.

Recent advances have introduced a fundamentally different strategy that formulates DAG learning as a purely continuous optimisation problem over real matrices. This is achieved by exploiting differentiable functions that exactly characterise acyclicity. Because the combinatorial constraint is eliminated, the problem can be solved efficiently with standard numerical algorithms. Notably, this strategy offers several advantages, including the detection of large cycles, improved gradient behaviour, and faster runtime.
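Schematically, in the linear setting of the NOTEARS paper, the resulting program takes the form (a sketch; the score function varies across papers)

\min_{W \in \mathbb{R}^{d \times d}} \; \tfrac{1}{2n}\,\lVert X - XW\rVert_F^2 + \lambda \lVert W\rVert_1 \quad \text{subject to} \quad h(W) = 0,

where h is a smooth function that vanishes exactly on weighted adjacency matrices of DAGs; the equality constraint can then be handled with standard machinery such as an augmented Lagrangian.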

This talk will introduce a few representative acyclicity characterisations: the trace of the matrix exponential proposed in the NOTEARS paper (based on the observation that powers of the adjacency matrix encode information about walks and cycles), and the log-determinant (log-det) function introduced in the DAGMA paper (which leverages the nilpotency of DAG adjacency matrices and properties of M-matrices). These works open possibilities for more effective and efficient DAG learning.
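For concreteness, here is a minimal numerical sketch of the two characterisations in Python (the function names and toy matrices below are illustrative, not taken from either paper's released code):

import numpy as np
from scipy.linalg import expm

def h_notears(W):
    # NOTEARS: tr(exp(W ∘ W)) - d, which is zero iff W encodes a DAG.
    # Powers of the non-negative matrix W ∘ W count weighted closed walks,
    # so the trace of its matrix exponential exceeds d exactly when cycles are present.
    d = W.shape[0]
    return np.trace(expm(W * W)) - d

def h_dagma(W, s=1.0):
    # DAGMA: -log det(sI - W ∘ W) + d log s, which is zero iff W encodes a DAG,
    # valid whenever sI - W ∘ W is an M-matrix (s above the spectral radius of W ∘ W).
    d = W.shape[0]
    _, logdet = np.linalg.slogdet(s * np.eye(d) - W * W)
    return -logdet + d * np.log(s)

# Toy 3-node example: an acyclic weight matrix versus one containing a 3-cycle.
W_dag = np.array([[0.0, 0.5, 0.5],
                  [0.0, 0.0, 0.5],
                  [0.0, 0.0, 0.0]])
W_cyc = np.array([[0.0, 0.5, 0.0],
                  [0.0, 0.0, 0.5],
                  [0.5, 0.0, 0.0]])
for h in (h_notears, h_dagma):
    print(h.__name__, h(W_dag), h(W_cyc))  # ~0 for the DAG, strictly positive for the cycle

Both functions are differentiable in W, which is what allows them to serve as equality constraints or penalties inside a standard gradient-based solver.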

Reading suggestions:

Zheng, Xun, Bryon Aragam, Pradeep K. Ravikumar, and Eric P. Xing. “DAGs with NO TEARS: Continuous Optimization for Structure Learning.” Advances in Neural Information Processing Systems 31 (2018).

Bello, Kevin, Bryon Aragam, and Pradeep Ravikumar. “DAGMA: Learning DAGs via M-matrices and a Log-determinant Acyclicity Characterization.” Advances in Neural Information Processing Systems 35 (2022): 8226-8239.

This talk is part of the Machine Learning Reading Group @ CUED series.
