Physics in (Federated) Deep Neural Networks and Beyond: A Parametric Perspective


If you have a question about this talk, please contact Sally Matthews.

Physics is about the mechanisms behind our physical world. By analogy, "physics in deep learning" seeks to understand the mechanisms behind deep learning phenomena and to build effective methods on top of that understanding. This talk focuses on model parameters, the insights behind them, and the algorithms that can be built on these insights, covering: 1) how learning dynamics and the weight-norm landscape emerge in federated deep learning, 2) how data heterogeneity causes parameter drift and how this relates to neural collapse, and 3) how to locate, edit, and inject knowledge in LLM parameters in a continual manner.
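To make the second issue concrete, here is a minimal sketch of how data heterogeneity produces parameter drift between clients in federated averaging. This is an illustrative toy (two synthetic linear-regression clients, plain FedAvg), not the speaker's method; all data, learning rates, and round counts are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, steps=20):
    """Run plain gradient descent on squared loss over one client's local data."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Two clients with heterogeneous data: same inputs, different label rules,
# so their local optima disagree and their updates pull in different directions.
X = rng.normal(size=(100, 2))
y_a = X @ np.array([1.0, 0.0])   # client A's ground-truth weights
y_b = X @ np.array([0.0, 1.0])   # client B's ground-truth weights

w_global = np.zeros(2)
for rnd in range(5):                           # federated rounds (FedAvg)
    w_a = local_sgd(w_global.copy(), X, y_a)   # each client trains locally
    w_b = local_sgd(w_global.copy(), X, y_b)
    drift = np.linalg.norm(w_a - w_b)          # parameter drift this round
    w_global = (w_a + w_b) / 2                 # server averages the updates
    print(f"round {rnd}: drift between client updates = {drift:.3f}")
```

Under homogeneous data (identical label rules) the drift would shrink toward zero; with heterogeneous data it stays bounded away from zero, which is the drift phenomenon the talk examines.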

Bio: Zexi Li is a visiting PhD student at the CaMLSys Lab, University of Cambridge, and a PhD student in Artificial Intelligence at Zhejiang University, China. He focuses on the optimization, generalization, and personalization of deep learning models, especially in federated/collaborative setups, through the lens of mechanistic interpretability and learning dynamics. He has published eight (co-)first-author papers at top-tier machine learning venues, including ICML, ICCV, NeurIPS, and Patterns (Cell Press). Personal website: zexilee.github.io/about-zexili.

This talk is part of the Cambridge ML Systems Seminar Series.


© 2006-2024 Talks.cam, University of Cambridge.