Physics in (Federated) Deep Neural Networks and Beyond: A Parametric Perspective
- 🎤 Speaker: Zexi Li, Zhejiang University
- 🗓 Date & Time: Thursday 14 November 2024, 14:00 - 15:00
- 📍 Venue: Computer Lab, LT1
Abstract
Physics is about the mechanisms behind our physical world. For physics in deep learning, we try to understand the mechanisms behind deep learning phenomena and to build effective methods on that understanding. This talk focuses on model parameters, the insights behind them, and the algorithms that can be built upon those insights, covering: 1) how learning dynamics and the weight-norm landscape emerge in federated deep learning, 2) how data heterogeneity affects parameter drift and its relation to neural collapse, and 3) how to locate, edit, and inject knowledge in LLM parameters in a continual manner.
Bio: Zexi Li is a visiting PhD student at the CaMLSys Lab, University of Cambridge, and a PhD student in Artificial Intelligence at Zhejiang University, China. He focuses on the optimization, generalization, and personalization of deep learning models, especially under federated/collaborative setups, through the lens of mechanistic interpretability and learning dynamics. He has published eight (co-)first-author papers at top-tier machine learning venues, including ICML, ICCV, NeurIPS, and Patterns (Cell Press). Personal website: zexilee.github.io/about-zexili.
Series: This talk is part of the Cambridge ML Systems Seminar Series.
Included in Lists
- All Talks (aka the CURE list)
- bld31
- Cambridge talks
- Computer Lab, LT1
- Department of Computer Science and Technology talks and seminars
- Interested Talks
- School of Technology
- Trust & Technology Initiative - interesting events
- yk449