BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Automated Scientific Discovery with ML Review\, Can We Interpret t
 he Quality of NNs without Testing? - Yi Gu (DAMTP PhD)\, Luca Muscarnera (
 DAMTP PhD)
DTSTART:20251118T153000Z
DTEND:20251118T163000Z
UID:TALK240910@talks.cam.ac.uk
CONTACT:Liz Tan
DESCRIPTION:*Topic 1: A Review of Automated Scientific Discovery with ML (
 Yi Gu):* \nAutomated physics discovery using machine learning is an exci
 ting research direction. A core validation challenge is whether machin
 e learning systems can autonomously reproduce historically famous physi
 cs discoveries with zero prior knowledge. I will review and discuss a s
 eries of research efforts that demonstrate this feasibility. We will se
 e the progress machine learning has achieved at several key levels\, in
 cluding the formation of concepts\, the discovery of equations\, and th
 e identification of physical laws and symmetries.\n\n*Topic 2: Heavy-Ta
 iled Universality Predicts Trends in Test Accuracies for Very Large Pre
 -Trained Deep Neural Networks (Luca Muscarnera):* \nUnderstanding wheth
 er a neural network will generalize is crucial in high-stakes applicat
 ions\, and assessing the reliability of such models is\, unfortunately
 \, a non-trivial challenge. The paper presents a mechanistic-interpret
 ability-based technique to address this problem\, analyzing the statis
 tical properties of the weight matrices of very large neural networks w
 ithout accessing the training data. After developing a suitable theoret
 ical framework\, the authors corroborate their claims by studying the c
 orrelation between their proposed metrics and the test error of large n
 eural networks. In a nutshell\, the paper tries to answer a fundamenta
 l question: can we interpret the quality of learning in NNs without te
 sting it? We will discuss "Heavy-Tailed Universality Predicts Trends i
 n Test Accuracies for Very Large Pre-Trained Deep Neural Networks" by C
 . Martin and M. Mahoney (2020)\, https://arxiv.org/abs/1901.08278
LOCATION:Potter Room Pavilion B2\, Centre for Mathematical Sciences\, Wilb
 erforce Rd\, Cambridge CB3 0WA
END:VEVENT
END:VCALENDAR
