BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Computational Neuroscience Journal Club - Rui Xia\, Edward Young
DTSTART:20231024T130000Z
DTEND:20231024T140000Z
UID:TALK207523@talks.cam.ac.uk
CONTACT:Puria Radmard
DESCRIPTION:Please join us for our fortnightly Computational Neuroscience 
 journal club on Tuesday 24th October at 2pm UK time in the CBL seminar roo
 m\, or online via Zoom.\n\nThe title is ‘Dynamics of Learning in Deep Lin
 ear Networks’\, presented by Rui Xia and Edward Young.\n\nZoom informati
 on: https://eng-cam.zoom.us/j/84204498431?pwd=Um1oU284b1YxWThObGw4ZU9XZitW
 dz09 Meeting ID: 842 0449 8431 Passcode: 684140\n\nDeep neural networks wo
 rk remarkably well across a wide range of applications\, yet a theoretica
 l understanding of their learning dynamics is still lacking. One mystery i
 s their ability to generalize\, even when heavily overparameterized. A le
 ading explanation is that gradient-based optimization induces an implici
 t regularization - a bias towards models of low complexity. We will firs
 t present [1]\, which studies the implicit regularization of gradient des
 cent over deep linear neural networks and shows that adding depth to a ma
 trix factorization strengthens an implicit tendency towards low-rank sol
 utions. We will then cover [2]\, which derives exact closed-form express
 ions for the learning dynamics of such deep linear networks. Time permit
 ting\, we will examine [3]\, which applies this theory to the developmen
 t of semantic categories.\n\n[1] Arora\, S.\, Cohen\, N.\, Hu\, W.\, & L
 uo\, Y. (2019). Implicit regularization in deep matrix factorization. Ad
 vances in Neural Information Processing Systems\, 32.\n[2] Saxe\, A. M.
 \, & Ganguli\, S. (2013). Exact solutions to the nonlinear dynamics of l
 earning in deep linear neural networks. arXiv preprint arXiv:1312.6120.\n
 [3] Saxe\, A. M.\, McClelland\, J. L.\, & Ganguli\, S. (2019). A mathema
 tical theory of semantic development in deep neural networks. Proceeding
 s of the National Academy of Sciences\, 116(23)\, 11537-11546.
LOCATION:CBL Seminar Room\, Engineering Department\, 4th floor Baker build
 ing
END:VEVENT
END:VCALENDAR
