BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Computational Neuroscience Journal Club - Yashar Ahmadian and Wayn
 e Soo
DTSTART:20210615T140000Z
DTEND:20210615T153000Z
UID:TALK161038@talks.cam.ac.uk
CONTACT:Jake Stroud
DESCRIPTION:Please join us for our fortnightly journal club online via Zoo
 m\, where two presenters will jointly present a topic. The next topic is
  ‘Low rank RNNs’\, presented by Yashar Ahmadian and Wayne Soo.\n\nZoom in
 formation: https://us02web.zoom.us/j/84958321096?pwd=dFpsYnpJYWVNeHlJbEFK
 bW1OTzFiQT09 Meeting ID: 841 9788 6178 Passcode: 659046\n\nBiological
  neural networks have connectivity characterised by ordered structure alon
 gside disorder or heterogeneity that cannot be accounted for by known neur
 al features. The structured part of connectivity often takes the form of a
  low-rank matrix\, while heterogeneity is often modeled by a random matrix
  with independent elements. On the other hand\, tasks of low complexity c
 an be implemented by recurrent neural networks (RNNs) that exhibit low-di
 mensional dynamics\, and influential paradigms for training RNNs on such
  tasks (e.g. FORCE learning and reservoir computing) by construction yiel
 d connectivity that is the sum of a random matrix and a low-rank one. The
  computational benefits of the random component of connectivity (which by
  itself can lead to chaotic dynamics) during learning or for task perform
 ance are not clear.\n\nThe first two papers that we will present link the
  low-rank c
 omponent of connectivity to low-dimensional dynamics\, and use dynamic mea
 n-field theory to systematically map the phase diagram of networks with su
 ch connectivity\, when the two components are statistically independent vs
 . correlated\, respectively. The third paper studies gradient-based train
 ing of unrestricted RNNs with random initial connectivity on common neuro
 science tasks\, and shows that the resulting change in connectivity is lo
 w-rank. Moreover\, the authors find a clear benefit of the random initial
  connectivity in speeding up training\, and they provide theoretical insi
 ght into this finding by analytically studying learning in linear RNNs.\n
 \n1) Mastro
 giuseppe\, F. and Ostojic\, S.\, Linking Connectivity\, Dynamics\, and Co
 mputations in Low-Rank Recurrent Neural Networks\, Neuron 99\, 609–623\,
  August 8\, 2018. https://www.sciencedirect.com/science/article/pii/S0896
 627318305439\n\n2) Schuessler\, F. et al.\, Dynamics of random recurrent
  networks with correlated low-rank structure\, Physical Review Research 2
 \, 013111 (2020). https://journals.aps.org/prresearch/abstract/10.1103/Ph
 ysRevResearch.2.013111\n\n3) Schuessler\, F. et al.\, The interplay betwe
 en randomness and structure during learning in RNNs\, NeurIPS 2020. https
 ://arxiv.org/abs/2006.11036
LOCATION:Online on Zoom
END:VEVENT
END:VCALENDAR
