BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Double-descent curves in neural networks: A new perspective using 
 Gaussian processes. - Ouns El Harzli
DTSTART:20210601T140000Z
DTEND:20210601T150000Z
UID:TALK160825@talks.cam.ac.uk
CONTACT:96082
DESCRIPTION:Double-descent curves in neural networks describe the phenomen
 on that the generalisation error initially descends with increasing parame
 ters\, then grows after reaching an optimal number of parameters which is 
 less than the number of data points\, but then descends again in the overp
 arameterised regime.  Here we use a neural network Gaussian process (NNGP)
  which maps exactly to a fully connected network (FCN) in the infinite wid
 th limit\, combined with techniques from random matrix theory\, to calcula
 te this generalisation behaviour\, with a particular focus on the overpara
 meterised regime.  We verify our predictions with numerical simulations of
  the corresponding Gaussian process regressions.  An advantage of our NNGP
  approach is that the analytical calculations are easier to interpret.  We
 argue that neural network generalisation performance improves in the over
 parameterised regime precisely because that is where the network converge
 s to its equivalent Gaussian process.
LOCATION:https://cl-cam-ac-uk.zoom.us/j/96785077311?pwd=akljc1JQVS81R1Fxel
 ZCMUdQR3I2dz09
END:VEVENT
END:VCALENDAR
