BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:On the Training of Infinitely Deep and Wide ResNets - Gabriel Peyr
 é (École Normale Supérieure)
DTSTART:20230525T140000Z
DTEND:20230525T150000Z
UID:TALK193588@talks.cam.ac.uk
CONTACT:Matthew Colbrook
DESCRIPTION:Overparametrization is a key factor in explaining the glob
 al convergence of gradient descent (GD) for neural networks despite t
 he absence of convexity. Besides the well-studied lazy regime\, infin
 ite-width (mean-field) analyses have been developed for shallow netwo
 rks using convex optimization techniques. To bridge the gap between t
 he lazy and mean-field regimes\, we study Residual Networks (ResNets) i
 n which the residual block is linearly parametrized while remaining no
 nlinear. Such ResNets admit both infinite-depth and infinite-width lim
 its\, encoding residual blocks in a Reproducing Kernel Hilbert Space (
 RKHS). In this limit\, we prove a local Polyak-Łojasiewicz inequalit
 y. Thus\, every critical point is a global minimizer\, and a local co
 nvergence result for GD holds\, retrieving the lazy regime. In contra
 st with other mean-field studies\, our analysis applies to both param
 etric and non-parametric cases under an expressivity condition on th
 e residuals. This analysis leads to a practical and quantified recip
 e: starting from a universal RKHS\, Random Fourier Features are appl
 ied to obtain a finite-dimensional parametrization that satisfies ou
 r expressivity condition with high probability.\nThis is joint wor
 k with Raphaël Barboni and François-Xavier Vialard.
LOCATION:Centre for Mathematical Sciences\, MR14
END:VEVENT
END:VCALENDAR
