BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Demystifying Deep Learning - Rob Nowak (U. Wisconsin)
DTSTART:20220317T170000Z
DTEND:20220317T180000Z
UID:TALK135121@talks.cam.ac.uk
CONTACT:HoD Secretary\, DPMMS
DESCRIPTION:Neural networks have made a startling comeback during the past
  decade\, rebranded as "deep learning." The empirical success of neural
  networks is phenomenal but poorly understood. The state of the art seems
  to change every few months\, prompting some to call it alchemy and others
  to suggest that wholly new mathematical approaches are required to
  understand neural networks. Contrary to this\, I argue that deep learning
  can be understood by adapting standard nonparametric statistical theory
  and methods to the neural network setting. Our main result is this:
  neural networks are exact solutions to nonparametric estimation problems
  in "mixed variation" function spaces. These spaces\, characterized by
  notions of total variation in the Radon (transform) domain\, include
  multivariate functions that are very smooth in all but a small number of
  directions. Spatial inhomogeneity of this sort leads to a fundamental gap
  between the performance of neural networks and linear methods (which
  include kernel methods)\, explaining why neural networks can outperform
  classical methods for high-dimensional function estimation. Our theory
  provides new insights into the practices of "weight decay\,"
  "overparameterization\," and adding linear connections and layers to
  network architectures. It yields a deeper understanding of the role of
  sparsity and (avoiding) the curse of dimensionality. Lastly\, the theory
  leads to new and improved neural network architectures and regularization
  methods.\n\nThis talk is based on joint work with Rahul Parhi.\n
LOCATION:Centre for Mathematical Sciences MR2
END:VEVENT
END:VCALENDAR
