BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Understanding the loss landscapes of large neural networks: scalin
 g\, generalization\, and robustness - Stanislav Fort\, Stanford University
DTSTART:20211015T150000Z
DTEND:20211015T160000Z
UID:TALK164239@talks.cam.ac.uk
CONTACT:Pietro Lio
DESCRIPTION:Large deep neural networks trained with gradient descent have 
 been extremely successful at learning solutions to a broad suite of diffic
 ult problems across a wide range of domains. Despite their tremendous succ
 ess\, we still do not have a detailed\, predictive understanding of how th
 ey work and what makes them so effective. In this talk\, I will describe r
 ecent efforts to understand the structure of deep neural network loss land
 scapes and how gradient descent navigates them during training. In particu
 lar\, I will discuss a phenomenological approach to modeling their large-s
 cale structure using high-dimensional geometry [1]\, the role of their non
 linear nature in the early phases of training [2]\, its effects on ensembl
 ing\, calibration\, and approximate Bayesian techniques [3]\, and the ques
 tions of model scaling\, multi-modality\, pre-training and their connectio
 ns to out-of-distribution robustness and generalization [4].\n\n[1] Stani
 slav Fort and Stanislaw Jastrzebski. "Large Scale Structure of Neural Ne
 twork Loss Landscapes." NeurIPS 2019. arXiv 1906.04724\n\n[2] Stanislav
  Fort et al. "Deep learning versus kernel learning: an empirical study of
  loss landscape geometry and the time evolution of the Neural Tangent Ker
 nel." NeurIPS 2020. arXiv 2010.15110\n\n[3] Stanislav Fort\, Huiyi Hu\,
  and Balaji Lakshminarayanan. "Deep Ensembles: A Loss Landscape Perspecti
 ve." arXiv 1912.02757\n\n[4] Stanislav Fort\, Jie Ren\, and Balaji Laksh
 minarayanan. "Exploring the Limits of Out-of-Distribution Detection."
  NeurIPS 2021. arXiv 2106.03004\n
LOCATION:Department of Computer Science and Technology\, Lecture Theatre 1
END:VEVENT
END:VCALENDAR
