BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Information-Theoretic Generalization Bounds for Stochastic Gradien
 t Descent - Gergely Neu
DTSTART:20210420T140000Z
DTEND:20210420T150000Z
UID:TALK159361@talks.cam.ac.uk
CONTACT:96082
DESCRIPTION:We study the generalization properties of the popular stochast
 ic gradient descent method for optimizing general non-convex loss function
 s. Our main contribution is providing upper bounds on the generalization e
 rror that depend on local statistics of the stochastic gradients evaluated
  along the path of iterates calculated by SGD. The key factors our bounds 
 depend on are the variance of the gradients (with respect to the data dist
 ribution) and the local smoothness of the objective function along the SGD
 path\, and the sensitivity of the loss function to perturbations of the f
 inal output. Our key technical tool combines the information-theoretic ge
 neralization bounds previously used for analyzing randomized variants o
 f SGD with a perturbation analysis of the iterates.
LOCATION:https://cl-cam-ac-uk.zoom.us/j/97019034279?pwd=UWNGQ2lhSzZGK09TdX
 VkSHBGRUFZdz09
END:VEVENT
END:VCALENDAR
