Progress Towards Understanding Generalization in Deep Learning
- Speaker: Gintare Karolina Dziugaite, Element AI
- Date & Time: Tuesday 13 April 2021, 15:00–16:00
- Venue: https://us02web.zoom.us/j/83327945534?pwd=SlRTcjVDTVFsNXRNdEczbysvSkpPdz09
Abstract
There is, as yet, no satisfying theory explaining why common learning algorithms, like those based on stochastic gradient descent, generalize in practice on overparameterized neural networks. I will discuss various approaches that have been taken to explain generalization in deep learning, and identify some of the barriers these approaches face. I will then discuss my recent work on information-theoretic and PAC-Bayesian approaches to understanding generalization in noisy variants of SGD. In particular, I will highlight how we can take advantage of conditioning to obtain sharper data- and distribution-dependent generalization measures. I will also briefly touch upon my work on properties of the optimization landscape and some of the challenges we face in incorporating these insights into the theory of generalization.
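As background for the PAC-Bayesian approaches mentioned in the abstract, a standard (McAllester-style) PAC-Bayes generalization bound can be sketched as follows; the notation here is illustrative and not taken from the talk itself. For a prior $P$ fixed before seeing the data and any posterior $Q$ over hypotheses, with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$:

```latex
% A standard PAC-Bayes bound (McAllester-style); notation illustrative.
% L_D(Q): expected risk of a randomized predictor drawn from Q, under distribution D
% L_S(Q): empirical risk of the same predictor on the sample S of size n
% KL(Q || P): KL divergence between the posterior Q and the prior P
\[
  L_{\mathcal{D}}(Q) \;\le\; L_{S}(Q)
  \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
\]
```

Data- and distribution-dependent refinements of such bounds (for example via conditioning, as the abstract describes) aim to make the complexity term sharper for the predictors that SGD-like algorithms actually produce.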
Series: This talk is part of the ML@CL Seminar Series.
Included in Lists
- Hanchen DaDaDash
- ML@CL Seminar Series