BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Representation\, optimization and generalization properties of dee
 p neural networks - Peter Bartlett (University of California\, Berkeley)
DTSTART:20180627T104500Z
DTEND:20180627T113000Z
UID:TALK107434@talks.cam.ac.uk
CONTACT:INI IT
DESCRIPTION:Deep neural networks have improved the state-of-the-art perfor
 mance for prediction problems across an impressive range of application ar
 eas. This talk describes some recent results in three directions. First\, 
 we investigate the impact of depth on representational properties of deep 
 residual networks\, which compute near-identity maps at each layer\, showi
 ng how their representational power improves with depth and that the funct
 ional optimization landscape has the desirable property that stationary po
 ints are optimal. Second\, we study the implications for optimization in d
 eep linear networks\, showing how the success of a family of gradient desc
 ent algorithms that regularize towards the identity function depends on a 
 positivity condition of the regression function. Third\, we consider how t
 he performance of deep networks on training data compares to their predict
 ive accuracy\, we demonstrate deviation bounds that scale with a certain "
 spectral complexity\," and we compare the behavior of these bounds with th
 e observed performance of these networks in practical problems.\n\nJoint w
 ork with Steve Evans\, Dylan Foster\, Dave Helmbold\, Phil Long\, and Matu
 s Telgarsky.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
