BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Neural Ordinary Differential Equations - Eric T Nalisnick (Univers
 ity of Cambridge)
DTSTART:20181128T140000Z
DTEND:20181128T153000Z
UID:TALK115564@talks.cam.ac.uk
CONTACT:75379
DESCRIPTION:Residual connections have enabled SOTA performance on ImageNet
  and the training of extremely deep neural networks (1000+ layers).  They 
 can be thought of as modeling discrete-time changes in the hidden units\,
  i.e.\nh_{l+1} - h_l = F(h_l\, W).  Neural ordinary differential equati
 ons (Neural ODEs) consider a continuous-time representation of ResNets.  Th
 at is\, the ResNet parametrizes the instantaneous change dh/dt = F(h(t)\,
  t)\,\n
 where time t now plays a role akin to network depth.  How is such a networ
 k useful?  How would model fitting and optimization be done? In this meeti
 ng of the reading group\, we will explain the Neural ODE model and trainin
 g algorithm proposed by Chen et al. (NeurIPS 2018\, Oral).  Additionally\,
  we will give overviews of their applications to time series and generativ
 e modeling.
LOCATION:Engineering Department\, CBL Room 438
END:VEVENT
END:VCALENDAR
