BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Reading group: Underpinning techniques of most widely used DNN arc
 hitectures - Nicolai Baldin (Statslab)
DTSTART:20171031T140000Z
DTEND:20171031T150000Z
UID:TALK93700@talks.cam.ac.uk
CONTACT:Nicolai Baldin
DESCRIPTION:In the first session we will look more closely into common tec
 hniques of widely used NN architectures like batch normalisation\, dropout
  and stochastic optimisers. We shall also touch upon regularisation ideas
  and various activation functions.\n\nIt will be roughly based upon the fo
 llowing papers:\n\n1. Batch Normalization: Accelerating Deep Network Train
 ing by Reducing Internal Covariate Shift "pdf":https://arxiv.org/pdf/1502.
 03167.pdf\n\n2. Dropout: A Simple Way to Prevent Neural Networks from Over
 fitting "pdf":https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf\n\n
 3. Adam: A Method for Stochastic Optimization "pdf":https://arxiv.org/pdf/
 1412.6980.pdf\n\nThe first session will be given by the organisers but par
 ticipants are expected to be familiar with the papers. More information ab
 out the reading group can be found at "mathsml.com":http://mathsml.com/rea
 ding-group/
LOCATION:Centre for Mathematical Sciences\, MR2
END:VEVENT
END:VCALENDAR
