BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Dropout as a Structured Shrinkage Prior - Eric Nalisnick (Universi
 ty of Cambridge)
DTSTART:20200228T173000Z
DTEND:20200228T190000Z
UID:TALK140113@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:Dropout regularization of deep neural networks has been a myst
 erious yet effective tool to prevent overfitting. Explanations for its suc
 cess range from the prevention of "co-adapted" weights to its being a form
  of cheap Bayesian inference. We propose a novel framework for understandi
 ng multiplicative noise in neural networks\, considering continuous distri
 butions as well as Bernoulli noise (i.e.\, dropout). We show that multipli
 cative noise induces structured shrinkage priors on a network’s weights. W
 e derive the equivalence through reparametrization properties of scale mix
 tures and without invoking any approximations. We leverage these insights 
 to propose a novel shrinkage framework for ResNets\, terming the prior "au
 tomatic depth determination"\, as it is the natural analog of "automatic r
 elevance determination" for network depth.
LOCATION:Auditorium\, Microsoft Research Ltd\, 21 Station Road\, Cambridge
 \, CB1 2FB
END:VEVENT
END:VCALENDAR
