BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Probabilistic synapses - Laurence Aitchison
DTSTART:20160212T120000Z
DTEND:20160212T130000Z
UID:TALK64511@talks.cam.ac.uk
CONTACT:Prof Máté Lengyel
DESCRIPTION:Organisms face a hard problem: based on noisy sensory input\, 
 they must set a large number of synaptic weights. However\, they do not re
 ceive enough information in their lifetime to learn the correct\, or optim
 al weights (i.e. the weights that ensure the circuit\, system\, and ultima
 tely organism function as effectively as possible). Instead\, the best the
 y could possibly do is compute a probability distribution over the optimal
  weights. Based on this observation\, we hypothesize that synapses
  represent a probability distribution over weights — in contrast to the
  widely held belief that they represent point estimates. From this
  hypothesis\, we de
 rive learning rules for supervised\, reinforcement and unsupervised learni
 ng. This introduces a new feature: the more uncertain the synapse is about
  its weight\, the more plastic it is. This makes intuitive sense: if the u
 ncertainty about a weight is large\, new data should strongly influence it
 s value\, while if the uncertainty is small\, little learning is needed.  
 This hypothesis makes two predictions about how learning rates should vary
  across synapses and across time. We also introduce a second hypothesis\, 
 which is that the more uncertainty there is about a synaptic weight\, the 
 more variable it is. More concretely\, the PSP amplitude at a given time i
 s a sample from the probability distribution describing the synapse's unce
 rtainty.  This hypothesis makes several predictions\, and we present data 
 for one: that variability should increase as the presynaptic firing rate f
 alls.
LOCATION:Cambridge University Engineering Department\, CBL\, BE-438 (http:
 //learning.eng.cam.ac.uk/Public/Directions)
END:VEVENT
END:VCALENDAR
