BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:A three-threshold learning rule approaches the maximal capacity of
  recurrent neural networks - Alireza Alemi\, Politecnico di Torino (Italy)
DTSTART:20150624T103000Z
DTEND:20150624T110000Z
UID:TALK59981@talks.cam.ac.uk
CONTACT:Guillaume Hennequin
DESCRIPTION:Understanding the theoretical foundations of how memories are 
 encoded and retrieved in neural populations is a central challenge in neur
 oscience. A popular theoretical scenario for modeling memory function is t
 he attractor neural network scenario\, whose prototype is the Hopfield mod
 el. The model's simplicity and the locality of the synaptic update rules com
 e at the cost of a poor storage capacity\, compared with the capacity achi
 eved with perceptron learning algorithms. Here\, by transforming the perc
 eptron learning rule\, we present an on-line learning rule for a recurrent
  neural network that achieves near-maximal storage capacity without an exp
 licit supervisory error signal\, relying only upon locally accessible info
 rmation. The fully-connected network consists of excitatory binary neurons
  with plastic recurrent connections and non-plastic inhibitory feedback st
 abilizing the network dynamics\; the patterns to be memorized are p
 resented on-line as strong afferent currents\, producing a bimodal distrib
 ution for the neurons' synaptic inputs. Synapses corresponding to active inp
 uts are modified as a function of the value of the local fields with respe
 ct to three thresholds. Above the highest threshold\, and below the lowes
 t threshold\, no plasticity occurs. In between these two thresholds\, pote
 ntiation/depression occurs when the local field is above/below an intermed
 iate threshold. We simulated and analyzed a network of binary neurons imp
 lementing this rule and measured its storage capacity for different sizes 
 of the basins of attraction. The storage capacity obtained through numeric
 al simulations is shown to be close to the value predicted by analytical c
 alculations. We also measured the dependence of capacity on the strength o
 f external inputs. Finally\, we quantified the statistics of the resulting
  synaptic connectivity matrix\, and found that both the fraction of zero-w
 eight synapses and the degree of symmetry of the weight matrix increase wi
 th the number of stored patterns.
LOCATION:Cambridge University Engineering Department\, CBL\, BE-438 (http:
 //learning.eng.cam.ac.uk/Public/Directions)
END:VEVENT
END:VCALENDAR
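
A minimal sketch of the three-threshold rule described in the abstract above, in numpy. This is an illustrative reading of the abstract, not the speaker's implementation: the function name three_threshold_update, the parameters x_ext, eta and theta_low/theta_mid/theta_high, the fixed step size, the clipping of weights at zero (an assumption consistent with the reported growth of zero-weight synapses), and all numeric values in the usage lines are assumptions; the stabilizing inhibitory feedback mentioned in the abstract is omitted. The three-threshold logic and the restriction of plasticity to synapses with active presynaptic inputs follow the abstract directly.

import numpy as np

def three_threshold_update(W, pattern, x_ext, eta,
                           theta_low, theta_mid, theta_high):
    """One on-line presentation of a binary memory pattern.

    W        -- (N, N) non-negative excitatory recurrent weights, zero diagonal
    pattern  -- (N,) binary vector of desired activities
    x_ext    -- strength of the strong afferent current driving active units
    eta      -- plasticity step size (assumed fixed)
    theta_low < theta_mid < theta_high -- the three thresholds
    """
    # Local field: recurrent input plus the strong external drive; the drive
    # makes the field distribution bimodal, as described in the abstract.
    h = W @ pattern + x_ext * pattern

    # No plasticity above the highest or below the lowest threshold.
    in_band = (h > theta_low) & (h < theta_high)

    # Inside the band: potentiate when the field exceeds the intermediate
    # threshold, depress when it falls below it.
    step = np.where(h > theta_mid, eta, -eta) * in_band

    # Only synapses from active presynaptic inputs are modified.
    dW = np.outer(step, pattern)
    np.fill_diagonal(dW, 0.0)  # no self-connections

    # Excitatory constraint (assumed): weights are clipped at zero.
    return np.clip(W + dW, 0.0, None)

Usage, with arbitrary illustrative values:

rng = np.random.default_rng(0)
N = 200
W = np.zeros((N, N))
patterns = (rng.random((30, N)) < 0.5).astype(float)
for _ in range(100):            # repeated on-line presentations
    for xi in patterns:
        W = three_threshold_update(W, xi, x_ext=2.0, eta=0.01,
                                   theta_low=0.5, theta_mid=1.0,
                                   theta_high=1.5)

The intended reading is that no explicit error signal appears anywhere: the strong afferent current alone pushes the local fields of should-be-active and should-be-inactive neurons to opposite sides of theta_mid, so the same local comparison yields potentiation for the former and depression for the latter, using only information available at the synapse.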
