BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:The unbreakable lightness of single neuron non-linearities in lear
 ning - Yasser Roudi\, Kavli Institute for Systems Neuroscience
DTSTART:20230207T133000Z
DTEND:20230207T143000Z
UID:TALK196780@talks.cam.ac.uk
CONTACT:Samuel Eckmann
DESCRIPTION:A large body of work in the theory of neural networks
  (artificial or biological) has been carried out on networks composed
  of simple activation functions\, most prominently binary units.
  Analysing such networks has led to some general conclusions. For
  instance\, there is a long-held consensus that local biological
  learning mechanisms such as Hebbian learning are very inefficient
  compared to the iterative non-local learning rules used in machine
  learning. In this talk\, I will show that when it comes to memory
  operations\, such a conclusion is an artefact of analysing networks of
  binary neurons: when neurons with graded responses\, more reminiscent
  of the responses of real neurons\, are considered\, memory storage in
  neural networks with Hebbian learning can be very efficient and close
  to optimal performance. Turning to artificial neural networks\, I will
  discuss how non-linearities influence the ability of Restricted
  Boltzmann Machines to express probability distributions over the
  visible nodes\, and how this affects learnability in these
  machines.\n\nRefs:\nSchönsberg\, F.\, Roudi\, Y.\, & Treves\, A.
  (2021). Efficiency of local learning rules in threshold-linear
  associative networks. Physical Review Letters\, 126(1)\, 018301.
 \nBulso\, N.\, & Roudi\, Y. (2021). Restricted Boltzmann machines as
  models of interacting variables. Neural Computation\, 33(10)\,
  2646-2681.
LOCATION:CBL Seminar Room (in person)\, Engineering Department\, 4th floor
  Baker building
END:VEVENT
END:VCALENDAR
