Computational Neuroscience Journal Club
- 👤 Speaker: Guillaume Hennequin (University of Cambridge)
- 📅 Date & Time: Tuesday 27 February 2024, 15:00 - 17:00
- 📍 Venue: CBL Seminar Room, Engineering Department, 4th floor Baker building
Abstract
Please join us for our Computational Neuroscience journal club on Tuesday 27th February at 3pm UK time in the CBL seminar room.
The title is “Alternatives to Backpropagation”, presented by Youjing Yu and Guillaume Hennequin.
Summary:
Backpropagation is one of the most widely used algorithms for training neural networks. However, despite its popularity, there are several arguments against its use, one of the most important being its biological implausibility. In this journal club meeting, we will look at some alternatives that have been developed to backpropagation.
We start by digesting the Forward-Forward algorithm proposed by Geoffrey Hinton [1]. Instead of running one forward pass through the network followed by one backward pass as in backpropagation, the Forward-Forward algorithm utilises two forward passes, one with positive, real data and another with negative, fake data. Each layer in the network has its own objective function, which is to generate high “goodness” for positive data and low “goodness” for negative data. We will dive into the working principles of the algorithm, its effectiveness on small problems and the associated limitations.
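To make the per-layer objective concrete, here is a minimal sketch of a single Forward-Forward layer in numpy. All names and hyperparameters are illustrative, not taken from the paper; goodness is taken to be the sum of squared activations, and each layer maximises the probability (under a logistic function of goodness minus a threshold) of classifying positive data as positive and negative data as negative, using only locally available quantities.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """Illustrative single Forward-Forward layer (toy sketch, not the paper's code)."""

    def __init__(self, n_in, n_out, lr=0.03, threshold=2.0):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr
        self.threshold = threshold

    def forward(self, x):
        # ReLU activations; no backward pass through other layers is ever needed.
        return np.maximum(0.0, x @ self.W)

    def goodness(self, x):
        # "Goodness" = sum of squared activations per sample.
        return (self.forward(x) ** 2).sum(axis=1)

    def train_step(self, x_pos, x_neg):
        # Local objective: sigmoid(goodness - threshold) should be high for
        # positive (real) data and low for negative (fake) data.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            h = self.forward(x)
            g = (h ** 2).sum(axis=1)
            p = 1.0 / (1.0 + np.exp(-sign * (g - self.threshold)))
            coef = sign * (1.0 - p)  # logistic-regression-style weighting
            # Gradient of goodness w.r.t. W, computed entirely within the layer.
            grad = x.T @ (coef[:, None] * 2.0 * h)
            self.W += self.lr * grad / len(x)

layer = FFLayer(4, 8)
x_pos = rng.normal(1.0, 0.5, (32, 4))   # stand-in for "real" data
x_neg = rng.normal(-1.0, 0.5, (32, 4))  # stand-in for "fake" data
for _ in range(200):
    layer.train_step(x_pos, x_neg)

print(layer.goodness(x_pos).mean(), layer.goodness(x_neg).mean())
```

After training, the layer's mean goodness on the positive batch should exceed that on the negative batch, illustrating how each layer can learn a useful representation without any backward pass.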
Next, we will present another cool idea that has been independently re-discovered by several labs, and was perhaps most cleanly articulated in Meulemans et al., NeurIPS 2022 [2]. This idea phrases learning as a least-control problem: a feedback control loop is set up that continuously keeps the learning system (e.g. a neural network) in a state of minimum loss, and learning becomes the problem of progressively doing away with controls. As it turns out, gradient information is available in the control signals themselves, such that learning becomes local. We will give a general introduction and history of this idea, and look into Meulemans et al. in some detail.
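A deliberately tiny toy (our own construction, not the formulation of Meulemans et al.) can convey the flavour of the idea. A controller clamps a one-parameter linear system to its minimum-loss output; the control signal it must apply equals the output error, so a local Hebbian-style update (control signal times presynaptic input) both descends the loss and shrinks the control needed, until the controller has nothing left to do.

```python
import numpy as np

# Toy least-control sketch: one weight w, scalar inputs, target y* = w_true * x.
# The controller instantly holds the output at the minimum-loss state; the
# control u it must supply carries the gradient, making the update local.

rng = np.random.default_rng(1)
w_true = 1.7   # illustrative target weight
w = 0.0
lr = 0.1
control_norms = []

for step in range(100):
    x = rng.normal()
    target = w_true * x
    y_free = w * x            # output the uncontrolled system would produce
    u = target - y_free       # control needed to hold the system at minimum loss
    w += lr * u * x           # local update: control signal times presynaptic input
    control_norms.append(abs(u))

# As w approaches w_true, the control required shrinks toward zero:
# learning has "done away with" the controller.
print(w, np.mean(control_norms[-10:]))
```

The same mechanics generalise (non-trivially) to networks at equilibrium, which is the setting the paper treats rigorously.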
[1] Hinton, Geoffrey. “The forward-forward algorithm: Some preliminary investigations.” arXiv preprint arXiv:2212.13345 (2022).
[2] Meulemans, Alexander, et al. “The least-control principle for local learning at equilibrium.” Advances in Neural Information Processing Systems 35 (2022): 33603-33617.
Series
This talk is part of the Computational Neuroscience series.
Included in Lists
- All Talks (aka the CURE list)
- Biology
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Neuroscience Seminars
- CamBridgeSens
- Cambridge talks
- CBL important
- CBL Seminar Room, Engineering Department, 4th floor Baker building
- Chris Davis' list
- Computational and Biological Learning Seminar Series
- Computational Neuroscience
- custom
- dh539
- Featured lists
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group Journal Clubs
- Inference Group Summary
- Information Engineering Division seminar list
- Interested Talks
- Life Science
- Life Science Interface Seminars
- Life Sciences
- ME Seminar
- my_list
- ndk22's list
- Neuroscience
- Neuroscience Seminars
- ob366-ai4er
- other talks
- Quantum Matter Journal Club
- Required lists for MLG
- rp587
- se456's list
- Stem Cells & Regenerative Medicine
- TQS Journal Clubs
- Trust & Technology Initiative - interesting events
- yk373's list
- yk449
Note: Ex-directory lists are not shown.