BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Hopfield Networks: From Neuroscience to Machine Learning and Back 
 - Youjing Yu\; Tim Lin
DTSTART:20250218T150000Z
DTEND:20250218T163000Z
UID:TALK228595@talks.cam.ac.uk
CONTACT:124819
DESCRIPTION:Hopfield networks\, originally introduced in the 1980s\, are r
 ecurrent neural networks that function as content-addressable memory syste
 ms. While classical Hopfield networks have limited storage capacity (rough
 ly 0.14N patterns for N neurons)\, modern variants leverage continuous sta
 tes and attention-like energy functions to achieve exponential storage cap
 acity. The influential paper Hopfield Networks is All You Need bridges th
 ese advances to transformer architectures\, highlighting their significan
 ce in deep learning.\nIn the first part of thi
 s talk\, we will trace the evolution of Hopfield networks\, examining thei
 r mathematical foundations and key applications in optimization. We will e
 xplore how these networks have transformed from their original binary stat
 e models to powerful continuous-state systems with deep learning applicati
 ons.\nIn the second part\, we will step back to consider content-addressab
 le memory in the brain\, beginning with the hippocampal memory indexing th
 eory. We will introduce a kernel-based formulation of key-value memory and
  discuss biologically plausible mechanisms for learning and organizing rep
 resentations of queries\, keys\, and values. A key focus will be the recen
 tly proposed Vector-HaSH algorithm (Chandra et al.\, 2025\, Nature)\, whic
 h offers a compelling model for efficient memory retrieval. Finally\, we w
 ill review the main lines of evidence supporting key-value memory structur
 es in the brain\, drawing connections between neuroscience and modern mach
 ine learning.
LOCATION:CBL Seminar Room\, Engineering Department\, 4th floor Baker build
 ing
END:VEVENT
END:VCALENDAR
