BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:DISCO: Dynamical Integration Systems for Convergence Optimisation 
 in Distributed Low-Communication Training - Finn Anderson
DTSTART:20251107T110000Z
DTEND:20251107T120000Z
UID:TALK239950@talks.cam.ac.uk
CONTACT:Sally Matthews
DESCRIPTION:Distributed training faces fundamental challenges from client 
 heterogeneity in compute\, memory\, and network conditions. Existing appro
 aches use staleness-dependent decay\, per-client adjustments\, or distance
 -weighted averaging\, but often lack rigorous convergence guarantees. I
  present DISCO\, a control-theoretic framework that recasts federated lear
 ning as a linear time-delay system. Using Lyapunov stability analysis\, DI
 SCO derives lightweight online adjustments with verifiable convergence bou
 nds. Deployed on a Raspberry Pi 4 cluster\, DISCO achieves 3.0–4.0× fas
 ter time-to-accuracy across text classification benchmarks. This work demo
 nstrates how dynamical-systems theory enables provably efficient federated
  learning on commodity hardware.\n\nBio: Finn is an Engineer at Cedana AI 
 with an MSc in Computer Science from UCL. His background spans machine lea
 rning and financial mathematics\, with a focus on distributed training sys
 tems. His research explores homomorphic encryption\, hardware acceleration
 \, and distributed algorithms for efficient training at scale. Finn is par
 ticularly passionate about hardware-aware optimisation and designing train
 ing workloads for commodity devices.
LOCATION:Computer Lab\, SS03
END:VEVENT
END:VCALENDAR
