BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Optimal algorithms for smooth and strongly convex distributed opti
 mization in networks - Francis Bach\, INRIA
DTSTART:20180116T130000Z
DTEND:20180116T140000Z
UID:TALK97270@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:In this work\, we determine the optimal convergence rates for 
 strongly convex and smooth distributed optimization in two settings: centr
 alized and decentralized communications over a network. For centralized (i
 .e. master/slave) algorithms\, we show that distributing Nesterov's accele
 rated gradient descent is optimal and achieves a precision in time that de
 pends on the condition number of the (global) function to optimize\, the d
 iameter of the network\, and the time needed to communicate values between
  two neighbors (resp. perform local computations). For decentralized algor
 ithms based on gossip\, we provide the first optimal algorithm\, called th
 e multi-step dual accelerated (MSDA) method\, which achieves a precisio
 n that depends on the condition number of the local functions and the (nor
 malized) eigengap of the gossip matrix used for communication between node
 s. We then verify the efficiency of MSDA against state-of-the-art methods 
 for two problems: least-squares regression and classification by logistic 
 regression. (joint work with Kevin Scaman\, Sébastien Bubeck\, Yin Tat Le
 e\, and Laurent Massoulié)
LOCATION:Auditorium\, Microsoft Research Ltd\, 21 Station Road\, Cambridge
 \, CB1 2FB
END:VEVENT
END:VCALENDAR
