BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Gradient methods for huge-scale optimization problems - Prof Yurii
  Nesterov\, Catholic University of Louvain\, Belgium
DTSTART:20140527T163000Z
DTEND:20140527T173000Z
UID:TALK51798@talks.cam.ac.uk
CONTACT:Rachel Fogg
DESCRIPTION:We consider a new class of huge-scale problems\, problems
  with sparse gradients. The most important functions of this type are
  piecewise linear.\nFor optimization problems with uniform sparsity of
  the corresponding linear operators\, we suggest a very efficient
  implementation of the iterations\, whose total cost depends only
  logarithmically on the dimension. This technique is based on a
  recursive update of the results of matrix/vector products and the
  values of symmetric functions. It works well\, for example\, for
  matrices with few nonzero diagonals and for max-type functions.\n\nWe
  show that the updating technique can be efficiently coupled with the
  simplest gradient methods. Similar results can be obtained for a new
  non-smooth random variant of a coordinate descent scheme. We also
  present promising results of preliminary computational experiments
  and discuss extensions of this technique.\n\nBiography:\nYurii
  Nesterov is a professor at the Catholic University of Louvain\,
  Belgium\, where he is a member of the Center for Operations Research
  and Econometrics (CORE). He is the author of 4 monographs and more
  than 80 refereed papers in leading optimization journals. He was
  awarded the Dantzig Prize 2000 given by SIAM and the Mathematical
  Programming Society (for research having a major impact on the field
  of mathematical programming)\, the John von Neumann Theory Prize 2009
  given by INFORMS\, the Charles Broyden Prize 2010 (for the best paper
  in the journal Optimization Methods and Software)\, and the Honorable
  Francqui Chair (University of Liège\, 2011-2012).\nThe main direction
  of his research is the development of efficient numerical methods for
  convex and nonconvex optimization problems\, supported by a global
  complexity analysis. The most important results have been obtained
  for general interior-point methods (theory of self-concordant
  functions)\, fast gradient methods (smoothing technique)\, and the
  global complexity analysis of second-order schemes (cubic
  regularization of Newton's method).\n
LOCATION:Cambridge University Engineering Department\, LR6
END:VEVENT
END:VCALENDAR
