BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:CMA-ES – a Stochastic Second-Order Method for Function-Value Fre
 e Numerical Optimization - Nikolaus Hansen\, INRIA
DTSTART:20111109T140000Z
DTEND:20111109T150000Z
UID:TALK34378@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:We consider black-box optimization with few assumptions on th
 e underlying objective function. Further\, we consider sampling from a d
 istribution to obtain new candidate solutions. Under mild assumptions\, so
 lving the original black-box optimization problem coincides with optimizin
 g a parametrized family of distributions of our choice. Choosing the famil
 y of multivariate normal distributions on the continuous search domain\, a
  natural gradient descent on this family leads to an instantiation of the 
 so-called CMA-ES algorithm (covariance matrix adaptation evolution strateg
 y). In this talk\, the continuous black-box optimization problem will be i
 ntroduced and the CMA-ES algorithm will be illustrated. The CMA-ES adapts 
 a second-order model of the underlying objective function. On convex-quadr
 atic functions\, the resulting covariance matrix resembles the inverse Hes
 sian matrix of the function. In contrast to Quasi-Newton methods\, this ca
 n be accomplished derivative- and even function-value free. The CMA-ES exh
 ibits the same invariance properties as the famous Nelder-Mead simplex dow
 nhill method\, is robust\, works reliably also in higher dimensions\, an
 d is surprisingly efficient on convex as well as non-convex\, highly ill
 -conditioned problems.
LOCATION:Small lecture theatre\, Microsoft Research Ltd\, 7 J J Thomson Av
 enue (Off Madingley Road)\, Cambridge
END:VEVENT
END:VCALENDAR
