BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:A precise high-dimensional asymptotic theory for Adaboost - Pragya
  Sur (Harvard University)
DTSTART:20210122T160000Z
DTEND:20210122T170000Z
UID:TALK153538@talks.cam.ac.uk
CONTACT:Dr Sergio Bacallado
DESCRIPTION:This talk will introduce a precise high-dimensional asymptotic
  theory for AdaBoost on separable data\, taking both statistical and compu
 tational perspectives. We will consider the common modern setting where th
 e number of features p and the sample size n are both large and comparable
 \, and in particular\, look at scenarios where the data is separable in an
  asymptotic sense. Under a class of statistical models\, we will provide a
 n (asymptotically) exact analysis of the generalization error of AdaBoost\
 , when the algorithm interpolates the training data and maximizes an empir
 ical L1 margin. On the computational front\, we provide a sharp analysis o
 f the stopping time when boosting approximately maximizes the empirical L1
  margin. Our theory provides several insights into properties of boosting\
 ; for instance\, the larger the dimensionality ratio p/n\, the faster the 
 optimization reaches interpolation. At the heart of our theory lies an in-
 depth study of the maximum L1-margin\, which can be accurately described b
 y a new system of non-linear equations\; we analyze this margin and the pr
 operties of this system\, using Gaussian comparison techniques and a novel
  uniform deviation argument. Time permitting\, I will present a new class 
 of boosting algorithms that correspond to Lq geometry\, for q>1\, together
  with results on their high-dimensional generalization and optimization be
 havior. \n\nThis is based on joint work with Tengyuan Liang.
LOCATION:https://maths-cam-ac-uk.zoom.us/j/92821218455?pwd=aHFOZWw5bzVReU
 NYR2d5OWc1Tk15Zz09
END:VEVENT
END:VCALENDAR
