BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Minimum L1-norm interpolators: Precise asymptotics and multiple de
 scent - Yuting Wei (University of Pennsylvania)
DTSTART:20220424T130000Z
DTEND:20220424T140000Z
UID:TALK173318@talks.cam.ac.uk
CONTACT:Qingyuan Zhao
DESCRIPTION:An evolving line of machine learning works observes empirical 
 evidence that suggests interpolating estimators --- the ones that achieve 
 zero training error --- may not necessarily be harmful. In this talk\, we 
 pursue a theoretical understanding of an important type of interpolators:
  the minimum L1-norm interpolator\, which is motivated by the observation 
 that several learning algorithms favor low L1-norm solutions in the over-p
 arameterized regime. Concretely\, we consider the noisy sparse regression 
 model under Gaussian design\, focusing on linear sparsity and high-dimensi
 onal asymptotics (so that both the number of features and the sparsity lev
 el scale proportionally with the sample size).\n\nWe observe\, and provide
  rigorous theoretical justification for\, a curious multi-descent phenomen
 on\; that is\, the generalization risk of the minimum L1-norm interpolator
  undergoes multiple (and possibly more than two) phases of descent and asc
 ent as one increases the model capacity. This phenomenon stems from the sp
 ecial structure of the minimum L1-norm interpolator as well as the delicat
 e interplay between the over-parameterization ratio and the sparsity\, th
 us unveiling a fundamental distinction in geometry from the minimum L2-norm i
 nterpolator. Our finding is built upon an exact characterization of the ri
 sk behavior\, which is governed by a system of two non-linear equations wi
 th two unknowns.
LOCATION:MR12
END:VEVENT
END:VCALENDAR
