BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Minimum L1-norm interpolators: Precise asymptotics and multiple de
 scent - Yuting Wei (University of Pennsylvania)
DTSTART:20220506T130000Z
DTEND:20220506T140000Z
UID:TALK173333@talks.cam.ac.uk
CONTACT:Qingyuan Zhao
DESCRIPTION:An evolving line of machine learning work observes empirical ev
 idence suggesting that interpolating estimators --- those that achieve ze
 ro training error --- are not necessarily harmful. In this talk\, we purs
 ue a theoretical understanding of an important type of interpolator: the
  minimum L1-norm interpolator\, which is motivated by the observation th
 at several learning algorithms favor low L1-norm solutions in the over-p
 arameterized regime. Concretely\, we consider the noisy sparse regressio
 n model under Gaussian design\, focusing on linear sparsity and high-dim
 ensional asymptotics (so that both the number of features and the spars
 ity level scale proportionally with the sample size).\n\nWe observe\, an
 d provide rigorous theoretical justification for\, a curious multi-desce
 nt phenomenon\; that is\, the generalization risk of the minimum L1-norm
  interpolator undergoes multiple (and possibly more than two) phases of
  descent and ascent as one increases the model capacity. This phenomeno
 n stems from the special structure of the minimum L1-norm interpolator
  as well as the delicate interplay between the over-parameterization ra
 tio and the sparsity\, thus unveiling a fundamental distinction in geom
 etry from the minimum L2-norm interpolator. Our finding is built upon a
 n exact characterization of the risk behavior\, which is governed by a
  system of two non-linear equations with two unknowns.
LOCATION:MR12\, Centre for Mathematical Sciences
END:VEVENT
END:VCALENDAR
