BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//EN
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:How the strength of the inductive bias affects the generalization 
 performance of interpolators - Dr Fanny Yang\, ETH Zurich
DTSTART:20221123T140000Z
DTEND:20221123T150000Z
UID:TALK179117@talks.cam.ac.uk
CONTACT:Prof. Ramji Venkataramanan
DESCRIPTION:Interpolating models have recently gained popularity in the
  statistical learning community due to common practices in modern machine
  learning: complex models achieve good generalization performance despite
  interpolating high-dimensional training data. In this talk\, we prove
  generalization bounds for high-dimensional linear models that interpolate
  noisy data generated by a sparse ground truth. In particular\, we first
  show that minimum-l1-norm interpolators achieve high-dimensional
  asymptotic consistency at a logarithmic rate. Further\, in contrast to
  the regularized and noiseless cases\, minimum-lp-norm interpolators with
  1<p<2 surprisingly yield polynomial rates. Our results suggest a new
  trade-off for interpolating models: a stronger inductive bias encourages
  a simpler structure better aligned with the ground truth\, at the cost of
  increased variance. Finally\, we discuss our latest results\, which show
  that this phenomenon also holds for nonlinear models.
LOCATION:MR5\, CMS Pavilion A
END:VEVENT
END:VCALENDAR
