How the strength of the inductive bias affects the generalization performance of interpolators
- Speaker: Dr Fanny Yang, ETH Zurich
- Date & Time: Wednesday 23 November 2022, 14:00-15:00
- Venue: MR5, CMS Pavilion A
Abstract
Interpolating models have recently gained popularity in the statistical learning community due to common practices in modern machine learning: complex models achieve good generalization performance despite interpolating high-dimensional training data. In this talk, we prove generalization bounds for high-dimensional linear models that interpolate noisy data generated by a sparse ground truth. In particular, we first show that minimum-ℓ1-norm interpolators achieve high-dimensional asymptotic consistency at a logarithmic rate. Further, as opposed to the regularized or noiseless case, for min-ℓp-norm interpolators with 1
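As a concrete illustration of the object studied in the abstract (not part of the talk itself): the minimum-ℓ1-norm interpolator is the solution of min ||w||_1 subject to Xw = y, which can be computed as a linear program via the standard split w = u - v with u, v ≥ 0. The helper name `min_l1_interpolator` and the toy sparse-ground-truth data below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def min_l1_interpolator(X, y):
    """Return argmin ||w||_1 subject to Xw = y (basis pursuit).

    Solved as an LP with the split w = u - v, u >= 0, v >= 0,
    minimizing sum(u) + sum(v) = ||w||_1.
    """
    n, d = X.shape
    c = np.ones(2 * d)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([X, -X])          # enforces X(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:d], res.x[d:]
    return u - v

# Toy high-dimensional setup (illustrative): n < d, sparse ground truth,
# noisy labels -- the interpolator fits the noise exactly.
rng = np.random.default_rng(0)
n, d, k = 40, 200, 3
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:k] = 1.0
y = X @ w_star + 0.1 * rng.standard_normal(n)

w_hat = min_l1_interpolator(X, y)
print(np.max(np.abs(X @ w_hat - y)))   # ~0: the noisy data are interpolated
```

Because n < d, infinitely many interpolators exist; the ℓ1 objective is the inductive bias that selects an (approximately) sparse one, which is the regime the talk's consistency result concerns.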
Series: This talk is part of the Information Theory Seminar series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- CMS Events
- DPMMS info aggregator
- DPMMS lists
- DPMMS Lists
- Hanchen DaDaDash
- Information Theory Seminar
- Interested Talks
- MR5, CMS Pavilion A
- School of Physical Sciences
- Statistical Laboratory info aggregator