Sharp bounds for compressive learning
- Speaker: Ata Kaban, University of Birmingham
- Date & Time: Friday 17 January 2014, 16:00-17:00
- Venue: MR12, Centre for Mathematical Sciences, Wilberforce Road, Cambridge
Abstract
The first part of the talk derives sharp bounds on the generalisation error of a generic linear classifier trained by empirical risk minimisation on randomly projected data. We make no restrictive assumptions on the data, such as sparsity, separability, or distributional conditions. Instead, using elementary techniques, we derive the exact probability of label flipping under Gaussian random projection and use it to bound the effect of the projection on the generalisation error of the compressive classifier. The second part of the talk presents an analogous strategy for regression, giving a new analysis of the excess risk of compressive linear least squares that removes a spurious log(N) factor from previous bounds, where N is the number of training points. Besides being tighter, the new bounds have a clear interpretation and reveal structural properties of a learning problem that allow it to be solved effectively in a low-dimensional random subspace.
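To make the compressive learning setup concrete, here is a minimal sketch (not from the talk) of learning on randomly projected data: points in d dimensions are mapped to a k-dimensional random subspace via a Gaussian matrix, and the same kind of linear model is fit there. Logistic regression stands in for the generic ERM classifier, and the data, the dimensions n, d, k, and the 1/sqrt(k) scaling of the projection are illustrative assumptions, not the speaker's construction.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic linearly labelled data (illustrative only).
n, d, k = 500, 1000, 50            # k << d is the compressed dimension
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = np.sign(X @ w_true)

# Gaussian random projection: i.i.d. N(0, 1/k) entries, so squared
# norms are preserved in expectation.
R = rng.standard_normal((k, d)) / np.sqrt(k)
X_comp = X @ R.T

# Fit the same linear classifier in the original and compressed spaces.
clf_full = LogisticRegression(max_iter=1000).fit(X, y)
clf_comp = LogisticRegression(max_iter=1000).fit(X_comp, y)

X_test = rng.standard_normal((200, d))
y_test = np.sign(X_test @ w_true)
print("full-dimensional accuracy:", clf_full.score(X_test, y_test))
print("compressive accuracy:     ", clf_comp.score(X_comp := X_test @ R.T, y_test))

# Compressive linear least squares (the talk's second part): solve the
# least-squares problem entirely in the random subspace.
y_reg = X @ w_true + 0.1 * rng.standard_normal(n)
w_hat, *_ = np.linalg.lstsq(X_comp if False else X @ R.T, y_reg, rcond=None)
test_mse = np.mean(((X_test @ R.T) @ w_hat - X_test @ w_true) ** 2)
print("compressive LS test MSE:  ", test_mse)
```

A "label flip" in this picture is a test point whose sign under the projected classifier disagrees with its sign in the original space; the bounds discussed in the talk control how often the projection causes this.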
Series: This talk is part of the Statistics series.