Sparsity pattern aggregation for convex stochastic optimization.
- Speaker: Philippe Rigollet (Princeton University)
- Date & Time: Friday 29 October 2010, 16:00–17:00
- Venue: MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
Abstract
Important statistical problems, including regression, binary classification and density estimation, can be recast as convex stochastic optimization problems when viewed from the standpoint of statistical aggregation. These convex problems can be solved numerically and efficiently in high dimension, but may exhibit mediocre statistical performance. One way to overcome this limitation is to assume that there exist approximate solutions, called "sparse", of moderate dimension. This presentation introduces a new method called "exponential screening" (ES) as an alternative to $\ell_1$-penalization, which is currently the most popular way to find such sparse solutions. While $\ell_1$-based methods can be analyzed only under rather stringent assumptions, ES achieves optimal statistical performance under fairly general assumptions. Its exact implementation is not straightforward, but it can be approximated using the Metropolis algorithm, which yields a stochastic greedy algorithm that performs surprisingly well on a simulated sparse recovery problem.
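To illustrate the kind of Metropolis approximation the abstract alludes to, here is a minimal Python sketch (not the speaker's implementation): a random walk over binary sparsity patterns, where each pattern is weighted by an exponential of its penalized least-squares fit. The function name `metropolis_sparsity_search`, the weight constants, and the single-coordinate-flip proposal are all illustrative assumptions, not the actual ES estimator.

```python
import numpy as np

def metropolis_sparsity_search(X, y, sigma2=1.0, n_steps=2000, rng=None):
    """Metropolis random walk over sparsity patterns (illustrative sketch).

    A pattern is a binary support vector; its (unnormalized) weight favours
    a small residual sum of squares while penalizing the support size, in
    the spirit of exponential-weights aggregation over sparsity patterns.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape

    def rss(support):
        # Residual sum of squares of least squares restricted to the support.
        idx = np.flatnonzero(support)
        if idx.size == 0:
            return float(y @ y)
        beta, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
        resid = y - X[:, idx] @ beta
        return float(resid @ resid)

    def log_weight(support):
        # Fit term plus a complexity penalty ~ k * log(d); these constants
        # are illustrative, not those of the actual ES estimator.
        k = int(support.sum())
        return -rss(support) / (4.0 * sigma2) - k * np.log(d)

    support = np.zeros(d, dtype=bool)            # start from the empty pattern
    current = log_weight(support)
    best, best_lw = support.copy(), current
    for _ in range(n_steps):
        j = rng.integers(d)                      # propose flipping one coordinate
        proposal = support.copy()
        proposal[j] = ~proposal[j]
        lw = log_weight(proposal)
        if np.log(rng.random()) < lw - current:  # Metropolis acceptance step
            support, current = proposal, lw
            if current > best_lw:
                best, best_lw = support.copy(), current
    return best
```

In practice, a chain like this tends to add coordinates that sharply reduce the residual and to drop coordinates whose removal only saves the complexity penalty, which is why it behaves like a stochastic greedy search.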
Series: This talk is part of the Statistics series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- CMS Events
- custom
- DPMMS info aggregator
- DPMMS lists
- DPMMS Lists
- Guy Emerson's list
- Hanchen DaDaDash
- Interested Talks
- Machine Learning
- MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
- rp587
- School of Physical Sciences
- Statistical Laboratory info aggregator
- Statistics
- Statistics Group
Note: Ex-directory lists are not shown.