Handling Sparsity via the Horseshoe
- Speaker: Carlos Carvalho, University of Chicago Booth School of Business
- Date & Time: Friday 24 April 2009, 16:00 - 17:00
- Venue: MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
Abstract
This paper presents a general, fully Bayesian framework for sparse supervised-learning problems based on the horseshoe prior. The horseshoe prior is a member of the family of multivariate scale mixtures of normals, and is therefore closely related to widely used approaches to sparse Bayesian learning, including, among others, Laplacian (LASSO) and Student-t priors (relevance vector machines). The advantages of the horseshoe are its robustness in handling unknown sparsity and large outlying signals. These properties are justified theoretically via a representation theorem and demonstrated in comprehensive empirical experiments that compare its performance to benchmark alternatives.
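The scale-mixture construction mentioned in the abstract can be sketched numerically. The snippet below (an illustrative sketch, not code from the talk) draws from the standard horseshoe prior with the global scale fixed at one: each coefficient is normal with a half-Cauchy local scale, and the induced shrinkage weight kappa = 1/(1 + lambda^2) follows the U-shaped Beta(1/2, 1/2) density whose two spikes, at no shrinkage and total shrinkage, give the prior its name.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Horseshoe prior as a scale mixture of normals (global scale tau fixed at 1):
#   beta_i | lambda_i ~ N(0, lambda_i^2),   lambda_i ~ Half-Cauchy(0, 1)
lam = np.abs(rng.standard_cauchy(n))   # half-Cauchy local scales
beta = rng.normal(0.0, lam)            # horseshoe-distributed coefficients

# Shrinkage weight kappa = 1 / (1 + lambda^2) lies in (0, 1); under the
# half-Cauchy mixing density it is Beta(1/2, 1/2), with mass piling up near
# 0 (large signals pass through) and near 1 (noise is shrunk to zero).
kappa = 1.0 / (1.0 + lam**2)
print(round(kappa.mean(), 2))  # close to 0.5 by symmetry of Beta(1/2, 1/2)
```

This two-spike shrinkage profile is one way to see the robustness claims: the spike near zero leaves large outlying signals nearly untouched, while the spike near one shrinks noise aggressively.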
Series This talk is part of the Statistics series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- CMS Events
- custom
- DPMMS info aggregator
- DPMMS lists
- DPMMS Lists
- Guy Emerson's list
- Hanchen DaDaDash
- Interested Talks
- Machine Learning
- MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
- rp587
- School of Physical Sciences
- Statistical Laboratory info aggregator
- Statistics
- Statistics Group