Large-scale convex optimization for machine learning
- 👤 Speaker: Francis Bach, INRIA, Laboratoire d'Informatique de l'Ecole Normale Supérieure
- 📅 Date & Time: Friday 04 May 2012, 16:00 - 17:00
- 📍 Venue: MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
Abstract
Many machine learning and signal processing problems are traditionally cast as convex optimization problems. A common difficulty in solving these problems is the size of the data: there are many observations (“large n”) and each of them is high-dimensional (“large p”). In this setting, online algorithms, which pass over the data only once, are usually preferred over batch algorithms, which require multiple passes over the data. In this talk, I will present several recent results showing that in the idealized infinite-data setting, online learning algorithms based on stochastic approximation should be preferred (both in terms of running time and generalization performance), but that in the practical finite-data setting, an appropriate combination of batch and online algorithms leads to unexpected behavior, such as a linear convergence rate with an iteration cost similar to that of stochastic gradient descent.
This is joint work with Nicolas Le Roux, Eric Moulines and Mark Schmidt.
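The batch-versus-online trade-off in the abstract can be illustrated on least-squares regression: a batch gradient step touches all n observations and costs O(np), while a stochastic gradient step touches a single observation and costs O(p). The following is a minimal sketch of that contrast; the synthetic data, step sizes, and iteration counts are illustrative choices, not the algorithms analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 10  # "large n" observations, dimension p
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Batch gradient descent on 0.5 * ||X w - y||^2:
# every iteration uses all n observations, cost O(n * p).
w_batch = np.zeros(p)
step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
for _ in range(100):
    w_batch -= step * (X.T @ (X @ w_batch - y))

# Stochastic gradient descent: a single pass over the data,
# each iteration uses one observation, cost O(p).
w_sgd = np.zeros(p)
for t, i in enumerate(rng.permutation(n), start=1):
    g = (X[i] @ w_sgd - y[i]) * X[i]   # gradient of one sample's loss
    w_sgd -= g / (0.25 * t + 10.0)     # decaying step size
```

With many passes, batch gradient descent converges linearly to the least-squares solution; the one-pass stochastic run reaches a coarser estimate but at a per-iteration cost that is n times smaller, which is the regime where the talk's combined batch/online methods become interesting.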
Series: This talk is part of the Statistics series.