A quick way to learn a mixture of exponentially many linear models
- đ¤ Speaker: Geoffrey Hinton, Canadian Institute for Advanced Research & University of Toronto
- đ Date & Time: Monday 15 June 2009, 15:00 - 16:00
- đ Venue: TCM Seminar Room, Cavendish Laboratory, Department of Physics
Abstract
Mixtures of linear models can be used to model data that lies on or near a smooth non-linear manifold. A proper Bayesian treatment can be applied to toy data to determine the number of models in the mixture and the dimensionality of each linear model, but this neurally uninspired approach completely misses the main problem: real data with many degrees of freedom in the manifold requires a mixture with an exponential number of components. It is quite easy to fit mixtures of 2^1000 linear models by using a few tricks. First, each linear model selects from a pool of shared factors using the selection rule that factors with negative values are ignored. Second, undirected linear models are used to simplify inference, and the models are trained by matching pairwise statistics. Third, Poisson noise is used to implement L1 regularization of the activities of the factors. The factors are then threshold-linear neurons with Poisson noise, and their positive integer activities are very sparse. Preliminary results suggest that these exponentially large mixtures work very well as modules for greedy, layer-by-layer learning of deep networks. Even with one eye closed, they outperform support vector machines for recognizing 3-D images of objects from the NORB database.
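The factor units described above can be sketched in a few lines. This is a minimal illustration only, not the speaker's implementation: the function name and weight shapes are hypothetical, and it shows just the forward pass (rectification followed by Poisson sampling), not the training procedure based on matching pairwise statistics.

```python
import numpy as np

def noisy_rectified_factors(inputs, weights, rng):
    """Hypothetical sketch of threshold-linear factors with Poisson noise.

    Each factor computes a linear function of the input; factors with
    negative values are ignored (rectification), and the remaining
    non-negative rates are passed through a Poisson sampler, giving
    non-negative integer activities.
    """
    rates = np.maximum(inputs @ weights, 0.0)  # rectify: drop negative factors
    return rng.poisson(rates)                  # integer activities

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 10))      # 4 data vectors, 10 input dimensions
W = rng.normal(size=(10, 1000))   # pool of 1000 shared factors
activities = noisy_rectified_factors(x, W, rng)
# Each data vector switches on a subset of the 1000 factors; with n
# factors that can each be on or off, the mixture has 2^n = 2^1000
# linear components, without ever enumerating them.
```

The rectification rule is what makes the exponential mixture tractable: rather than fitting 2^1000 separate linear models, a single shared weight matrix implicitly defines one linear model per on/off pattern of the factors.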
Series: This talk is part of the Inference Group series.
Included in Lists
- All Cavendish Laboratory Seminars
- All Talks (aka the CURE list)
- Biology
- Cambridge Neuroscience Seminars
- Cambridge talks
- Centre for Health Leadership and Enterprise
- Chris Davis' list
- dh539
- Featured lists
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group
- Inference Group Summary
- Interested Talks
- Joint Machine Learning Seminars
- Life Science
- Life Sciences
- Machine Learning Summary
- ME Seminar
- ML
- Neurons, Fake News, DNA and your iPhone: The Mathematics of Information
- Neuroscience
- Neuroscience Seminars
- Required lists for MLG
- rp587
- School of Physical Sciences
- Stem Cells & Regenerative Medicine
- TCM Seminar Room, Cavendish Laboratory, Department of Physics
- Thin Film Magnetic Talks
- yk373's list