Journal Club: "Reducing the Dimensionality of Data with Neural Networks"
- Speaker: O. Stegle
- Date & Time: Friday 08 December 2006, 12:30 - 13:30
- Venue: TCM Seminar Room, Cavendish Laboratory, Department of Physics
Abstract
G. E. Hinton and R. R. Salakhutdinov

High-dimensional data can be converted to low-dimensional codes by training a multilayer network with a small central layer to reconstruct high-dimensional input vectors. Gradient descent can be used for fine-tuning the weights in such "autoencoder" networks, but this works well only if the initial weights are close to a good solution. We describe an effective way of initializing weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.

http://www.sciencemag.org/cgi/reprint/313/5786/504.pdf
Please also see the Supporting Online Material.
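The idea in the abstract can be sketched with a minimal example: an autoencoder whose small central layer is forced to summarise the input, trained by plain gradient descent on reconstruction error. This is only an illustrative toy (all names, sizes, and data below are made up for this sketch); the paper's actual contribution is the layer-wise pretraining of each layer as a restricted Boltzmann machine, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 10 dimensions lying near a 2-D subspace,
# so a 2-unit code layer can capture most of the structure.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.05 * rng.normal(size=(200, 10))

# Encoder and decoder weights around a small central layer of width 2.
W1 = 0.1 * rng.normal(size=(10, 2))   # encoder: 10 -> 2
W2 = 0.1 * rng.normal(size=(2, 10))   # decoder: 2 -> 10
lr = 0.01                             # gradient-descent step size

def reconstruction_error(X, W1, W2):
    code = np.tanh(X @ W1)            # low-dimensional code
    return np.mean((X - code @ W2) ** 2)

err_before = reconstruction_error(X, W1, W2)

for _ in range(2000):
    code = np.tanh(X @ W1)            # encode
    Xhat = code @ W2                  # decode (linear output layer)
    diff = Xhat - X                   # reconstruction error signal
    # Backpropagate the squared-error loss through both layers.
    grad_W2 = code.T @ diff / len(X)
    grad_code = diff @ W2.T * (1 - code ** 2)   # tanh derivative
    grad_W1 = X.T @ grad_code / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

err_after = reconstruction_error(X, W1, W2)
print(err_before, err_after)
```

With random initial weights this small, shallow network trains acceptably; the paper's point is that for *deep* autoencoders gradient descent alone tends to get stuck unless the weights are first initialized by pretraining, after which fine-tuning finds codes that beat PCA.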
Series This talk is part of the Machine Learning Journal Club series.
Included in Lists
- Cambridge talks
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group Journal Clubs
- Inference Group Summary
- Interested Talks
- Machine Learning Journal Club
- Machine Learning Summary
- ML
- Quantum Matter Journal Club
- rp587
- TCM Seminar Room, Cavendish Laboratory, Department of Physics
- TQS Journal Clubs
- yk373's list
Note: Ex-directory lists are not shown.