Computational Neuroscience Journal Club
- Speaker: Jasmine Stone (University of Cambridge)
- Date & Time: Tuesday 17 November 2020, 15:00 - 16:30
- Venue: Online on Zoom
Abstract
Please join us for our fortnightly journal club, held online via Zoom, where two presenters will jointly present a topic.
Zoom information: https://us02web.zoom.us/j/81138977348?pwd=d0RlQ0QwTHJydzdyR2ttZW93MU5Sdz09
Meeting ID: 811 3897 7348
Passcode: 095299
The next topic is ‘An overview of linear Gaussian models and dimensionality reduction techniques’:
Factor analysis, principal components analysis, mixtures of Gaussian clusters, Kalman filter models, hidden Markov models, slow feature analysis, linear discriminant analysis, canonical correlation analysis, undercomplete independent component analysis, and linear regression are all linear models and methods used throughout neuroscience to make sense of high-dimensional data. Phew, what a long list of methods – it can be difficult to make sense of the differences and similarities between them, since they were developed across fields and over time. We present multiple papers that attempt to unify these methods in a common framework. We will first discuss Roweis and Ghahramani’s “A unifying review of linear Gaussian models,” which unifies many of these methods as unsupervised learning under basic generative models. We then present Turner and Sahani’s “A maximum-likelihood interpretation for slow feature analysis,” which establishes a probabilistic interpretation of the slow feature analysis (SFA) algorithm for time series and subsequently develops novel extensions of SFA. Finally, we present Cunningham and Ghahramani’s “Linear Dimensionality Reduction: Survey, Insights, and Generalizations,” which approaches these methods as optimization programs over matrix manifolds, addressing and analyzing the suboptimality of certain eigenvector approaches. These frameworks help connect the multitude of linear models and dimensionality reduction techniques, can suggest new developments and approaches, and provide a way to choose and distinguish between the different options.
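To give a concrete flavour of the unifying view discussed in the Roweis and Ghahramani paper, here is a minimal NumPy sketch (illustrative only, not taken from the talk materials): data are sampled from the basic linear Gaussian generative model x = Cz + noise, and plain PCA on the samples recovers the latent subspace spanned by the columns of C. All variable names and parameter values are our own choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basic linear Gaussian generative model:
#   z ~ N(0, I)                 latent, k-dimensional
#   x = C z + v,  v ~ N(0, R)   observed, d-dimensional
d, k, n = 5, 2, 10_000
C = rng.normal(size=(d, k))     # loading matrix (illustrative values)
R = 0.05 * np.eye(d)            # small isotropic noise: the "PCA limit"
Z = rng.normal(size=(n, k))
X = Z @ C.T + rng.multivariate_normal(np.zeros(d), R, size=n)

# PCA: top-k right singular vectors of the centered data matrix
X_centered = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
W = Vt[:k].T                    # estimated principal subspace (d x k)

# Compare the subspace spanned by W with the column space of C
# via their orthogonal projection matrices.
P_W = W @ W.T
Q, _ = np.linalg.qr(C)
P_C = Q @ Q.T
print(np.linalg.norm(P_W - P_C))  # small when the noise floor is low
```

With isotropic noise, adding R to the signal covariance CC^T leaves its eigenvectors unchanged, which is why the PCA subspace matches the generative one here; factor analysis relaxes exactly this isotropy assumption.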
Papers: a. Roweis, Sam, and Zoubin Ghahramani. “A unifying review of linear Gaussian models.” Neural Computation 11.2 (1999): 305-345. https://cs.nyu.edu/roweis/papers/NC110201.pdf
b. Turner, Richard, and Maneesh Sahani. “A maximum-likelihood interpretation for slow feature analysis.” Neural Computation 19.4 (2007): 1022-1038. doi:10.1162/neco.2007.19.4.1022. http://www.gatsby.ucl.ac.uk/turner/Publications/turner-and-sahani-2007a.pdf
c. Cunningham, John, and Zoubin Ghahramani. “Linear Dimensionality Reduction: Survey, Insights, and Generalizations.” Journal of Machine Learning Research 16 (2015): 2859-2900. https://jmlr.org/papers/volume16/cunningham15a/cunningham15a.pdf
Series This talk is part of the Computational Neuroscience series.
Included in Lists
- All Talks (aka the CURE list)
- Biology
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Neuroscience Seminars
- CamBridgeSens
- Cambridge talks
- CBL important
- Chris Davis' list
- Computational and Biological Learning Seminar Series
- Computational Neuroscience
- custom
- dh539
- Featured lists
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group Journal Clubs
- Inference Group Summary
- Information Engineering Division seminar list
- Interested Talks
- Life Science
- Life Science Interface Seminars
- Life Sciences
- ME Seminar
- my_list
- ndk22's list
- Neuroscience
- Neuroscience Seminars
- ob366-ai4er
- Online on Zoom
- other talks
- Quantum Matter Journal Club
- Required lists for MLG
- rp587
- se456's list
- Stem Cells & Regenerative Medicine
- TQS Journal Clubs
- Trust & Technology Initiative - interesting events
- yk373's list
- yk449
Note: Ex-directory lists are not shown.