BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Computational Neuroscience Journal Club - Jasmine Stone (Universit
 y of Cambridge)
DTSTART:20201117T150000Z
DTEND:20201117T163000Z
UID:TALK154054@talks.cam.ac.uk
CONTACT:Jake Stroud
DESCRIPTION:Please join us for our fortnightly journal club\, held online v
 ia Zoom\, where two presenters will jointly present a topic.\n\nZoom info
 rmation:\nhttps://us02web.zoom.us/j/81138977348?pwd=d0RlQ0QwTHJydzdyR2ttZW
 93MU5Sdz09\n\nMeeting ID: 811 3897 7348\n\nPasscode: 095299\n\nThe next to
 pic is 'An overview of linear Gaussian models and dimensionality reduction
  techniques':\n\nFactor analysis\, principal components analysis\, mixture
 s of Gaussian clusters\, Kalman filter models\, hidden Markov models\, slo
 w feature analysis\, linear discriminant analysis\, canonical correlations
  analysis\, undercomplete independent component analysis\, and linear regr
 ession are all linear models and methods used throughout neuroscience to m
 ake sense of high-dimensional data. Phew\, what a long list of methods! It
 can be difficult to make sense of the differences and similarities among t
 hem\, since these techniques were developed across different fields and ov
 er time. We present multiple papers that attempt to unify these methods in
 a common fr
 amework. We will first discuss Roweis and Ghahramani’s “A unifying rev
 iew of linear Gaussian models” which unifies many of these methods as un
 supervised learning under basic generative models. We then present Turner 
 and Sahani’s “A maximum-likelihood interpretation for slow feature ana
 lysis\,” which establishes a probabilistic interpretation of the slow fe
 ature analysis (SFA) algorithm for time series and subsequently develops n
 ovel extensions of SFA. Finally\, we present Cunningham and Ghahramani’s
  “Linear Dimensionality Reduction: Survey\, Insights\, and Generalizatio
 ns” which approaches these methods as optimization programs over matrix 
 manifolds\, addressing and analyzing the suboptimality of certain eigenvec
 tor approaches. These frameworks help connect the multitude of linear model
 s and dimensionality reduction techniques\, can suggest new developments a
 nd approaches\, and can provide a way to choose and distinguish between th
 e different options.\n\nPapers:\na. Roweis\, S. and Ghahramani\, Z. "A un
 ifying review of linear Gaussian models." Neural Computation\, 11(2):305-3
 45\, 1999.\nhttps://cs.nyu.edu/~roweis/papers/NC110201.pdf\n\nb. Turner\,
  R. and Sahani\, M. "A maximum-likelihood interpretation for slow feature
  analysis." Neural Computation\, 19(4):1022-1038\, 2007. doi:10.1162/neco
 .2007.19.4.1022\nhttp://www.gatsby.ucl.ac.uk/~turner/Publications/turner-
 and-sahani-2007a.pdf\n\nc. Cunningham\, J. and Ghahramani\, Z. "Linear Di
 mensionality Reduction: Survey\, Insights\, and Generalizations." Journal
  of Machine Learning Research\, 16:2859-2900\, 2015.\nhttps://jmlr.org/pa
 pers/volume16/cunningham15a/cunningham15a.pdf
LOCATION:Online on Zoom
END:VEVENT
END:VCALENDAR
