BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Information Geometry — Natural Gradient Descent - Andy Lin\, MLG
DTSTART:20221123T110000Z
DTEND:20221123T123000Z
UID:TALK192956@talks.cam.ac.uk
CONTACT:86986
DESCRIPTION:Information geometry applies fundamental concepts of
  differential geometry to probability theory and statistics to study
  statistical manifolds\, i.e. Riemannian manifolds whose points
  correspond to probability distributions.\nPerhaps the most prominent
  application of information geometry in machine learning is natural
  gradient descent (NGD)\, which can be described as 'gradient descent
  that respects the curvature of the statistical manifold'.\nIn this
  session\, we will look at NGD from several perspectives.\nIn
  particular\, we will consider NGD as 1) preconditioning the gradient
  with the inverse Fisher information matrix\, 2) second-order
  optimisation with Newton's method\, 3) the result of stating gradient
  descent as a valid tensor equation\, and (bonus) mirror descent in a
  dual Riemannian manifold.\nPrevious knowledge of differential
  geometry is NOT required.\n\nReading (suggested\, but not
  required):\nMartens\, J. (2020). “New Insights and Perspectives on
  the Natural Gradient Method”. Journal of Machine Learning Research\,
  Volume 21\, Issue 146.\n(Sections 5\, 8-8.1\, 9.2\, ~6 pages)\n\n
 Raskutti\, G.\, Mukherjee\, S. (2015). “The information geometry of
  mirror descent”. IEEE Transactions on Information Theory\, Volume
  61\, Issue 3.\n(Sections 1\, 2\, ~3 pages)
LOCATION:Cambridge University Engineering Department\, CBL Seminar room BE
 4-38.
END:VEVENT
END:VCALENDAR
