BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Coordinate Descent on the Orthogonal Group for Recurrent Neural Ne
 twork Training - Estelle Massart (Oxford)
DTSTART:20211103T140000Z
DTEND:20211103T150000Z
UID:TALK164197@talks.cam.ac.uk
CONTACT:Hamza Fawzi
DESCRIPTION:To address the poor scalability of training algorithms for ort
 hogonal recurrent neural networks\, we propose to use a coordinate descent
  method on the orthogonal group. This algorithm has a cost per iteration t
 hat evolves linearly with the number of recurrent states\, in contrast wit
 h the cubic dependency of typical algorithms such as stochastic Riemannian
  gradient descent. We numerically show that the Riemannian gradient in rec
 urrent neural network training has an approximately sparse structure. Leve
 raging this observation\, we propose a variant of the proposed algorithm t
 hat relies on Gauss-Southwell coordinate selection. Experiments on a benc
 hmark recurrent neural network training problem show that the proposed app
 roach is a very promising step towards the training of orthogonal recurren
 t neural networks with big architectures.\n\n\nJoin Zoom Meeting \nhttps:/
 /maths-cam-ac-uk.zoom.us/j/93776043287?pwd=UDIrNDdkeUU1NmFtZXpNUzd6ZjRrdz0
 9 \nMeeting ID: 937 7604 3287 \nPasscode: p1Co4skf\n
LOCATION:Virtual (Zoom details under abstract)
END:VEVENT
END:VCALENDAR
