Coordinate Descent on the Orthogonal Group for Recurrent Neural Network Training

If you have a question about this talk, please contact Hamza Fawzi.

To address the poor scalability of training algorithms for orthogonal recurrent neural networks, we propose a coordinate descent method on the orthogonal group. Its cost per iteration scales linearly with the number of recurrent states, in contrast with the cubic dependence of typical algorithms such as stochastic Riemannian gradient descent. We show numerically that the Riemannian gradient in recurrent neural network training has an approximately sparse structure. Leveraging this observation, we propose a variant of the algorithm that relies on Gauss-Southwell coordinate selection. Experiments on a benchmark recurrent neural network training problem show that the proposed approach is a promising step towards training orthogonal recurrent neural networks with large architectures.
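As a rough illustration of the kind of update the abstract describes, below is a minimal NumPy sketch of one coordinate descent step on the orthogonal group, using Givens rotations as the coordinates and a Gauss-Southwell selection rule. The function name, the step size, and the dense computation of the Riemannian gradient are illustrative assumptions, not the speaker's actual algorithm.

```python
import numpy as np

def coordinate_step(W, euclid_grad, lr=0.1):
    """One coordinate descent step on the orthogonal group (sketch).

    W           : current n x n orthogonal matrix (the recurrent weights)
    euclid_grad : Euclidean gradient of the loss at W, same shape
    lr          : step size (hypothetical, not from the talk)
    """
    # Riemannian gradient, identified with a skew-symmetric matrix:
    # Omega = skew(grad @ W^T). Computed densely here for clarity only;
    # the linear per-iteration cost mentioned in the abstract relies on
    # the approximate sparsity of this quantity.
    A = euclid_grad @ W.T
    Omega = 0.5 * (A - A.T)

    # Gauss-Southwell rule: pick the rotation plane (i, j), i < j, whose
    # gradient component is largest in magnitude.
    flat = np.argmax(np.abs(np.triu(Omega, k=1)))
    i, j = np.unravel_index(flat, Omega.shape)

    # The directional derivative of the loss along a rotation in the
    # (i, j) plane is -2 * Omega[i, j], so descend with angle:
    theta = 2.0 * lr * Omega[i, j]

    # Apply the Givens rotation G(i, j, theta) on the left. Only rows i
    # and j of W change, so this update itself costs O(n).
    c, s = np.cos(theta), np.sin(theta)
    wi, wj = W[i].copy(), W[j].copy()
    W[i] = c * wi - s * wj
    W[j] = s * wi + c * wj
    return W
```

Each step stays exactly on the orthogonal group, since a Givens rotation is itself orthogonal; replacing the dense product above with a cheap estimate of Omega is what would give the linear per-iteration cost claimed in the abstract.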

Join Zoom Meeting: https://maths-cam-ac-uk.zoom.us/j/93776043287?pwd=UDIrNDdkeUU1NmFtZXpNUzd6ZjRrdz09
Meeting ID: 937 7604 3287
Passcode: p1Co4skf

This talk is part of the CCIMI Seminars series.
