
Stochastic approximation with heavy tailed noise


  • Speaker: Vivek Borkar (Indian Institute of Technology Bombay)
  • Time: Thursday 25 April 2024, 11:45-12:30
  • Venue: External


TMLW02 - SGD: stability, momentum acceleration and heavy tails

This talk will first review the so-called 'ODE' (ordinary differential equation) approach to the analysis of stochastic approximation algorithms, of which stochastic gradient descent is a special case. Using that as a backdrop, certain analogous results will be presented for a class of algorithms with heavy-tailed noise. This is joint work with V. Anantharam of the University of California, Berkeley.
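For context, the sketch below recalls the standard stochastic approximation recursion and its limiting ODE in the usual textbook notation; the symbols x_n, a_n, h, and M_{n+1} are generic and not taken from the talk itself, whose precise assumptions may differ.

\[
  x_{n+1} = x_n + a_n \bigl( h(x_n) + M_{n+1} \bigr), \qquad n \ge 0,
\]

where the step sizes satisfy $a_n > 0$, $\sum_n a_n = \infty$, $\sum_n a_n^2 < \infty$, and $\{M_{n+1}\}$ is a martingale difference noise sequence. Under standard regularity and moment conditions, suitably interpolated iterates track the limiting ODE $\dot{x}(t) = h(x(t))$, so the long-run behaviour of the algorithm can be read off from that of the ODE. SGD is the special case $h = -\nabla f$ for an objective $f$; the heavy-tailed setting weakens the usual square-integrability assumption on the noise.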

This talk is part of the Isaac Newton Institute Seminar Series.

