University of Cambridge · Talks.cam · Isaac Newton Institute Seminar Series

Stochastic approximation with heavy tailed noise
TMLW02 - SGD: stability, momentum acceleration and heavy tails

This talk will first review the so-called 'ODE' (for 'ordinary differential equation') approach to the analysis of stochastic approximation algorithms, of which stochastic gradient descent is a special case. Using that as a backdrop, analogous results will be presented for a class of algorithms with heavy tailed noise. This is joint work with V. Anantharam of the University of California, Berkeley.

This talk is part of the Isaac Newton Institute Seminar Series.
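To make the setting concrete, here is a minimal sketch of a stochastic approximation recursion of the kind the ODE approach analyzes: iterates x_{n+1} = x_n + a_n (h(x_n) + ξ_n) track the trajectory of the limiting ODE dx/dt = h(x) when the step sizes a_n satisfy the Robbins–Monro conditions. The drift h(x) = -x, the step-size schedule, and the Student-t noise (chosen as a simple heavy-tailed distribution with finite mean but infinite variance) are all illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def h(x):
    # Illustrative drift field: the limiting ODE dx/dt = -x
    # has a globally stable equilibrium at 0.
    return -x

x = 5.0
for n in range(1, 100_001):
    a_n = 1.0 / n  # Robbins-Monro steps: sum a_n = inf, sum a_n^2 < inf
    # Heavy-tailed noise (assumption for illustration): Student-t with
    # 1.5 degrees of freedom has finite mean but infinite variance.
    noise = rng.standard_t(df=1.5)
    x = x + a_n * (h(x) + noise)

print(x)  # final iterate after 100,000 steps
```

Occasional large noise draws knock the iterate away from the equilibrium, but the decaying step sizes and the contracting drift pull it back; quantifying when this succeeds under infinite-variance noise is the kind of question the heavy-tailed analysis addresses.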