Automating stochastic gradient methods with adaptive batch sizes
If you have a question about this talk, please contact INI IT.

VMVW01 - Variational methods, new optimisation techniques and new fast numerical algorithms

This talk will address several issues that arise when training neural networks with stochastic gradient methods. First, we will discuss the difficulties of training in a distributed environment and present a new method, centralVR, for boosting the scalability of training methods. We will then turn to the problem of automating stochastic gradient descent, and show that learning rate selection can be simplified using "Big Batch" strategies that adaptively choose minibatch sizes.
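To make the "Big Batch" idea concrete, below is a minimal NumPy sketch, not the speaker's algorithm: it grows the minibatch whenever the estimated variance of the minibatch mean gradient dominates the squared norm of that gradient, so the search direction stays reliable without retuning the step size. The specific variance test, the threshold `theta`, the batch-doubling rule, and the toy least-squares objective are all illustrative assumptions.

```python
# Sketch of an adaptive "big batch" gradient method on a toy problem,
# assuming a variance test of the form  Var(mean gradient) > theta * ||g||^2
# as the trigger for growing the batch. Names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: least squares, f(w) = mean over i of 0.5 * (a_i^T w - b_i)^2
n, d = 10_000, 20
A = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
b = A @ w_true + 0.1 * rng.standard_normal(n)

def per_example_grads(w, idx):
    """Gradient of each sampled example's loss at w (shape: len(idx) x d)."""
    residuals = A[idx] @ w - b[idx]      # (m,)
    return residuals[:, None] * A[idx]   # (m, d)

def big_batch_sgd(w, lr=0.05, batch=32, theta=1.0, steps=200):
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        G = per_example_grads(w, idx)
        g = G.mean(axis=0)
        # Estimated variance of the minibatch *mean* gradient:
        # trace of the per-example sample covariance, divided by batch size.
        var_of_mean = G.var(axis=0, ddof=1).sum() / batch
        if var_of_mean > theta * np.dot(g, g) and batch < n:
            batch = min(2 * batch, n)    # noise dominates: grow the batch
            continue                     # re-sample before stepping
        w = w - lr * g                   # signal dominates: take the step
    return w, batch

w, final_batch = big_batch_sgd(np.zeros(d))
print(f"final batch size: {final_batch}, "
      f"loss: {0.5 * np.mean((A @ w - b) ** 2):.4f}")
```

The point of the test is that once the batch gradient's noise is small relative to its magnitude, a single fixed learning rate behaves much like full-batch gradient descent, which is why adaptive batch sizes can stand in for learning rate tuning.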
This talk is part of the Isaac Newton Institute Seminar Series.