Computational Neuroscience Journal Club
If you have a question about this talk, please contact Puria Radmard.

Please join us for our fortnightly Computational Neuroscience journal club on Tuesday 24th October at 2pm UK time in the CBL seminar room, or online via Zoom. The title is ‘Dynamics of Learning in Deep Linear Networks’, presented by Rui Xia and Edward Young.

Zoom information: https://eng-cam.zoom.us/j/84204498431?pwd=Um1oU284b1YxWThObGw4ZU9XZitWdz09
Meeting ID: 842 0449 8431
Passcode: 684140

Deep neural networks work remarkably well across a wide range of applications, yet we still lack a theoretical understanding of their learning dynamics. One open mystery is why deep networks generalize well even when heavily overparameterized. A prominent explanation is that gradient-based optimization induces an implicit regularization: a bias towards models of low complexity. We will first present [1], which studies the implicit regularization of gradient descent on deep linear networks and shows that adding depth to a matrix factorization strengthens an implicit tendency towards low-rank solutions. We will then cover [2], which derives closed-form analytical expressions for the learning dynamics of such deep linear networks. Time permitting, we will examine [3], which applies this theory to the development of semantic categories. (A small illustrative simulation of the first two results is included below, after the references.)

[1] Arora, S., Cohen, N., Hu, W., & Luo, Y. (2019). Implicit regularization in deep matrix factorization. Advances in Neural Information Processing Systems, 32.

[2] Saxe, A. M., McClelland, J. L., & Ganguli, S. (2013). Exact solutions to the nonlinear dynamics of learning in deep linear neural networks. arXiv preprint arXiv:1312.6120.

[3] Saxe, A. M., McClelland, J. L., & Ganguli, S. (2019). A mathematical theory of semantic development in deep neural networks. Proceedings of the National Academy of Sciences, 116(23), 11537-11546.

This talk is part of the Computational Neuroscience series.
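The sketch below is not from the talk materials; it is a minimal illustration of the two phenomena, with all settings (dimensions, depth, learning rate, initialization scale) chosen as assumptions for demonstration. It runs full-batch gradient descent on a depth-3 linear network W3 W2 W1 fitted to a rank-2 target. The trailing singular values of the end-to-end map should stay near zero, reflecting the implicit low-rank bias studied in [1], while the two target modes are learned sequentially along sigmoidal trajectories of the kind derived analytically in [2].

```python
import numpy as np

rng = np.random.default_rng(0)
n, lr, steps = 20, 0.05, 3000

# Rank-2 target with well-separated singular values (3 and 1).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
target = U[:, :2] @ np.diag([3.0, 1.0]) @ V[:, :2].T

# Small random initialization of the depth-3 factorization W3 @ W2 @ W1;
# small, near-balanced initialization is the regime analyzed in [2].
Ws = [0.02 * rng.standard_normal((n, n)) for _ in range(3)]

for t in range(steps):
    W1, W2, W3 = Ws
    err = W3 @ W2 @ W1 - target  # residual of the end-to-end map
    # Gradients of 0.5 * ||W3 W2 W1 - target||_F^2 w.r.t. each factor.
    grads = [(W3 @ W2).T @ err,   # dL/dW1
             W3.T @ err @ W1.T,   # dL/dW2
             err @ (W2 @ W1).T]   # dL/dW3
    for W, g in zip(Ws, grads):
        W -= lr * g               # in-place full-batch gradient step
    if t % 500 == 0:
        svals = np.linalg.svd(W3 @ W2 @ W1, compute_uv=False)
        print(f"step {t:4d}  top singular values: {np.round(svals[:4], 3)}")

# Expected behaviour with these illustrative settings: the leading mode
# rises towards 3 first, then the second towards 1 (stage-like, sigmoidal
# dynamics as in [2]); the remaining singular values stay near zero
# (the implicit low-rank bias of [1]).
```

Adding more factors to the chain should sharpen the low-rank bias further; the observation that depth strengthens this implicit tendency is the central result of [1].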