Scalable Approaches to Self-Supervised Learning using Spectral Analysis
If you have a question about this talk, please contact James Allingham. The Zoom link is available upon request (it is sent out on our mailing list, eng-mlg-rcc [at] lists.cam.ac.uk). Sign up to our mailing list for easier reminders.

Learning the principal eigenfunctions of an operator is a fundamental problem in various machine learning tasks, from representation learning to Gaussian processes. However, traditional non-parametric solutions suffer from scalability issues, rendering them impractical on large datasets. This reading group will discuss parametric approaches that approximate eigendecompositions using neural networks. In particular, Spectral Inference Networks (SpIN) offer a scalable method for approximating eigenfunctions of symmetric operators on high-dimensional function spaces using bi-level optimization and gradient masking (Pfau et al., 2019). A recent improvement on SpIN, called NeuralEF, focuses on approximating eigenfunction expansions of kernels (Deng et al., 2022a). The method is applied to modern neural-network-based kernels (GP-NN and NTK), as well as to scaling up the linearised Laplace approximation for deep networks (Deng et al., 2022a). Finally, self-supervised learning can be expressed as approximating a contrastive kernel, which allows NeuralEF to learn structured representations (Deng et al., 2022b).

References:
- David Pfau, Stig Petersen, Ashish Agarwal, David G. T. Barrett, and Kimberly L. Stachenfeld. "Spectral inference networks: Unifying deep and spectral learning." ICLR (2019).
- Zhijie Deng, Jiaxin Shi, and Jun Zhu. "NeuralEF: Deconstructing kernels by deep neural networks." ICML (2022a).
- Zhijie Deng, Jiaxin Shi, Hao Zhang, Peng Cui, Cewu Lu, and Jun Zhu. "Neural eigenfunctions are structured representation learners." arXiv preprint arXiv:2210.12637 (2022b).

This talk is part of the Machine Learning Reading Group @ CUED series.
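To make the scalability issue concrete, here is a minimal NumPy sketch of the traditional non-parametric approach the abstract contrasts against: estimating the top eigenfunctions of a kernel by eigendecomposing the full Gram matrix (Nyström-style). The RBF kernel, data sizes, and lengthscale below are illustrative assumptions, not taken from the talk; the point is that the memory cost is O(n²) and the eigendecomposition is O(n³), which is what motivates parametric methods such as SpIN and NeuralEF.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel k(x, y) = exp(-||x - y||^2 / (2 l^2)).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * lengthscale ** 2))

rng = np.random.default_rng(0)
n, d = 500, 2
X = rng.normal(size=(n, d))            # n data points in d dimensions

K = rbf_kernel(X, X)                   # n x n Gram matrix: O(n^2) memory
eigvals, eigvecs = np.linalg.eigh(K)   # full eigendecomposition: O(n^3) time

# Nystrom-style estimates of the top-k eigenfunctions evaluated at the
# training points (eigh returns eigenvalues in ascending order, so reverse).
k = 5
top_funcs = eigvecs[:, -k:][:, ::-1] * np.sqrt(n)   # scaled eigenvectors
top_vals = eigvals[-k:][::-1] / n                   # eigenvalue estimates
```

Everything here is coupled to the n training points: evaluating the eigenfunctions elsewhere requires extra Nyström extension formulas, and doubling n roughly octuples the decomposition cost. The parametric methods discussed in the talk instead train a neural network to output the eigenfunctions directly, so they can be optimized with minibatches and evaluated at arbitrary inputs.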