Greedy-LASSO, Greedy-Net: Generalization and unrolling of greedy sparse recovery algorithms
If you have a question about this talk, please contact Matthew Colbrook.

Sparse recovery aims at reconstructing a sparse vector from linear measurements taken via a measurement (or sensing) matrix, which is typically underdetermined. Greedy (and thresholding) sparse recovery algorithms are known to serve as a suitable alternative to convex optimization techniques, in particular in low-sparsity regimes. In this talk, I take orthogonal matching pursuit (OMP) as an example and establish a connection between OMP and convex optimization decoders on one side, and neural networks on the other. To achieve the former, we adopt a loss function-based perspective and propose a framework based on OMP that leads to greedy algorithms for a large class of loss functions, including the well-known (weighted-)LASSO family, with explicit formulas for the choice of the "greedy selection criterion". We show numerically that these greedy algorithms inherit properties of their ancestor convex decoders.

In the second part of the talk, we leverage "soft-sorting" to resolve the non-differentiability of OMP due to (arg)sorting, and thereby derive a differentiable version of OMP that we call "Soft-OMP", which we show numerically and theoretically to be a good approximation of OMP. We then unroll iterations of OMP onto layers of a neural network whose trainable weights capture the structure within the data. In doing so, we also connect our approach to learning weights in weighted sparse recovery. I will conclude the talk by presenting implications of our framework for other greedy algorithms such as CoSaMP and IHT, and highlight some open problems.

This is joint work with Simone Brugiapaglia (Concordia University) and Matthew Colbrook (University of Cambridge).

This talk is part of the Applied and Computational Analysis series.
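As background for the talk, the classical OMP iteration and the relaxation of its selection step can be sketched as follows. This is a minimal illustration, not the speaker's implementation: the greedy selection criterion shown is the standard correlation rule, and the `soft_select` helper approximates the soft-selection idea with a plain softmax rather than the softsort operator used in Soft-OMP.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y ≈ A @ x.
    Uses the classical greedy selection criterion: the column of A most
    correlated with the current residual."""
    m, n = A.shape
    support = []
    r = y.copy()
    for _ in range(k):
        # greedy selection step (non-differentiable argmax)
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in support:
            support.append(j)
        # least-squares fit on the current support, then update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

def soft_select(A, r, tau=0.1):
    """Illustrative differentiable surrogate for the argmax selection:
    a softmax over correlations, concentrating on the largest entry as
    tau -> 0. (Soft-OMP itself is built on softsorting; this is a
    simplified stand-in for illustration.)"""
    c = np.abs(A.T @ r)
    w = np.exp(c / tau)
    return w / w.sum()
```

Replacing the hard argmax with such a smooth weighting is what makes each iteration differentiable, so the unrolled iterations can serve as layers of a trainable network.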