
Implicit Regularization for Optimal Sparse Recovery


If you have a question about this talk, please contact Dr Sergio Bacallado.

We present an implicit regularization scheme for gradient descent methods applied to unpenalized least squares regression, used to reconstruct a sparse signal from an underdetermined system of linear measurements under a restricted isometry assumption. For a given parameterization that yields a non-convex optimization problem, we show that prescribed choices of initialization, step size, and stopping time yield a statistically and computationally optimal algorithm: it achieves the minimax rate at a computational cost that matches, up to poly-logarithmic factors, the cost of reading the data. Beyond minimax optimality, we show that the algorithm adapts to instance difficulty, yielding a dimension-independent rate when the signal-to-noise ratio is high enough. We validate these findings with numerical experiments and compare the algorithm against explicit $\ell_1$ penalization. Going from hard instances to easy ones, the algorithm undergoes a phase transition, eventually matching least squares with oracle knowledge of the true support.

(based on joint work with Patrick Rebeschini and Tomas Vaskevicius)
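
Below is a minimal sketch of the kind of scheme the abstract describes, assuming the Hadamard (quadratic) parameterization $\beta = u \odot u - v \odot v$, a common choice in this line of work; the abstract does not fix the parameterization, and the initialization scale, step size, and iteration count used here are illustrative stand-ins for the prescribed choices.

    import numpy as np

    def sparse_recovery_gd(X, y, alpha=1e-6, eta=0.1, n_iter=2000):
        # Gradient descent on the unpenalized least squares objective
        #   f(u, v) = ||X (u*u - v*v) - y||^2 / (2n)
        # under the (assumed) Hadamard parameterization beta = u*u - v*v.
        # The small initialization (alpha) and early stopping (n_iter)
        # act as implicit regularizers biasing iterates toward sparsity;
        # the talk prescribes these choices precisely, the defaults here
        # are only illustrative.
        n, d = X.shape
        u = np.full(d, alpha)
        v = np.full(d, alpha)
        for _ in range(n_iter):
            r = X @ (u * u - v * v) - y   # residual
            g = (2.0 / n) * (X.T @ r)     # shared gradient factor
            u -= eta * g * u              # df/du = g * u  (elementwise)
            v += eta * g * v              # df/dv = -g * v (elementwise)
        return u * u - v * v

    # Toy usage: k-sparse signal, Gaussian design, n << d.
    rng = np.random.default_rng(0)
    n, d, k = 100, 500, 5
    X = rng.standard_normal((n, d))
    beta_true = np.zeros(d)
    beta_true[:k] = 1.0
    y = X @ beta_true + 0.01 * rng.standard_normal(n)
    beta_hat = sparse_recovery_gd(X, y)

Under this parameterization the updates are multiplicative, so coordinates outside the true support stay close to their tiny initialization while supported coordinates grow geometrically; stopping before the noise is fit supplies the regularization in place of an explicit $\ell_1$ penalty.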

This talk is part of the Statistics series.
