
Sharpness, Restart and Compressed Sensing Performance

STSW01 - Theoretical and algorithmic underpinnings of Big Data

Joint work with Vincent Roulet (University of Washington) and Nicolas Boumal (Princeton University). We show that several classical quantities controlling compressed sensing performance directly match parameters controlling algorithmic complexity. We first describe restart schemes that make first-order methods linearly convergent under a sharpness assumption. The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically, and we show that sharpness directly controls the performance of restart schemes. For sparse recovery problems, sharpness at the optimum can be written as a condition number, given by the ratio between the true signal's sparsity and the largest sparsity recoverable from the observation matrix. Overall, this means that in compressed sensing problems, a single parameter directly controls both computational complexity and recovery performance.
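A minimal sketch of the kind of restart scheme the abstract describes, not the authors' exact algorithm: a projected subgradient method for l1-minimization (a standard sparse recovery formulation) is rerun from its own output with a geometrically decreasing step size. All problem sizes and step-size choices below are illustrative assumptions; under a sharpness bound of the form f(x) - f* >= mu * dist(x, X*), such restarts yield linear convergence.

```python
import numpy as np

# Illustrative sparse-recovery instance (sizes are assumptions):
# recover a 3-sparse signal from m = 30 Gaussian measurements
# by solving  min ||x||_1  s.t.  Ax = b.
rng = np.random.default_rng(0)
n, m, s = 60, 30, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
b = A @ x_true

# Projector onto the affine feasible set {x : Ax = b}.
P = A.T @ np.linalg.inv(A @ A.T)

def project(x):
    return x - P @ (A @ x - b)

def subgradient_round(x, step, iters):
    """Projected subgradient method on f(x) = ||x||_1 with a fixed step,
    returning the best (lowest-objective) feasible iterate seen."""
    best = x
    for _ in range(iters):
        x = project(x - step * np.sign(x))   # sign(x) is a subgradient of ||.||_1
        if np.linalg.norm(x, 1) < np.linalg.norm(best, 1):
            best = x
    return best

# Restart scheme: rerun the inner method from its last output with a
# halved step size. Sharpness of the l1 objective around the sparse
# optimum makes each restart shrink the error by a constant factor.
x = project(np.zeros(n))
step = 0.25
for _ in range(14):
    x = subgradient_round(x, step, iters=300)
    step *= 0.5

print("recovery error:", np.linalg.norm(x - x_true))
```

Without restarts, the subgradient method converges only sublinearly; the geometric step-size schedule is what converts the sharpness assumption into linear convergence, mirroring the talk's claim that one parameter governs both recovery and complexity.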

This talk is part of the Isaac Newton Institute Seminar Series series.



© 2006-2018, University of Cambridge.