Computational Complexity versus Statistical Performance on Sparse Recovery Problems
Seminar

Sparse recovery problems have received a great deal of attention from various perspectives. On one side, an extensive literature explores the limits of recovery performance. On the other, a long list of algorithms now solves these problems very efficiently. Early on, it was noticed empirically that recovery problems that are easier to solve from a statistical point of view (i.e., when more samples are available) are also easier to solve numerically. Here, we show that these two aspects are indeed intimately related. As a first step, we study the exact recovery case and show that the null space property (NSP) introduced by Cohen et al. can be seen as a sharpness measure on the optimum of the sparse recovery problem. On one hand, this allows us to develop linearly convergent restart schemes whose rate depends on this sharpness. On the other hand, we recall how the NSP is linked to the recovery threshold of the sensing operator for random designs, thus producing a clear link between statistical and computational performance.

We then analyze the underlying conic geometry of recovery problems. Robust recovery performance is controlled by a minimal conically restricted singular value. We recall Renegar's condition number and show how it affects both the computational complexity of optimality certificates for exact recovery and the linear convergence rate of restart schemes. By observing that the minimal conically restricted singular value matches the worst-case value of Renegar's condition number on sparse signals, we provide further evidence that a single quantity controls both the computational and statistical aspects of recovery problems.
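To make the sharpness statement concrete, here is a minimal sketch in standard notation (the symbols A, b, s, α and the exact constant below are our own choices for illustration, not taken from the talk). The exact recovery problem is

    \min_x \|x\|_1 \quad \text{subject to} \quad Ax = b.

The null space property of order s with constant α < 1 asks that

    \|z_S\|_1 \le \alpha \, \|z_{S^c}\|_1 \quad \text{for all } z \in \mathrm{Null}(A) \text{ and all supports } S \text{ with } |S| \le s.

Under this condition, one can show a sharpness bound of the form

    \|x\|_1 - \|x^\star\|_1 \ \ge\ c(\alpha)\, \|x - x^\star\|_1 \quad \text{for all } x \text{ with } Ax = b,

where x^\star is the s-sparse signal to recover and c(α) is a constant depending only on α (for instance, of the order of (1-α)/(1+α)). It is this lower bound on the growth of the objective away from the optimum that drives the linearly convergent restart schemes, with a rate governed by c(α).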
