Estimating Unknown Sparsity in Compressed Sensing
Authors: Miles Lopes
Conference: Proceedings of the 30th International Conference on Machine Learning (ICML-13)
Year: 2013
Pages: 217-225
Abstract: In the theory of compressed sensing (CS), the sparsity $\|x\|_0$ of the unknown signal $x\in\mathbb{R}^p$ is commonly assumed to be a known parameter. However, it is typically unknown in practice. Due to the fact that many aspects of CS depend on knowing $\|x\|_0$, it is important to estimate this parameter in a data-driven way. A second practical concern is that $\|x\|_0$ is a highly unstable function of $x$. In particular, for real signals with entries not exactly equal to 0, the value $\|x\|_0=p$ is not a useful description of the effective number of coordinates. In this paper, we propose to estimate a stable measure of sparsity $s(x):=\|x\|_1^2/\|x\|_2^2$, which is a sharp lower bound on $\|x\|_0$. Our estimation procedure uses only a small number of linear measurements, does not rely on any sparsity assumptions, and requires very little computation. A confidence interval for $s(x)$ is provided, and its width is shown to have no dependence on the signal dimension $p$. Moreover, this result extends naturally to the matrix recovery setting, where a soft version of matrix rank can be estimated with analogous guarantees. Finally, we show that the use of randomized measurements is essential to estimating $s(x)$. This is accomplished by proving that the minimax risk for estimating $s(x)$ with deterministic measurements is large when $n\ll p$.
[pdf] [BibTeX]
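The sketch below is a minimal illustration of the quantity defined in the abstract: it shows why $s(x)=\|x\|_1^2/\|x\|_2^2$ is a stable lower bound on $\|x\|_0$, and how, in principle, a ratio of norm estimates can be formed from a small number of random linear measurements. The Cauchy/Gaussian measurement scheme and the names used here (numerical_sparsity, n, k) are illustrative assumptions, not the paper's exact estimator or its confidence-interval construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def numerical_sparsity(x):
    """s(x) = ||x||_1^2 / ||x||_2^2, a stable lower bound on ||x||_0."""
    x = np.asarray(x, dtype=float)
    return np.linalg.norm(x, 1) ** 2 / np.linalg.norm(x, 2) ** 2

# Signal with k dominant coordinates plus tiny perturbations: ||x||_0 = p,
# yet s(x) stays near the effective number of large coordinates.
p, k = 10_000, 10
x = np.zeros(p)
x[:k] = 1.0
x += 1e-6 * rng.standard_normal(p)
print(np.count_nonzero(x), numerical_sparsity(x))  # ~p vs. ~10

# Rough estimate of s(x) from n << p random linear measurements (illustrative only):
# Cauchy measurement vectors give ||x||_1 via the median of |y|, since
# a^T x ~ ||x||_1 * Cauchy(0,1) and median(|Cauchy(0,1)|) = 1;
# Gaussian measurement vectors give ||x||_2^2 via the mean of y^2,
# since a^T x ~ N(0, ||x||_2^2).
n = 200
y1 = rng.standard_cauchy((n, p)) @ x
y2 = rng.standard_normal((n, p)) @ x
l1_hat = np.median(np.abs(y1))
l2sq_hat = np.mean(y2 ** 2)
print(l1_hat ** 2 / l2sq_hat)  # crude estimate of s(x)
```

Note that the nonzero count jumps to $p$ under arbitrarily small perturbations, while the value of numerical_sparsity stays near $k$; the measurement-based estimate above is only a plug-in ratio of norm estimates and carries no distribution-free guarantee, unlike the confidence interval developed in the paper.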
