
Equivalence principle for optimization of sparse versus low-spread representations for signal estimation in noise

Authors: Radu V. Balan, Justinian Rosca, Scott Rickard


Publisher: John Wiley and Sons
Year: 2005
Language: English
File size: 248 KB
Volume: 15
Category: Article
ISSN: 0899-9457




Abstract

Estimation of a sparse signal representation, one with the minimum number of nonzero components, is hard. In this paper, we show that for a nontrivial set of the input data the corresponding optimization problem is equivalent to, and can be solved by, an algorithm devised for a simpler optimization problem. The simpler optimization problem corresponds to estimation of signals under a low-spread constraint. The goal of the two optimization problems is to minimize the Euclidean norm of the linear approximation error with an l^p penalty on the coefficients, for p = 0 (sparse) and p = 1 (low-spread), respectively. The l^0 problem is hard, whereas the l^1 problem can be solved efficiently by an iterative algorithm. Here we precisely define the l^0 optimization problem, construct an associated l^1 optimization problem, and show that for a set of input data with open interior the optimizers of the two optimization problems have the same support. The associated l^1 optimization problem is used to find the support of the l^0 optimizer. Once the support of the l^0 problem is known, the actual solution is easily found by solving a linear system of equations. However, our approach does not solve the harder optimization problem for all input data and thus may fail to produce the optimal solution in some cases. © 2005 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 15, 10–17, 2005; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.20034
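The abstract's pipeline — solve the tractable l^1 problem to find a support, then solve a linear least-squares system restricted to that support — can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the iterative l^1 solver here is plain iterative soft-thresholding (ISTA), and the function names `ista` and `sparse_estimate` are hypothetical.

```python
import numpy as np

def ista(A, b, lam, n_iter=500):
    """Iterative soft-thresholding for min (1/2)||Ax - b||^2 + lam*||x||_1.

    One standard iterative algorithm for the l^1 (low-spread) problem;
    assumed here for illustration only.
    """
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)                # gradient of the smooth term
        z = x - g / L                        # gradient descent step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def sparse_estimate(A, b, lam=0.05, tol=1e-6):
    """l^1 solve to locate the support, then least squares on that support."""
    x1 = ista(A, b, lam)
    support = np.flatnonzero(np.abs(x1) > tol)
    x0 = np.zeros(A.shape[1])
    if support.size:
        # Linear system restricted to the recovered support columns.
        x0[support], *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
    return x0, support
```

The debiasing step (least squares on the support) removes the shrinkage that the l^1 penalty imposes on the nonzero coefficients, which is why knowing the support alone suffices to recover the l^0 solution.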