Just Relax ❦
Convex Programming Methods for Subset Selection and Sparse Approximation
Joel A. Tropp <jtropp@ices.utexas.edu>
The University of Texas at Austin
Subset Selection ❦
❧ Work in the finite-dimensional inner-product space C^d
❧ Let { ϕ_ω : ω ∈ Ω } be a dictionary of unit-norm elementary signals
❧ Suppose s is an arbitrary input signal from C^d
❧ Let τ be a fixed, positive threshold
❧ The subset selection problem is to solve
    min_{c ∈ C^Ω}  ‖ s − Σ_{ω∈Ω} c_ω ϕ_ω ‖₂² + τ² ‖c‖₀
❧ The problem arose in statistics more than 50 years ago
❧ Reference: [Miller 2002]
Applications ❦
❧ Linear regression
❧ Lossy compression of audio, images, and video
❧ De-noising functions
❧ Detection and estimation of superimposed signals
❧ Regularization of linear inverse problems
❧ Approximation of functions by low-cost surrogates
❧ Sparse pre-conditioners for conjugate gradient solvers
❧ . . .
Convex Relaxation ❦
Subset selection is combinatorial:
    min_{c ∈ C^Ω}  ‖ s − Σ_{ω∈Ω} c_ω ϕ_ω ‖₂² + τ² ‖c‖₀
❧ References: [Natarajan 1995, Davis et al. 1997]
Replace it with a convex program:
    min_{b ∈ C^Ω}  ½ ‖ s − Σ_{ω∈Ω} b_ω ϕ_ω ‖₂² + γ ‖b‖₁
❧ Can be solved in polynomial time with standard software
❧ Reference: [Chen et al. 1999]
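The relaxation above is the ℓ1-penalized least-squares objective of [Chen et al. 1999]. The slides do not prescribe a solver; one standard choice (an assumption here, not part of the talk) is iterative soft thresholding (ISTA). A minimal numpy sketch, assuming the dictionary is stored as a matrix Phi whose columns are the unit-norm elements ϕ_ω:

```python
import numpy as np

def ista(Phi, s, gamma, n_iters=5000):
    """Iterative soft thresholding for min_b 0.5*||s - Phi b||_2^2 + gamma*||b||_1."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2       # 1/||Phi||^2 keeps the iteration stable
    b = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        r = b + step * (Phi.T @ (s - Phi @ b))     # gradient step on the quadratic term
        b = np.sign(r) * np.maximum(np.abs(r) - step * gamma, 0.0)  # soft threshold
    return b

# Tiny demo: a sparse signal in a random dictionary with unit-norm columns
rng = np.random.default_rng(0)
Phi = rng.standard_normal((10, 20))
Phi /= np.linalg.norm(Phi, axis=0)
c_true = np.zeros(20)
c_true[[3, 11]] = [1.5, -2.0]
s = Phi @ c_true
gamma = 0.1
b = ista(Phi, s, gamma)
```

At a minimizer, the residual correlations Phi.T @ (s − Phi b) lie in the subdifferential of γ‖b‖₁, which gives a cheap optimality check on the output.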
Why an ℓ1 penalty? ❦
[Figure: the ℓ0 quasi-norm, the ℓ1 norm, and the ℓ2 norm compared]
Why two different forms? ❦
Subset Selection:
    min_{c ∈ C^Ω}  ‖ s − Σ_{ω∈Ω} c_ω ϕ_ω ‖₂² + τ² ‖c‖₀
Convex Relaxation:
    min_{b ∈ C^Ω}  ½ ‖ s − Σ_{ω∈Ω} b_ω ϕ_ω ‖₂² + γ ‖b‖₁
Explanation, Part I ❦
❧ If the dictionary is orthonormal, the ℓ0 problem has an analytic solution
❧ Compute inner products between the signal and the dictionary: c_ω = ⟨ s, ϕ_ω ⟩
❧ Apply the hard threshold operator with cutoff τ to each coefficient
Explanation, Part II ❦
❧ If the dictionary is orthonormal, the ℓ1 problem has an analytic solution
❧ Compute inner products between the signal and the dictionary: b_ω = ⟨ s, ϕ_ω ⟩
❧ Apply the soft threshold operator with cutoff γ to each coefficient
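The two analytic solutions above differ only in the thresholding rule. A short numpy sketch of both, using a random orthogonal matrix Q as a stand-in orthonormal dictionary (the matrix and the sample thresholds are illustrative choices, not from the slides):

```python
import numpy as np

def hard_threshold(c, tau):
    """Keep coefficients whose magnitude exceeds tau; zero out the rest."""
    return np.where(np.abs(c) > tau, c, 0.0)

def soft_threshold(c, gamma):
    """Shrink every coefficient toward zero by gamma; small ones become zero."""
    return np.sign(c) * np.maximum(np.abs(c) - gamma, 0.0)

# Orthonormal dictionary: any orthogonal matrix works; take Q from a QR factorization
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))

s = rng.standard_normal(8)
c = Q.T @ s                       # inner products <s, phi_w>

tau = gamma = 0.5
c_l0 = hard_threshold(c, tau)     # solves the l0 problem with threshold tau
b_l1 = soft_threshold(c, gamma)   # solves the l1 problem with cutoff gamma
```

Hard thresholding keeps the surviving coefficients unchanged, while soft thresholding additionally shrinks them by γ; that shrinkage is the price of convexity.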
The Coherence Parameter ❦
Insight: subset selection is easy provided that the dictionary is nearly orthonormal.
❧ [Donoho–Huo 2001] introduces the coherence parameter
    µ ≝ max_{λ ≠ ω} |⟨ ϕ_λ, ϕ_ω ⟩|
❧ Related to the packing radius of the dictionary, viewed as a subset of P^{d−1}(C)
❧ Possible to have |Ω| = d² and µ = 1/√d
An Incoherent Dictionary ❦
[Figure: a dictionary combining impulses and complex exponentials; distinct elements have inner products of magnitude at most 1/√d, versus 1 for an element with itself]
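The figure's impulse-plus-exponential dictionary is easy to check numerically. The sketch below builds the d impulses alongside the d normalized complex exponentials (columns of the unitary DFT matrix) and computes µ directly from the Gram matrix; this gives |Ω| = 2d with coherence exactly 1/√d (the slides' |Ω| = d² example with the same coherence requires a larger construction not shown here).

```python
import numpy as np

d = 16
# Impulses: columns of the identity. Complex exponentials: columns of the
# DFT matrix, scaled by 1/sqrt(d) so every element has unit norm.
F = np.fft.fft(np.eye(d)) / np.sqrt(d)
Phi = np.hstack([np.eye(d), F])

G = np.abs(Phi.conj().T @ Phi)    # magnitudes of all pairwise inner products
np.fill_diagonal(G, 0.0)          # the coherence ignores lambda == omega
mu = G.max()                      # coherence parameter of the dictionary
```

Every impulse–exponential pair has inner-product magnitude 1/√d, while the impulses and the exponentials are each orthonormal among themselves, so the maximum off-diagonal entry is 1/√d.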
Result for Subset Selection ❦
Theorem A. Fix an input signal and a threshold τ. Suppose that
❧ c_opt solves the subset selection problem with threshold τ;
❧ c_opt contains no more than (1/3) µ⁻¹ nonzero components; and
❧ b⋆ solves the convex relaxation with γ = 2τ.
Then it follows that
❧ c_opt(ω) = 0 implies b⋆(ω) = 0;
❧ |b⋆(ω) − c_opt(ω)| ≤ 3τ for each ω;
❧ in particular, b⋆(ω) ≠ 0 so long as |c_opt(ω)| > 3τ; and
❧ the relaxation has a unique solution.
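In the orthonormal case µ = 0, so the sparsity condition of Theorem A is vacuous and both problems reduce to the threshold solutions of the earlier slides; the theorem's conclusions can then be sanity-checked directly. A small numerical sketch (the random dictionary and τ are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((12, 12)))   # orthonormal dictionary
s = rng.standard_normal(12)
c = Q.T @ s                                          # coefficients <s, phi_w>

tau = 0.4
gamma = 2 * tau                                      # gamma = 2*tau, as in Theorem A
c_opt = np.where(np.abs(c) > tau, c, 0.0)                   # hard threshold: l0 solution
b_star = np.sign(c) * np.maximum(np.abs(c) - gamma, 0.0)    # soft threshold: l1 solution
```

Support containment holds because soft thresholding at 2τ kills every coefficient that hard thresholding at τ kills; the coefficient error never exceeds 3τ because surviving entries are shrunk by 2τ and discarded entries had magnitude at most 2τ.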
Error-Constrained Sparse Approximation ❦
❧ Suppose s is an arbitrary input signal from C^d
❧ Let ε be a fixed, positive error tolerance
❧ The error-constrained sparse approximation problem is
    min_{c ∈ C^Ω}  ‖c‖₀   subject to   ‖ s − Σ_{ω∈Ω} c_ω ϕ_ω ‖₂ ≤ ε
❧ Its convex relaxation is
    min_{b ∈ C^Ω}  ‖b‖₁   subject to   ‖ s − Σ_{ω∈Ω} b_ω ϕ_ω ‖₂ ≤ δ
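To make concrete why the ℓ0 problem is combinatorial, here is a brute-force solver that enumerates supports of increasing size; its cost is exponential in |Ω|, which is exactly what the convex relaxation avoids. A sketch for tiny dictionaries only (the demo dictionary and tolerance are illustrative choices):

```python
import numpy as np
from itertools import combinations

def sparse_approx_l0(Phi, s, eps):
    """Brute-force the error-constrained l0 problem: return the sparsest c with
    ||s - Phi c||_2 <= eps.  Exponential in the dictionary size -- tiny cases only."""
    d, N = Phi.shape
    for k in range(N + 1):                       # supports of increasing size
        for supp in combinations(range(N), k):
            c = np.zeros(N)
            if k > 0:
                cols = list(supp)
                c[cols] = np.linalg.lstsq(Phi[:, cols], s, rcond=None)[0]
            if np.linalg.norm(s - Phi @ c) <= eps:
                return c                          # first feasible hit minimizes ||c||_0
    return None

# Demo: a signal built from two dictionary elements is recovered on a 2-element support
rng = np.random.default_rng(0)
Phi = rng.standard_normal((5, 8))
Phi /= np.linalg.norm(Phi, axis=0)
s = 2.0 * Phi[:, 1] - 1.0 * Phi[:, 6]
c = sparse_approx_l0(Phi, s, eps=1e-6)
```

Even at |Ω| = 8 this loop visits up to 2^8 supports; at realistic dictionary sizes the enumeration is hopeless, which motivates the relaxation on this slide.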
Result for Sparse Approximation ❦
Theorem B. Fix an input signal, and let m ≤ (1/3) µ⁻¹. Suppose that
❧ c_opt solves the sparse approximation problem with tolerance ε;
❧ c_opt contains no more than m nonzero components; and
❧ b⋆ solves the convex relaxation with tolerance δ = ε √(1 + 6m).
Then it follows that
❧ c_opt(ω) = 0 implies b⋆(ω) = 0;
❧ ‖ b⋆ − c_opt ‖₂ ≤ δ √(3/2); and
❧ the relaxation has a unique solution.
[Donoho et al. 2004] contains related results.
For more information. . . ❦
Just Relax: Convex Programming Methods for Subset Selection and Sparse Approximation
Available from <http://www.ices.utexas.edu/~jtropp/> or write to <jtropp@ices.utexas.edu>
Other work. . .
❧ Greedy and iterative algorithms for sparse approximation
❧ Other types of sparse approximation
❧ Construction of packings in Grassmannian manifolds
❧ Matrix nearness and inverse eigenvalue problems