  1. The Sparsity Gap ❦
     Joel A. Tropp
     Computing & Mathematical Sciences, California Institute of Technology
     jtropp@acm.caltech.edu
     Research supported in part by ONR

  2. Introduction (The Sparsity Gap, Casazza Birthday Conference, College Park, 20 May 2010)

  3. Systems of Linear Equations
     We consider linear systems of the form Φx = b, where the matrix Φ is m × N, the coefficient vector x has length N, and the observation b has length m.
     Assume that
     ❧ Φ has dimensions m × N with N > m
     ❧ Φ has full row rank
     ❧ The columns of Φ have unit ℓ₂ norm
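As a concrete illustration of this setup (my own sketch, not from the slides), the following NumPy snippet builds a random m × N system with full row rank and unit ℓ₂-norm columns; the Gaussian choice of Φ and the particular dimensions are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
m, N = 20, 50                      # underdetermined: N > m

Phi = rng.standard_normal((m, N))  # assumed Gaussian example matrix
Phi /= np.linalg.norm(Phi, axis=0) # rescale columns to unit l2 norm

x = np.zeros(N)
x[[3, 17, 42]] = rng.standard_normal(3)   # a 3-sparse coefficient vector
b = Phi @ x

assert np.linalg.matrix_rank(Phi) == m    # full row rank (holds almost surely)
assert np.allclose(np.linalg.norm(Phi, axis=0), 1.0)
```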

  4. The Trichotomy Theorem
     Theorem 1. For a linear system Φx = b, exactly one of the following situations occurs.
     1. No solution exists.
     2. The equation has a unique solution.
     3. The solutions form an affine subspace of positive dimension.

  5. Regularization via Sparsity
     A principled approach to underdetermined systems:
        min ‖x‖₀   subject to   Φx = b        (P0)
     where ‖x‖₀ = #supp(x) = #{j : x_j ≠ 0}.
     ❧ When ‖x‖₀ ≤ s, the vector x is called s-sparse
     ❧ If Φx = b and x is s-sparse, then x is an s-sparse representation of b
     ❧ Since Φ has full row rank, every b has an m-sparse representation
     ❧ Question: What can we say about sparser representations?
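Problem (P0) is combinatorial, but for tiny instances it can be solved by exhaustive search over supports. The sketch below is my own illustration (the helper name solve_p0 and the tolerance are assumptions); it tries supports of increasing size and returns the first consistent one.

```python
import itertools
import numpy as np

def solve_p0(Phi, b, tol=1e-8):
    """Brute-force (P0): return a sparsest x with Phi @ x = b (tiny N only)."""
    m, N = Phi.shape
    if np.linalg.norm(b) <= tol:
        return np.zeros(N)
    for s in range(1, m + 1):                        # an m-sparse solution always exists
        for S in itertools.combinations(range(N), s):
            A = Phi[:, list(S)]
            coef = np.linalg.lstsq(A, b, rcond=None)[0]
            if np.linalg.norm(A @ coef - b) <= tol * np.linalg.norm(b):
                x = np.zeros(N, dtype=coef.dtype)
                x[list(S)] = coef
                return x
```

The cost grows like the number of candidate supports, roughly N choose s, which is exactly why convex and greedy surrogates are used in practice.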

  6. Geometry

  7. Key Insight
     Sparse representations are well behaved when the matrix Φ is sufficiently nice.
     (Column submatrices should not be singular, and individual columns should not look like sparse signals.)

  8. Quantifying Niceness I
     ❧ We call Φ a tight frame when ΦΦ* = (N/m) · I
     ❧ Equivalently, the rows of Φ form an orthonormal family (up to scaling)
     ❧ Observe that N/m is the redundancy of the frame
     ❧ Tight frames have minimal spectral norm among conformal matrices
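A quick numerical check of the tight-frame identity (my own example, using m rows of an N-point DFT with columns rescaled to unit norm; any unit-norm tight frame would do):

```python
import numpy as np

m, N = 8, 32
n, k = np.meshgrid(np.arange(N), np.arange(m))      # n: column index, k: row index
Phi = np.exp(-2j * np.pi * k * n / N) / np.sqrt(m)  # m rows of the N-point DFT, unit-norm columns

G = Phi @ Phi.conj().T
print(np.allclose(G, (N / m) * np.eye(m)))          # True: Phi Phi* = (N/m) I
print("redundancy N/m =", N / m)
```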

  9. Quantifying Niceness II
     ❧ The coherence of Φ is the quantity µ = max_{j ≠ k} |⟨ϕ_j, ϕ_k⟩|
     ❧ It measures the smallest angle between pairs of columns
     ❧ When N ≥ 2m, the coherence satisfies µ ≳ 1/√m
     ❧ Incoherent matrices appear often in signal processing applications
     References: [Welch 1974, Mallat–Zhang 1993, Donoho–Huo 2001, Gribonval–Nielsen 2003, Strohmer–Heath 2003]
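The coherence is easy to compute from the Gram matrix. The sketch below (my own; the Gaussian test matrix and the helper name coherence are assumptions) compares it with the Welch lower bound √((N − m)/(m(N − 1))) and with the 1/√m scale mentioned above:

```python
import numpy as np

def coherence(Phi):
    """Largest |<phi_j, phi_k>| over distinct (unit-norm) columns."""
    G = np.abs(Phi.conj().T @ Phi)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(0)
m, N = 64, 128
Phi = rng.standard_normal((m, N))
Phi /= np.linalg.norm(Phi, axis=0)

welch = np.sqrt((N - m) / (m * (N - 1)))   # Welch lower bound on the coherence
print("coherence:", coherence(Phi), " Welch bound:", welch, " 1/sqrt(m):", 1 / np.sqrt(m))
```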

  10. Example: Identity + Fourier
      [Figure: impulses alongside complex exponentials]
      A very incoherent tight frame
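A sketch of this dictionary in NumPy (my own), confirming that it is a tight frame with redundancy 2 and that its coherence is exactly 1/√m:

```python
import numpy as np

m = 64
t, k = np.meshgrid(np.arange(m), np.arange(m))
F = np.exp(-2j * np.pi * t * k / m) / np.sqrt(m)    # unitary DFT matrix
Phi = np.hstack([np.eye(m), F])                     # m x 2m: impulses next to complex exponentials

print(np.allclose(Phi @ Phi.conj().T, 2 * np.eye(m)))   # tight frame, redundancy 2
G = np.abs(Phi.conj().T @ Phi)
np.fill_diagonal(G, 0.0)
print(np.isclose(G.max(), 1 / np.sqrt(m)))               # coherence exactly 1/sqrt(m)
```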

  11. Uniqueness

  12. Uncertainty implies Uniqueness
      Theorem 2. Suppose that a vector b has two distinct representations: Φx = b = Φy. Then
         ‖x‖₀ + ‖y‖₀ > µ⁻¹.
      Corollary 3. Suppose that b = Φx where ‖x‖₀ < ½ · µ⁻¹. Then x is the unique solution to (P0).
      ❧ Very strict requirement, since µ⁻¹ ≲ √m
      References: [Donoho–Stark 1989, Donoho–Huo 2001, Gribonval–Nielsen 2003, Donoho–Elad 2003]
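To see how strict the corollary is, here is a small check (my own) of the guaranteed level ½ · µ⁻¹ = √m/2 for the Identity + Fourier dictionary, which stays tiny compared with the ambient dimension:

```python
import numpy as np

for m in [16, 64, 256, 1024]:
    mu = 1 / np.sqrt(m)              # coherence of the Identity + Fourier dictionary
    print(f"m = {m:5d}: uniqueness guaranteed only for ||x||_0 < {0.5 / mu:6.1f}"
          f"  (dictionary has N = {2 * m} columns)")
```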

  13. The Square-Root Threshold
      ❧ Sparse representations are not necessarily unique past the √m threshold
      Example: The Dirac Comb
      ❧ Consider the Identity + Fourier matrix with m = p²
      ❧ There is a vector b that can be written as either p spikes or p sines
      ❧ By the Poisson summation formula,
            b(t) = Σ_{j=0}^{p−1} δ_{pj}(t) = (1/√m) Σ_{j=0}^{p−1} e^{−2πi·pjt/m}   for t = 0, 1, ..., m − 1
      References: [Donoho–Stark 1989]
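A numerical check of the comb identity (my own sketch): the same vector b equals p identity columns and also p Fourier columns, so it is p-sparse in two disjoint ways with p = √m.

```python
import numpy as np

p = 8
m = p * p                                   # m = p^2
t = np.arange(m)

spikes = np.zeros(m)
spikes[::p] = 1.0                           # p unit spikes with spacing p

# sum of p complex exponentials, scaled by 1/sqrt(m) as on the slide
sines = sum(np.exp(-2j * np.pi * p * j * t / m) for j in range(p)) / np.sqrt(m)

print(np.allclose(spikes, sines))           # True: the Dirac comb has both representations
```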

  14. Enter Probability
      Insight: The bad vectors are atypical

  15. An Uncertainty Principle for Generic Signals
      Theorem 4. [T 2010] Suppose that b = Φx, where the nonzero components of x have a continuous distribution. With probability one, the vector b has no representation b = Φy where supp(x) ∩ supp(y) = ∅ unless
         ‖x‖₀ + ‖y‖₀ > µ⁻¹ √(‖x‖₀).
      Corollary 5. When µ ≤ 1/√m, the condition becomes
         ‖y‖₀ > (√(m/‖x‖₀) − 1) · ‖x‖₀.
      ❧ Even with refinements, this approach does not yield uniqueness!
      ❧ Problem: Some supports could be bad
      References: [The Sparsity Gap]

  16. Enter More Probability
      Insight: The bad supports are atypical

  17. A Simple Model for Random Sparse Vectors
      Model (M0) for b = Φx:
      ❧ The matrix Φ is a unit-norm tight frame of size m × N with coherence µ ≤ c/log N
      ❧ The support of x is uniformly random and has cardinality s ≤ c·m/log N
      ❧ The nonzero entries of x have a continuous distribution
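A sketch of drawing one sample from a model of this flavor, using the Identity + Fourier frame; the constant c = 0.5 and the complex Gaussian choice for the nonzero entries are placeholder assumptions of mine, not part of the model's statement.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 256
t, k = np.meshgrid(np.arange(m), np.arange(m))
F = np.exp(-2j * np.pi * t * k / m) / np.sqrt(m)
Phi = np.hstack([np.eye(m), F])             # unit-norm tight frame with N = 2m, coherence 1/sqrt(m)
N = Phi.shape[1]

c = 0.5                                     # placeholder constant
s = int(c * m / np.log(N))                  # sparsity level s <= c m / log N
S = rng.choice(N, size=s, replace=False)    # uniformly random support
x = np.zeros(N, dtype=complex)
x[S] = rng.standard_normal(s) + 1j * rng.standard_normal(s)   # continuous nonzero entries
b = Phi @ x
```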

  18. Random Submatrices & Sparse Representation
      Theorem 6. [T 2006, 2008] Assume all parameters satisfy Model (M0). Draw a uniformly random set S of s columns from Φ, and define the random column submatrix A = Φ_S. Then
         Prob{ ‖A*A − I‖ < 1/2 } ≥ 99.72%
      and
         Prob{ max_{n ∉ S} ‖A*ϕ_n‖₂ < 1/2 } ≥ 99.72%.
      References: [Random Subdictionaries, Random Submatrices]
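A single empirical draw illustrating the two quantities in Theorem 6 (my own sketch, again using the Identity + Fourier frame; one draw is not a probability estimate):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 256
t, k = np.meshgrid(np.arange(m), np.arange(m))
F = np.exp(-2j * np.pi * t * k / m) / np.sqrt(m)
Phi = np.hstack([np.eye(m), F])
N = Phi.shape[1]

s = int(0.5 * m / np.log(N))
S = rng.choice(N, size=s, replace=False)
A = Phi[:, S]                                     # random column submatrix

print(np.linalg.norm(A.conj().T @ A - np.eye(s), 2))                  # typically below 1/2
outside = [n for n in range(N) if n not in set(S)]
print(max(np.linalg.norm(A.conj().T @ Phi[:, n]) for n in outside))   # typically below 1/2
```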

  19. The Sparsity Gap
      Theorem 7. [T 2008, 2010] Let b = Φx be a vector drawn from Model (M0). With probability at least 99.44%, the following statements hold.
      1. The vector x is the unique solution to (P0).
      2. Furthermore, there is no disjoint representation b = Φy where supp(x) ∩ supp(y) = ∅ unless
            ‖y‖₀ > (1 + 2m/N) · ‖x‖₀.
      References: [Candès–Romberg 2006, Random Subdictionaries, Random Submatrices, The Sparsity Gap]
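For a very small instance, statement 2 can be checked by brute force: search every disjoint support up to the threshold size and test whether b lies in its span. This is my own illustration, not the proof technique; with generic coefficients it should report that no such representation exists.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
m = 8
t, k = np.meshgrid(np.arange(m), np.arange(m))
F = np.exp(-2j * np.pi * t * k / m) / np.sqrt(m)
Phi = np.hstack([np.eye(m), F])                   # tiny Identity + Fourier frame, N = 2m
N = Phi.shape[1]

s = 2
S = rng.choice(N, size=s, replace=False)
x = np.zeros(N, dtype=complex)
x[S] = rng.standard_normal(s) + 1j * rng.standard_normal(s)   # generic coefficients
b = Phi @ x

limit = int((1 + 2 * m / N) * s)                  # size threshold from the theorem
others = [n for n in range(N) if n not in set(S)]

def in_span(cols, b):
    A = Phi[:, list(cols)]
    coef = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.linalg.norm(A @ coef - b) < 1e-8

found = any(in_span(T, b)
            for r in range(1, limit + 1)
            for T in itertools.combinations(others, r))
print("disjoint representation of size <=", limit, "exists:", found)   # expect False
```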

  20. To learn more...
      Web: http://www.acm.caltech.edu/~jtropp
      E-mail: jtropp@acm.caltech.edu
      Relevant papers:
      ❧ “Conditioning of random subdictionaries,” ACHA, 2008
      ❧ “Norms of random submatrices,” CRAS, 2008
      ❧ “Spikes and sines,” JFAA, 2008
      ❧ “The sparsity gap,” Proc. CISS, 2010
