Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information (slide presentation)

  1. Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information. Emmanuel Candès, California Institute of Technology. SIAM Conference on Imaging Science, Salt Lake City, Utah, May 2004. Collaborators: Justin Romberg (Caltech), Terence Tao (UCLA).

  2. Incomplete Fourier Information. Observe Fourier samples $\hat f(\omega)$ on a domain $\Omega$: 22 radial lines, roughly 8% coverage.

  3. Classical Reconstruction. Backprojection: essentially reconstruct $g^*$ by setting $\hat g^*(\omega) = \hat f(\omega)$ for $\omega \in \Omega$ and $\hat g^*(\omega) = 0$ for $\omega \notin \Omega$, then inverting the Fourier transform. [Figure: original Logan-Shepp phantom and the naive reconstruction $g^*$.] (A minimal code sketch of this backprojection follows below.)
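
A minimal sketch of this zero-filling backprojection in Python (NumPy only; the function name, arguments, and DFT conventions are illustrative assumptions, not the talk's code):

```python
import numpy as np

def backprojection(fhat_obs, omega, N):
    """Naive reconstruction: keep the observed Fourier samples on Omega,
    set every unobserved frequency to zero, and invert the DFT."""
    ghat = np.zeros(N, dtype=complex)
    ghat[np.asarray(omega)] = fhat_obs   # observed samples f-hat(w), w in Omega
    return np.fft.ifft(ghat).real        # real part for a real-valued signal
```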

  4. Interpolation? [Figure: a row of the Fourier matrix.]

  5. Total Variation Reconstruction. Reconstruct $g^*$ by solving $\min_g \|g\|_{TV}$ subject to $\hat g(\omega) = \hat f(\omega),\ \omega \in \Omega$. [Figure: original Logan-Shepp phantom and the reconstruction (min BV + nonnegativity constraint); $g^* =$ original, a perfect reconstruction.]

  6. Sparse Spike Train. A sparse sequence of $N_T$ spikes; observe $N_\Omega$ Fourier coefficients. [Figure: a sparse spike train.]

  7. Interpolation?

  8. ℓ1 Reconstruction. Reconstruct by solving $\min_g \sum_t |g_t|$ subject to $\hat g(\omega) = \hat f(\omega),\ \omega \in \Omega$. For $N_T \sim N_\Omega / 2$, we recover f perfectly. [Figure: original spike train and the ℓ1 reconstruction recovered from 30 Fourier samples.] (A code sketch follows below.)
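
A minimal sketch of this ℓ1 reconstruction for a 1D spike train. It assumes the cvxpy package for the convex program; the helper name l1_reconstruct and the toy sizes are illustrative, not from the talk:

```python
import numpy as np
import cvxpy as cp  # assumed available; any LP/convex solver would do

def l1_reconstruct(fhat, omega, N):
    """Solve min ||g||_1 subject to g-hat agreeing with f-hat on Omega.
    The complex Fourier constraints are split into real and imaginary parts."""
    F = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N)  # partial DFT rows
    g = cp.Variable(N)
    constraints = [F.real @ g == fhat.real, F.imag @ g == fhat.imag]
    cp.Problem(cp.Minimize(cp.norm1(g)), constraints).solve()
    return g.value

# Toy run: 10 spikes in a length-128 signal, 30 random Fourier samples.
rng = np.random.default_rng(0)
N, n_spikes, n_freq = 128, 10, 30
f = np.zeros(N)
f[rng.choice(N, n_spikes, replace=False)] = rng.standard_normal(n_spikes)
omega = rng.choice(N, n_freq, replace=False)
fhat = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N) @ f
print(np.abs(l1_reconstruct(fhat, omega, N) - f).max())  # near zero on success
```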

  9. Extension to TV. $\|g\|_{TV} = \sum_i |g_{i+1} - g_i|$, the ℓ1-norm of the finite differences. Given frequency observations on Ω, solving $\min_g \|g\|_{TV}$ subject to $\hat g(\omega) = \hat f(\omega),\ \omega \in \Omega$ perfectly reconstructs signals with a small number of jumps. (A companion sketch follows below.)
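
A companion sketch for the TV version, in the same spirit as the ℓ1 sketch above: the TV norm is just the ℓ1 norm of the finite differences, so only the objective changes. Again an illustration using cvxpy, not the talk's code:

```python
import numpy as np
import cvxpy as cp

def tv_reconstruct(fhat, omega, N):
    """Solve min ||g||_TV = sum_i |g_{i+1} - g_i| subject to g-hat = f-hat on Omega."""
    F = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N)  # partial DFT rows
    g = cp.Variable(N)
    objective = cp.Minimize(cp.norm1(g[1:] - g[:-1]))            # l1 norm of finite differences
    constraints = [F.real @ g == fhat.real, F.imag @ g == fhat.imag]
    cp.Problem(objective, constraints).solve()
    return g.value
```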

  10. [Figure: reconstructed perfectly from 30 Fourier samples.]

  11. Model Problem. • Signal made of |T| spikes • Observed at only |Ω| frequency locations • Extensions: piecewise-constant signals; spikes in higher dimensions (2D, 3D, etc.); piecewise-constant images; many others.

  12. Sharp Uncertainty Principles. • The signal is sparse in time, only |T| spikes. • Solve the combinatorial optimization problem (P0): $\min_g \|g\|_{\ell_0} := \#\{t : g(t) \neq 0\}$ subject to $\hat g|_\Omega = \hat f|_\Omega$. Theorem 1 (N, the sample size, prime): (i) if $|T| \leq |\Omega|/2$, then (P0) reconstructs exactly; (ii) if $|T| > |\Omega|/2$, then (P0) fails to reconstruct f exactly: there exist $f_1, f_2$ with $\|f_1\|_{\ell_0} + \|f_2\|_{\ell_0} = |\Omega| + 1$ and $\hat f_1(\omega) = \hat f_2(\omega)$ for all $\omega \in \Omega$. (A brute-force sketch of (P0) follows below.)
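
(P0) is combinatorial; a brute-force sketch makes the statement concrete. The helper below is hypothetical and exponential-time, usable only for tiny N: it searches supports of increasing size for one consistent with the observed Fourier data.

```python
from itertools import combinations
import numpy as np

def l0_reconstruct(fhat, omega, N, max_support):
    """Brute-force (P0): for k = 1, 2, ..., try every support of size k and
    keep the first one whose least-squares fit reproduces fhat exactly."""
    for k in range(1, max_support + 1):
        for T in combinations(range(N), k):
            A = np.exp(-2j * np.pi * np.outer(omega, T) / N)   # DFT columns restricted to T
            coef, *_ = np.linalg.lstsq(A, fhat, rcond=None)
            if np.linalg.norm(A @ coef - fhat) < 1e-8:
                g = np.zeros(N, dtype=complex)
                g[list(T)] = coef
                return g
    return None
```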

  13. ℓ1 Relaxation? Solve the convex optimization problem (an LP for real-valued signals): (P1) $\min_g \|g\|_{\ell_1} := \sum_t |g(t)|$ subject to $\hat g|_\Omega = \hat f|_\Omega$. • Example: Dirac's comb, $\sqrt{N}$ equispaced spikes (N a perfect square). It is invariant under the Fourier transform, $\hat f = f$ (up to normalization); one can find $|\Omega| = N - \sqrt{N}$ frequencies with $\hat f(\omega) = 0$ for all $\omega \in \Omega$; so it cannot be reconstructed. • More dramatic examples exist. • But all these examples are very special.

  14. Dirac's Comb. [Figure: the comb f(t) in time and its Fourier transform $\hat f(\omega)$, both combs of $\sqrt{N}$ equispaced spikes.] (A quick numerical check follows below.)
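
A quick numerical check of the comb's Fourier invariance (sizes are illustrative, my own sketch): the DFT of $\sqrt{N}$ equispaced spikes is again supported on $\sqrt{N}$ equispaced frequencies, so the $N - \sqrt{N}$ zero frequencies carry no information about the comb.

```python
import numpy as np

N = 144                                  # a perfect square
s = int(np.sqrt(N))
comb = np.zeros(N)
comb[::s] = 1.0                          # sqrt(N) equispaced spikes

comb_hat = np.fft.fft(comb)              # again sqrt(N) spikes (scaled)
support = np.nonzero(np.abs(comb_hat) > 1e-9)[0]
print(support)                           # multiples of sqrt(N): [0, 12, 24, ...]
print(len(support), "nonzero frequencies out of", N)
```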

  15. Main Result. Theorem 2: suppose $|T| \leq \alpha(M) \cdot \frac{|\Omega|}{\log N}$. Then min-ℓ1 reconstructs exactly with probability greater than $1 - O(N^{-M})$. (N.b. one can choose $\alpha(M) \sim [29.6(M+1)]^{-1}$.) Extensions: • |T| = number of jump discontinuities (TV reconstruction) • |T| = number of 2D or 3D spikes • |T| = number of 2D jump discontinuities (2D TV reconstruction).

  16. Heuristics: Robust Uncertainty Principles. f is the unique minimizer of (P1) iff $\sum_t |f(t)+h(t)| > \sum_t |f(t)|$ for all $h \neq 0$ with $\hat h|_\Omega = 0$. Triangle inequality: $\sum_t |f(t)+h(t)| = \sum_{t \in T} |f(t)+h(t)| + \sum_{t \in T^c} |h(t)| \geq \sum_{t \in T} (|f(t)| - |h(t)|) + \sum_{t \in T^c} |h(t)|$. Sufficient condition: $\sum_{t \in T} |h(t)| \leq \sum_{t \in T^c} |h(t)| \Leftrightarrow \sum_{t \in T} |h(t)| \leq \frac{1}{2}\|h\|_{\ell_1}$. Conclusion: f is the unique minimizer if, for all h such that $\hat h|_\Omega = 0$, it is impossible to 'concentrate' h on T.

  17. Connections: • Donoho & Stark (88) • Donoho & Huo (01) • Gribonval & Nielsen (03) • Tropp (03) and (04) • Donoho & Elad (03)

  18. Dual Viewpoint. • The convex problem has a dual. • Dual polynomial $P(t) = \sum_{\omega \in \Omega} \hat P(\omega) e^{i\omega t}$ with: $P(t) = \operatorname{sgn}(f)(t)$ for all $t \in T$; $|P(t)| < 1$ for all $t \in T^c$; $\hat P$ supported on the set Ω of visible frequencies. Theorem 3: (i) if $F_{T \to \Omega}$ is injective and there exists a dual polynomial, then the (P1) minimizer is unique and is equal to f; (ii) conversely, if f is the unique minimizer of (P1), then there exists a dual polynomial.

  19. Dual Polynomial. [Figure: the dual polynomial P(t) in space and $\hat P(\omega)$ in frequency.]

  20. Construction of the Dual Polynomial. $P(t) = \sum_{\omega \in \Omega} \hat P(\omega) e^{i\omega t}$. • P interpolates sgn(f) on T. • P has minimum energy.

  21. Auxiliary matrices. Define $Hf(t) := -\sum_{\omega \in \Omega} \sum_{t' \in T:\, t' \neq t} e^{i\omega(t-t')} f(t')$. Restriction: • $\iota^*$ is the restriction map, $\iota^* f := f|_T$ • $\iota$ is the obvious embedding obtained by extending by zero outside of T • Identity: $\iota^*\iota$ is simply the identity operator on T. Set $P := (\iota - \frac{1}{|\Omega|}H)(\iota^*\iota - \frac{1}{|\Omega|}\iota^* H)^{-1} \iota^* \operatorname{sgn}(f)$. • Frequency support: P has Fourier transform supported in Ω. • Spatial interpolation: P obeys $\iota^* P = (\iota^*\iota - \frac{1}{|\Omega|}\iota^* H)(\iota^*\iota - \frac{1}{|\Omega|}\iota^* H)^{-1}\iota^*\operatorname{sgn}(f) = \iota^*\operatorname{sgn}(f)$, and so P agrees with sgn(f) on T. (A numerical sketch of this construction follows below.)
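
A numerical sketch of this construction (my own illustrative code, using the discrete convention $e^{2\pi i \omega (t-t')/N}$; the function name and arguments are assumptions): it builds the kernel, inverts $I_T - \frac{1}{|\Omega|}H_0$ on T, and evaluates P everywhere so the two dual-polynomial conditions can be checked directly.

```python
import numpy as np

def dual_polynomial(T, omega, sgn_f_T, N):
    """Minimum-energy dual polynomial
       P = (iota - H/|Omega|) (iota* iota - iota* H / |Omega|)^{-1} iota* sgn(f),
    evaluated on {0, ..., N-1}. T: support, omega: observed frequencies,
    sgn_f_T: signs of f listed in the same order as T."""
    T, omega = np.asarray(T), np.asarray(omega)
    diff = np.arange(N)[:, None] - T[None, :]                       # (N, |T|) time offsets
    K = np.exp(2j * np.pi * np.multiply.outer(omega, diff) / N).sum(axis=0)
    H = -K.copy()                                                   # (H y)(t) = -sum_{t' != t} K[t, t'] y(t')
    H[T, np.arange(len(T))] = 0.0                                   # drop the t = t' terms
    H0 = H[T, :]                                                    # iota* H, a |T| x |T| block
    y = np.linalg.solve(np.eye(len(T)) - H0 / len(omega), sgn_f_T)  # (I_T - H0/|Omega|)^{-1} sgn(f)|_T
    P = -(H @ y) / len(omega)                                       # the -H/|Omega| part on all of Z_N
    P[T] += y                                                       # plus the embedded iota*y part
    return P

# Check the two conditions on a random instance (sizes are illustrative).
rng = np.random.default_rng(3)
N = 256
T = rng.choice(N, 5, replace=False)
omega = rng.choice(N, 60, replace=False)
signs = rng.choice([-1.0, 1.0], len(T))
P = dual_polynomial(T, omega, signs, N)
off_T = np.setdiff1d(np.arange(N), T)
print(np.abs(P[T] - signs).max(), np.abs(P[off_T]).max())           # ~0 and < 1, typically
```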

  22. Hard Things. With $P := (\iota - \frac{1}{|\Omega|}H)(\iota^*\iota - \frac{1}{|\Omega|}\iota^* H)^{-1}\iota^*\operatorname{sgn}(f)$, one must show: • $(\iota^*\iota - \frac{1}{|\Omega|}\iota^* H)$ is invertible • $|P(t)| < 1$ for $t \notin T$.

  23. Invertibility. $(\iota^*\iota - \frac{1}{|\Omega|}\iota^* H) = I_T - \frac{1}{|\Omega|} H_0$, where $H_0(t,t') = 0$ if $t = t'$ and $H_0(t,t') = -\sum_{\omega \in \Omega} e^{i\omega(t-t')}$ if $t \neq t'$. Fact: $|H_0(t,t')| \sim \sqrt{|\Omega|}$, so $\|H_0\|^2 \leq \operatorname{Tr}(H_0^* H_0) = \sum_{t,t'} |H_0(t,t')|^2 \sim |T|^2 \cdot |\Omega|$. We want $\|H_0\| \leq |\Omega|$, and therefore $|T|^2 \cdot |\Omega| = O(|\Omega|^2)$, i.e. $|T| = O(\sqrt{|\Omega|})$. (An empirical check follows below.)
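
An empirical illustration of this heuristic (an illustrative sketch with arbitrary small sizes, not from the talk): draw random T and Ω with |T| of order $\sqrt{|\Omega|}$, build $H_0$, and compare its operator norm with the Frobenius bound and with $|\Omega|$.

```python
import numpy as np

rng = np.random.default_rng(2)
N, size_T, size_omega = 512, 6, 64            # |T| on the order of sqrt(|Omega|)
T = rng.choice(N, size_T, replace=False)
omega = rng.choice(N, size_omega, replace=False)

diff = T[:, None] - T[None, :]
H0 = -np.exp(2j * np.pi * np.multiply.outer(omega, diff) / N).sum(axis=0)
np.fill_diagonal(H0, 0.0)                     # H0(t, t) = 0 by definition

# Operator norm vs. the Frobenius bound (~ |T| sqrt(|Omega|)) vs. the target |Omega|
print(np.linalg.norm(H0, 2), np.sqrt((np.abs(H0) ** 2).sum()), size_omega)
```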

  24. Key Estimates. • Want to show the largest eigenvalue of $H_0$ (self-adjoint) is less than $|\Omega|$. • Take large powers of random matrices: $\operatorname{Tr}(H_0^{2n}) = \lambda_1^{2n} + \ldots + \lambda_{|T|}^{2n}$. • Key estimate: develop bounds on $E[\operatorname{Tr}(H_0^{2n})]$. • Key intermediate result: $\|H_0\| \leq \gamma \sqrt{\log |T|}\, \sqrt{|T|\,|\Omega|}$ with large probability. • A lot of combinatorics!

  25. Numerical Results. • Signal length N = 1024. • Randomly place $N_t$ spikes, observe $N_w$ random frequencies. • Measure the percentage recovered perfectly. • [Figure: phase diagram over $N_w$ and $N_t/N_w$; red = always recovered, blue = never recovered.] (A sketch of the experiment follows below.)
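
A sketch of one cell of such an experiment, reusing the hypothetical l1_reconstruct helper from the ℓ1 sketch above (sizes here are smaller than the talk's N = 1024 to keep the run short):

```python
import numpy as np

def success_rate(N, N_t, N_w, trials=20, tol=1e-6, seed=1):
    """Fraction of random instances with N_t spikes and N_w observed
    frequencies that min-l1 recovers exactly (up to tolerance tol)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        f = np.zeros(N)
        f[rng.choice(N, N_t, replace=False)] = rng.choice([-1.0, 1.0], N_t)
        omega = rng.choice(N, N_w, replace=False)
        fhat = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N) @ f
        hits += np.abs(l1_reconstruct(fhat, omega, N) - f).max() < tol
    return hits / trials

# Sweep a small grid of (N_w, N_t) pairs; the talk's figure does this at N = 1024.
for N_w in (30, 60, 90):
    print(N_w, [success_rate(256, N_t, N_w, trials=5) for N_t in (5, 15, 30)])
```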

  26. Other Phantoms, I. [Figure: original phantom and the classical reconstruction $g^*$.]

  27. [Figure: original phantom and the total variation reconstruction $g^*$; the TV reconstruction is exact.]

  28. Other Phantoms, II. [Figure: original phantom and the classical reconstruction $g^*$.]

  29. [Figure: original phantom and the total variation reconstruction $g^*$; the TV reconstruction is exact.]

  30. Scanlines. [Figure: a scanline of the original phantom; classical (black) and TV (red) reconstructions.]

  31. Summary. • Exact reconstruction • Tied to new uncertainty principles • Stability • Robustness • Optimality • Many extensions, e.g. arbitrary synthesis/measurement pairs. Contact: emmanuel@acm.caltech.edu
