  1. Introduction to Compressed Sensing
     Gitta Kutyniok (Institut für Mathematik, Technische Universität Berlin)
     Winter School on "Compressed Sensing", TU Berlin, December 3–5, 2015

  2. Outline
     1 Modern Data Processing: Data Deluge; Information Content of Data; Why do we need Compressed Sensing?
     2 Main Ideas of Compressed Sensing: Sparsity; Measurement Matrices; Recovery Algorithms
     3 Applications
     4 This Winter School

  3. The Age of Data
     Problem of the 21st century: We live in a digitized world. Slogan: "Big Data".
     New technologies produce/sense enormous amounts of data.
     Problems: storage, transmission, and analysis.
     "Big Data Research and Development Initiative" (Barack Obama, March 2012)

  4. Olympic Games 2012

  5. Better, Stronger, Faster!

  6. Accelerating Data Deluge
     Situation 2010:
       - 1250 billion gigabytes generated in 2010: # digital bits > # stars in the universe.
       - Growing by a factor of 10 every 5 years.
       - Available transmission bandwidth.
     Observations:
       - Total data generated > total storage.
       - Increases in generation rate >> increases in communication rate.

  7. What can we do...?

  8. Quote by Einstein
     "Not everything that can be counted counts, and not everything that counts can be counted." (Albert Einstein)

  9. An Applied Harmonic Analysis Viewpoint
     Exploit a carefully designed representation system $(\psi_\lambda)_{\lambda \in \Lambda} \subseteq \mathcal{H}$:
     $$\mathcal{H} \ni f \;\longrightarrow\; (\langle f, \psi_\lambda \rangle)_{\lambda \in \Lambda} \;\longrightarrow\; \sum_{\lambda \in \Lambda} \langle f, \psi_\lambda \rangle \, \psi_\lambda = f.$$
     Desiderata:
       - Special features are encoded in the "large" coefficients $|\langle f, \psi_\lambda \rangle|$.
       - Efficient representations: $f \approx \sum_{\lambda \in \Lambda_N} \langle f, \psi_\lambda \rangle \, \psi_\lambda$ with $\#(\Lambda_N)$ small.
     Goals:
       - Derive high compression by considering only the "large" coefficients.
       - Modification of the coefficients according to the task.
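
     A minimal numerical sketch of this "keep only the large coefficients" principle (not from the slides): the orthonormal DCT from SciPy stands in for a generic representation system $(\psi_\lambda)_{\lambda \in \Lambda}$, and the signal, sizes, and basis choice are illustrative assumptions.

```python
# Best k-term approximation in an orthonormal system.
# The DCT stands in for a generic representation system (psi_lambda);
# keeping only the "large" coefficients reproduces the signal.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
N, k = 1024, 20

# A signal that is compressible in the DCT domain: few large coefficients.
coeffs_true = np.zeros(N)
coeffs_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k) * 10
f = idct(coeffs_true, norm="ortho")

# Analysis: c_lambda = <f, psi_lambda>; keep the k largest in magnitude.
c = dct(f, norm="ortho")
keep = np.argsort(np.abs(c))[-k:]          # index set Lambda_N
c_k = np.zeros(N)
c_k[keep] = c[keep]

# Synthesis: f approx sum over Lambda_N of c_lambda * psi_lambda.
f_k = idct(c_k, norm="ortho")
print("relative error of k-term approximation:",
      np.linalg.norm(f - f_k) / np.linalg.norm(f))
```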

  10. Review of Wavelets for $L^2(\mathbb{R}^2)$
     Definition (1D): Let $\phi \in L^2(\mathbb{R})$ be a scaling function and $\psi \in L^2(\mathbb{R})$ be a wavelet. Then the associated wavelet system is defined by
     $$\{ \phi(x - m) : m \in \mathbb{Z} \} \cup \{ 2^{j/2} \psi(2^j x - m) : j \ge 0,\, m \in \mathbb{Z} \}.$$
     Definition (2D): A wavelet system is defined by
     $$\{ \phi^{(1)}(x - m) : m \in \mathbb{Z}^2 \} \cup \{ 2^{j} \psi^{(i)}(2^j x - m) : j \ge 0,\, m \in \mathbb{Z}^2,\, i = 1, 2, 3 \},$$
     where $\phi^{(1)}(x) = \phi(x_1)\phi(x_2)$, $\psi^{(1)}(x) = \phi(x_1)\psi(x_2)$, $\psi^{(2)}(x) = \psi(x_1)\phi(x_2)$, and $\psi^{(3)}(x) = \psi(x_1)\psi(x_2)$.
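
     As a concrete illustration of the tensor-product construction (assuming the Haar case with discrete filters, which is not spelled out on the slides), the three 2D generators can be formed directly from the 1D scaling and wavelet filters:

```python
# Illustrative Haar sketch: the 2D generators are tensor products of the
# 1D scaling function phi and wavelet psi, exactly as in the definition.
import numpy as np

phi = np.array([1.0, 1.0]) / np.sqrt(2)   # 1D Haar scaling filter
psi = np.array([1.0, -1.0]) / np.sqrt(2)  # 1D Haar wavelet filter

phi_2d  = np.outer(phi, phi)   # phi^(1)(x) = phi(x1) phi(x2)
psi1_2d = np.outer(phi, psi)   # psi^(1)(x) = phi(x1) psi(x2)  (horizontal detail)
psi2_2d = np.outer(psi, phi)   # psi^(2)(x) = psi(x1) phi(x2)  (vertical detail)
psi3_2d = np.outer(psi, psi)   # psi^(3)(x) = psi(x1) psi(x2)  (diagonal detail)

print(psi3_2d)
```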

  11. The World is Compressible!
     An image with $N$ pixels has only $k \ll N$ large wavelet coefficients; a wideband signal with $N$ samples has only $k \ll N$ large Gabor coefficients.

  12. JPEG2000
     Compression to 1/20 and compression to 1/200 (image comparison).

  13. The New Paradigm for Data Processing: Sparsity!
     Sparse Signals: A signal $x \in \mathbb{R}^N$ is $k$-sparse if $\|x\|_0 = \#\{\text{non-zero coefficients}\} \le k$.
       Model $\Sigma_k$: union of $k$-dimensional subspaces.
     Compressible Signals: A signal $x \in \mathbb{R}^N$ is compressible if the sorted coefficients $|x_i|$ have rapid (power-law) decay.
       Model: $\ell_p$ ball with $p \le 1$.
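
     A short sketch of these two signal models, under illustrative choices of $N$, $k$, and $p$ that are not prescribed by the slides:

```python
# The two signal models: exact k-sparsity (at most k nonzero entries)
# and compressibility (power-law decay of sorted magnitudes, an l_p model).
import numpy as np

rng = np.random.default_rng(1)
N, k = 1000, 10

# k-sparse signal: ||x||_0 <= k
x_sparse = np.zeros(N)
x_sparse[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
print("||x||_0 =", np.count_nonzero(x_sparse))

# Compressible signal: sorted coefficients decay like i^(-1/p), p <= 1
p = 0.5
x_comp = rng.permutation(np.arange(1, N + 1) ** (-1.0 / p))
sorted_mag = np.sort(np.abs(x_comp))[::-1]
# Best k-term approximation error is small although x_comp is not sparse:
print("||x - x_k||_2 =", np.linalg.norm(sorted_mag[k:]))
```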

  14. "Not everything that can be counted counts..." (Einstein)
     Classical Approach: sense/sample $N$ values of $x$, compress down to $k$ values, reconstruct $\hat{x}$ of length $N$.
       - Sensing/Sampling: linear processing.
       - Compression: non-linear processing.
     Why acquire $N$ samples only to discard all but $k$ pieces of data?
     Fundamental Idea: Directly acquire "compressed data", i.e., the information content, by taking more universal measurements:
       Compressed Sensing: $x \in \mathbb{R}^N \longrightarrow$ $n$ measurements with $k < n \ll N$ $\longrightarrow$ reconstruction $\hat{x}$.
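
     A sketch of the compressed acquisition step under the standard Gaussian-measurement assumption (the matrix choice and the sizes are illustrative, not taken from the slides):

```python
# Compressed acquisition: instead of N samples we take n << N
# non-adaptive linear measurements y = A x with a random matrix A.
import numpy as np

rng = np.random.default_rng(2)
N, n, k = 1000, 100, 10

x = np.zeros(N)
x[rng.choice(N, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((n, N)) / np.sqrt(n)   # Gaussian measurement matrix
y = A @ x                                      # n measurements instead of N samples
print(y.shape)                                 # (100,)
```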

  15. Compressed Sensing enters the Stage
     'Initial' Papers:
       - E. Candès, J. Romberg, T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Appl. Math. 59 (2006), 1207–1223.
       - D. Donoho, Compressed sensing, IEEE Trans. Inform. Theory 52 (2006), 1289–1306.
     Avalanche of Results (dsp.rice.edu/cs): approx. 2000 papers and 150 conferences so far.
     Relation to the following areas: applied harmonic analysis, applied linear algebra, convex optimization, geometric functional analysis, random matrix theory.
     Application areas: radar, astronomy, biology, seismology, signal processing, and more.

  16. What is Compressed Sensing...?

  17. Compressed Sensing Problem, I
     General Procedure:
       - Signal $x \in \mathbb{R}^N$; $x$ is $k$-sparse.
       - Take $n \ll N$ linear, non-adaptive measurements using a matrix $A$:  $y = Ax$.
     Viewpoints: efficient sampling, dimension reduction, efficient representation.

  18. Compressed Sensing Problem, II
     $y = Ax$
     Fundamental Questions:
       - What are suitable signal models?
       - When and with which accuracy can the signal be recovered?
       - What are suitable sensing matrices?
       - How can the signal be algorithmically recovered?
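
     One standard answer to the last question is $\ell_1$ minimization (basis pursuit): $\min \|x\|_1$ subject to $Ax = y$. A rough sketch via its linear-programming reformulation, using SciPy's generic LP solver rather than a dedicated compressed sensing solver (sizes are illustrative assumptions):

```python
# Basis pursuit:  min ||x||_1  subject to  A x = y,
# rewritten as an LP with x = u - v, u, v >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
N, n, k = 200, 60, 5

x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((n, N)) / np.sqrt(n)
y = A @ x_true

c = np.ones(2 * N)                     # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])              # constraint: A u - A v = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:N] - res.x[N:]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

     With the sizes above, the recovered $\hat{x}$ typically matches $x$ up to numerical precision, in line with the standard guarantees for Gaussian measurement matrices.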

  19. Fundamental Theorem of Sparse Solutions
     Definition: Let $A$ be an $n \times N$ matrix. Then $\operatorname{spark}(A)$ denotes the minimal number of linearly dependent columns; $\operatorname{spark}(A) \in [2, n+1]$.
     Lemma: Let $A$ be an $n \times N$ matrix, and let $k \in \mathbb{N}$. Then the following conditions are equivalent:
       (i) For every $y \in \mathbb{R}^n$, there exists at most one $x \in \mathbb{R}^N$ with $\|x\|_0 \le k$ such that $y = Ax$.
       (ii) $k < \operatorname{spark}(A)/2$.
     Sketch of Proof: Assume $y = Ax_0 = Ax_1$ with $\|x_0\|_0, \|x_1\|_0 \le k$. Then $x_0 - x_1 \in \mathcal{N}(A)$ and $\|x_0 - x_1\|_0 \le 2k < \operatorname{spark}(A)$, so no such non-zero vector can lie in the null space, i.e., $x_0 = x_1$.
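
     A brute-force sketch of the spark criterion on a toy matrix (computing the spark is combinatorial, so this is only feasible for very small sizes; the example matrix is an illustrative assumption):

```python
# Brute-force spark computation and the uniqueness bound k < spark(A)/2.
import numpy as np
from itertools import combinations

def spark(A: np.ndarray) -> int:
    """Smallest number of linearly dependent columns of A."""
    n, N = A.shape
    for size in range(1, N + 1):
        for cols in combinations(range(N), size):
            if np.linalg.matrix_rank(A[:, cols]) < size:
                return size
    return N + 1  # all columns are linearly independent

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 8))
s = spark(A)          # a generic random 4 x 8 matrix has spark(A) = n + 1 = 5
print("spark(A) =", s)
print("unique k-sparse solutions guaranteed for k <", s / 2)
```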
