PROMP: PRe-projected Orthogonal Matching Pursuit


  1. PROMP: PRe-projected Orthogonal Matching Pursuit
     Axel Flinth
     Winter School on Compressed Sensing (WiCoS 2015)
     Technische Universität Berlin, 5.12.2015

  5. Orthogonal Matching Pursuit
     Algorithm OMP: Initialize r = b, S = ∅, then iteratively
       i* = argmax_i |⟨a_i, r⟩|,
       S ← S ∪ {i*},   r ← Π_{ran(A_{S∪{i*}})^⊥} r.
     Very easy to implement, very fast for small sparsities.
     Lower recovery probabilities than e.g. Basis Pursuit; also, for moderate sparsities, OMP gets slow.
     Additionally, x0 ∈ Z^d ⇝ can a modification enable higher speed / a larger recovery probability?
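The OMP iteration above can be sketched in a few lines of NumPy. This is an illustration, not the speaker's code: the function name is mine, and the projection Π_{ran(A_S)^⊥} is realized implicitly via least squares.

```python
import numpy as np

def omp(A, b, n_iter, tol=1e-10):
    """Greedy OMP sketch: pick the column most correlated with the
    residual, then project b off the span of the chosen columns
    (done here via least squares instead of an explicit projector)."""
    m, d = A.shape
    S, r = [], b.copy()
    for _ in range(n_iter):
        if np.linalg.norm(r) < tol:
            break
        S.append(int(np.argmax(np.abs(A.T @ r))))   # i* = argmax_i |<a_i, r>|
        coef, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
        r = b - A[:, S] @ coef                      # residual off ran(A_S)
    x = np.zeros(d)
    coef, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
    x[S] = coef
    return x, set(S)

# Toy example (dimensions and signal are my choices):
rng = np.random.default_rng(0)
m, d, s = 50, 100, 4
A = rng.standard_normal((m, d))
x0 = np.zeros(d)
x0[[5, 17, 60, 88]] = [1.0, -2.0, 3.0, 1.0]       # sparse integer signal
x_hat, S = omp(A, A @ x0, n_iter=s)
```

For s small relative to m, OMP typically recovers x0 exactly in s iterations on such Gaussian instances, matching the "very fast for small sparsities" remark.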

  9. Why is x0 ∈ Z^d + Sparse Interesting?
     [Figure: several users, each with a bit-sequence such as 0011010011010]
     A user transmits a bit-sequence ⇝ x0 ∈ Z^d.
     Random scattering ⇝ b = Ax0, A Gaussian.
     At each moment, only a few users are transmitting ⇝ x0 sparse.
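A toy instance of this measurement model. The block assignment of coordinates to users and the {0, 1} bit encoding are my illustrative assumptions, not details from the talk:

```python
import numpy as np

# Each user owns a block of 10 coordinates of x0; active users write
# their bit-sequence into their block, and the channel mixes everything
# with a Gaussian matrix (random scattering).
rng = np.random.default_rng(1)
d, m = 60, 30
x0 = np.zeros(d, dtype=int)
for user in (1, 4):                          # few users transmit at a time
    bits = rng.integers(0, 2, size=10)       # the user's bit-sequence
    x0[10 * user: 10 * (user + 1)] = bits
A = rng.standard_normal((m, d))              # Gaussian channel matrix
b = A @ x0.astype(float)                     # observed measurement b = A x0
```

Since only the active users' blocks are nonzero, x0 is both integer-valued and sparse, exactly the structure PROMP exploits.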

  13. Let us go Back to the Roots.
      Classical solution to the inverse problem Ax = b: x̂ = A⁺b (A⁺ the Moore-Penrose inverse).
      This corresponds to ℓ2-minimization: min ‖x‖₂ subject to Ax = b.
      Fast, easy to implement and to analyze.
      But: bad for sparse recovery.
      Still, if x̂(i) is large, x0(i) being large seems more probable ...
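A small NumPy illustration of this point (the toy dimensions and signal are my choices):

```python
import numpy as np

rng = np.random.default_rng(0)
m, d = 50, 120
A = rng.standard_normal((m, d))
x0 = np.zeros(d)
x0[[3, 40, 77]] = [2.0, 1.0, -3.0]       # sparse ground truth
b = A @ x0
x_hat = np.linalg.pinv(A) @ b            # solves min ||x||_2 s.t. Ax = b
# x_hat solves Ax = b exactly but is dense, so it is useless as a
# sparse reconstruction; still, its entries concentrate around
# (m/d) * x0(i), so the large entries of x_hat point toward supp x0.
```

This "dense but informative" behavior is precisely what the thresholding set S_ϑ on the next slides turns into a support estimate.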

  15. PROMP
      Define S_ϑ = { i : |x̂(i)| ≥ (m/d)·ϑ }.
      The last slide suggests S_ϑ ≈ supp x0. Therefore, the following modification of OMP seems reasonable.
      Algorithm PROMP: Initialize r = Π_{ran(A_{S_ϑ})^⊥} b, S = S_ϑ, then iteratively
        i* = argmax_i |⟨a_i, r⟩|,
        S ← S ∪ {i*},   r ← Π_{ran(A_{S∪{i*}})^⊥} r.
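A minimal sketch of PROMP along these lines. The threshold ϑ = 0.5, the stopping rule, and the least-squares realization of the projections are my choices, not taken from the talk:

```python
import numpy as np

def promp(A, b, theta=0.5, n_iter=20, tol=1e-10):
    """PROMP sketch: threshold the minimum-norm solution to guess part
    of the support, then run plain OMP iterations from that start."""
    m, d = A.shape
    x_hat = np.linalg.pinv(A) @ b                   # l2-minimal solution
    S = set(np.flatnonzero(np.abs(x_hat) >= (m / d) * theta))
    for _ in range(n_iter):
        cols = sorted(S)
        if cols:
            coef, *_ = np.linalg.lstsq(A[:, cols], b, rcond=None)
            r = b - A[:, cols] @ coef               # r = Pi_{ran(A_S)^perp} b
        else:
            r = b
        if np.linalg.norm(r) < tol:
            break
        S.add(int(np.argmax(np.abs(A.T @ r))))      # i* = argmax |<a_i, r>|
    x = np.zeros(d)
    cols = sorted(S)
    coef, *_ = np.linalg.lstsq(A[:, cols], b, rcond=None)
    x[cols] = coef
    return x, S
```

Compared to plain OMP, the iteration only has to find the few support indices that the pre-projection missed, which is where the speed/recovery gains are supposed to come from.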

  16. Aspects That we Have to Analyze
      Is S_ϑ really ≈ supp x0?
      How does the initialization affect OMP?

  19. Support Approximation
      Is S_ϑ really ≈ supp x0? Recall S_ϑ = { i : |x̂(i)| ≥ (m/d)·ϑ }.
      Theorem (F., 2015). Let x0 ∈ Z^d, let A ∈ R^{m×d} be Gaussian and let η > 0 be arbitrary. Then there exists a threshold m*(x0, d, η) so that for m ≥ m*(x0, d, η):
        i ∈ supp x0 ⇒ P(i ∈ S_ϑ) ≥ 1 − η,
        i ∉ supp x0 ⇒ P(i ∉ S_ϑ) ≥ 1 − η.
      I.e. for m ≥ m*, S_ϑ is a good approximation of supp x0.
      Proof sketch: The solution of the ℓ2-minimization is given by x̂ = Π_{(ker A)^⊥} x0.

  22. Proof Sketch
      Since A is Gaussian, (ker A)^⊥ ∼ U(G(d, m)), so
        x̂(i) = ⟨Π_L x0, e_i⟩,   L ∼ U(G(d, m)).
      The map L ↦ ⟨Π_L x0, e_i⟩ is Lipschitz.
      Measure concentration: if X ∼ U(G(d, m)) and f is Lipschitz, then, with high probability, f(X) ≈ E(f(X)). Larger m ↔ sharper concentration.

  27. Proof Sketch
      We need to calculate E(⟨Π_L x0, e_i⟩).
      Lemma: We have
        Π_L x0 ∼ R² x0 + R √(1 − R²) ‖x0‖₂ Q_{x0}[θ, 0],
      where R ∼ ‖Π_L (x0/‖x0‖₂)‖₂, θ ∼ U(S^{d−2}) is independent of R, and Q_{x0} is a fixed orthogonal matrix with Q_{x0} e_d = x0/‖x0‖₂.
      Therefore
        E(⟨Π_L x0, e_i⟩) = E(⟨R² x0, e_i⟩) + E(R √(1 − R²) ‖x0‖₂ ⟨Q_{x0}[θ, 0], e_i⟩)
                         = E(R²) x0(i) + E(R √(1 − R²)) ‖x0‖₂ E(⟨[θ, 0], Q*_{x0} e_i⟩) = (m/d) x0(i),
      since E(R²) = m/d and E(θ) = 0.
      I.e. for i ∈ supp x0 (resp. i ∉ supp x0), |x̂(i)| will probably be larger (resp. smaller) than (m/d)·ϑ: on the support |x0(i)| ≥ 1, so (m/d)|x0(i)| ≥ (m/d)·ϑ for ϑ ≤ 1. ✓
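The identity E(⟨Π_L x0, e_i⟩) = (m/d)·x0(i) can be checked numerically, using that x̂ = A⁺Ax0 realizes Π_L x0 for L = (ker A)^⊥ (the toy setup is mine):

```python
import numpy as np

# Average the minimum-norm solution x_hat = A^+ A x0 over many
# Gaussian draws of A; the mean should approach (m/d) * x0.
rng = np.random.default_rng(0)
d, m, trials = 40, 20, 2000
x0 = np.zeros(d)
x0[[0, 1]] = [3.0, -2.0]
acc = np.zeros(d)
for _ in range(trials):
    A = rng.standard_normal((m, d))
    acc += np.linalg.pinv(A) @ (A @ x0)
mean_x_hat = acc / trials                 # approximates (m/d) * x0 = 0.5 * x0
```

Here m/d = 0.5, so the empirical mean sits near 1.5 and −1.0 on the support and near 0 elsewhere, as the computation predicts.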

  28. Aspects That we Have to Analyze
      Is S_ϑ really ≈ supp x0? ✓
      How does the initialization affect OMP?

  30. Initialization Effect
      Classical condition for OMP to reconstruct s-sparse x0, via the coherence of A:
        µ(A) = max_{i≠j} |⟨a_i, a_j⟩|   (A has normalized columns).
      If µ(A)(2s − 1) < 1, OMP will recover s-sparse signals.
      Define the S-coherence of A:
        µ_S(A) := max_{i≠j} |⟨Π_{ran(A_S)^⊥} a_i, Π_{ran(A_S)^⊥} a_j⟩|.
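Both quantities are straightforward to compute. The `s_coherence` function below transcribes the definition as written on the slide, with no renormalization after projecting; the function names are mine:

```python
import numpy as np

def coherence(A):
    """mu(A) = max_{i != j} |<a_i, a_j>| after normalizing the columns."""
    An = A / np.linalg.norm(A, axis=0)
    G = np.abs(An.T @ An)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

def s_coherence(A, S):
    """mu_S(A): max_{i != j} |<P a_i, P a_j>| for P = Pi_{ran(A_S)^perp},
    transcribing the definition above (no renormalization applied)."""
    Q, _ = np.linalg.qr(A[:, list(S)])
    B = A - Q @ (Q.T @ A)           # project every column off ran(A_S)
    G = np.abs(B.T @ B)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

# The classical guarantee mu(A)(2s - 1) < 1 caps the provable sparsity:
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
mu = coherence(A)
s_guaranteed = int(np.floor((1 + 1 / mu) / 2))   # largest s with mu(A)(2s-1) < 1
```

Projecting the already-selected columns away tends to reduce the residual inner products, which is the intuition for why an initialization with S = S_ϑ can only help the subsequent OMP steps.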
