Support Recovery for Orthogonal Matching Pursuit

Support Recovery for Orthogonal Matching Pursuit: Upper and Lower Bounds. Raghav Somani¹, Chirag Gupta², Prateek Jain¹ and Praneeth Netrapalli¹. ¹Microsoft Research Lab - India; ²Machine Learning Department, Carnegie Mellon University.


  1. Support Recovery for Orthogonal Matching Pursuit: Upper and Lower Bounds. Raghav Somani¹, Chirag Gupta², Prateek Jain¹ and Praneeth Netrapalli¹. ¹Microsoft Research Lab - India; ²Machine Learning Department, Carnegie Mellon University. December 6, 2018.

  2. Sparse Linear Regression (SLR). Problem: given A and y, find x̄ = argmin_{‖x‖₀ ≤ s∗} ‖Ax − y‖₂². Unconditionally, this problem is NP-hard; it becomes tractable under the assumption of Restricted Strong Convexity (RSC). The fundamental quantity capturing hardness: in standard optimization, the condition number κ = smoothness / strong convexity; in sparse optimization, the restricted condition number κ̃ = restricted smoothness / restricted strong convexity. (A brute-force illustration of κ̃ follows after this slide.)
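To make the restricted condition number concrete, here is a minimal sketch (our illustration, not from the talk) that computes κ̃ at sparsity level s by brute force over all size-s column subsets. This is only feasible for tiny p; analyses instead bound these quantities.

```python
# Brute-force restricted condition number: ratio of the largest to the
# smallest eigenvalue of the restricted Hessians (1/n) A_S^T A_S over all
# column subsets S of size s. Illustrative only; exponential in p.
import itertools
import numpy as np

def restricted_condition_number(A, s):
    n, p = A.shape
    smoothness, strong_convexity = 0.0, np.inf
    for S in itertools.combinations(range(p), s):
        A_S = A[:, list(S)]
        eigs = np.linalg.eigvalsh(A_S.T @ A_S / n)   # ascending eigenvalues
        smoothness = max(smoothness, eigs[-1])       # restricted smoothness
        strong_convexity = min(strong_convexity, eigs[0])  # restricted strong convexity
    return smoothness / strong_convexity

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 8))
print(restricted_condition_number(A, s=2))  # kappa_tilde at sparsity 2
```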


  3. Setup and Goals. We work under the model y = Ax̄ + η, where y is the vector of observations, A is the measurement matrix, x̄ is an s∗-sparse vector, and η is noise. Goals of SLR: (1) bound the generalization error (excess risk) G(x) := (1/n) ‖A(x − x̄)‖₂²; (2) support recovery: recover the support of x̄. We study SLR under the RSC assumption for OMP. (A data-model sketch follows after this slide.)
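The sketch below (our illustration; the problem sizes and noise level are arbitrary choices, not values from the talk) instantiates the model y = Ax̄ + η and the excess risk G(x):

```python
# SLR data model and excess risk. All dimensions below are assumptions
# made for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, p, s_star, sigma = 200, 50, 5, 0.1

A = rng.standard_normal((n, p))                   # measurement matrix
true_support = rng.choice(p, size=s_star, replace=False)
x_bar = np.zeros(p)                               # s*-sparse ground truth
x_bar[true_support] = rng.standard_normal(s_star)
eta = sigma * rng.standard_normal(n)              # noise
y = A @ x_bar + eta                               # observations

def excess_risk(x):
    """G(x) = (1/n) * ||A (x - x_bar)||_2^2."""
    return float(np.sum((A @ (x - x_bar)) ** 2) / n)

print(excess_risk(np.zeros(p)))  # risk of the all-zeros estimate
```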


  4. Orthogonal Matching Pursuit (OMP). An incremental greedy algorithm: at the (k+1)-th iteration a new coordinate is selected greedily and the iterate x̂ₖ is updated to x̂ₖ₊₁. It is popular, easy to implement, and widely studied in the literature. (A minimal implementation sketch follows after this slide.)
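A minimal sketch of the textbook OMP algorithm the slide describes (greedy coordinate selection followed by an orthogonal least-squares re-fit); this is our illustration, not code from the authors:

```python
# Orthogonal Matching Pursuit: at each iteration, add the column of A most
# correlated with the current residual, then re-fit by least squares on the
# support selected so far.
import numpy as np

def omp(A, y, num_iters):
    n, p = A.shape
    support = []
    x = np.zeros(p)
    residual = y.copy()
    for _ in range(num_iters):
        correlations = np.abs(A.T @ residual)
        correlations[support] = -np.inf          # never reselect a coordinate
        support.append(int(np.argmax(correlations)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(p)
        x[support] = coef                        # orthogonal re-fit step
        residual = y - A @ x
    return x
```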

  5. Known results and our contribution. Generalization bounds: the known upper bound is ∝ (1/n) σ²s∗κ̃², against a lower bound ∝ (1/n) σ²s∗κ̃; our upper bound is ∝ (1/n) σ²s∗κ̃ log κ̃, matching the lower bound ∝ (1/n) σ²s∗κ̃ up to the log factor. We give unconditional lower bounds for OMP, and support recovery guarantees together with lower bounds. Support expansion: known ∝ s∗κ̃²; ours ∝ s∗κ̃ log κ̃. (A worked numerical comparison follows after this slide.)
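As a worked example of the gap (with illustrative numbers, not values from the talk): for s∗ = 10 and κ̃ = 100, the known support expansion s∗κ̃² is 10 · 100² = 100,000 selected coordinates, while ours, s∗κ̃ log κ̃, is about 10 · 100 · log(100) ≈ 4,605; the improvement grows with κ̃.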


  6. Key idea. Define f(x) = ‖Ax − y‖₂². If any part of the true support is unrecovered, each OMP iteration makes a large additive decrease in f. Since f(x) ≥ 0, such decreases cannot continue indefinitely, so support recovery must happen soon. Recovery with a small support, in turn, implies small generalization error. (An illustration follows after this slide.)
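The sketch below (ours) illustrates this on the synthetic instance and the omp() function from the earlier sketches: f(x̂ₖ) drops sharply until the true support is covered, and flattens afterwards.

```python
# Track f(x) = ||Ax - y||_2^2 across OMP iterations. Reuses A, y,
# true_support, s_star and omp() from the sketches above.
import numpy as np

for k in range(1, 3 * s_star + 1):
    x_k = omp(A, y, k)
    f_k = float(np.sum((A @ x_k - y) ** 2))
    covered = set(true_support.tolist()) <= set(np.flatnonzero(x_k).tolist())
    print(f"k={k:2d}  f(x_k)={f_k:10.4f}  true support covered: {covered}")
```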


  7. Thank you!
