Generalized Approximate Survey Propagation


1. Generalized Approximate Survey Propagation for High-dimensional Estimation
Luca Saglietti; Yue Lu, Harvard University; Carlo Lucibello, Bocconi University

2. Outline
• Generalized Linear Models (GLM)
• Real-valued phase retrieval
• Inference model
• Approximate message-passing
• Effective landscapes and competition
• Breaking the replica symmetry
• Changing the effective landscape
• Conclusions

3. Generalized Linear Models
Three ingredients:
• TRUE SIGNAL: $\mathbf{x}^{*} \in \mathbb{R}^{N}$, with i.i.d. entries $x_i^{*} \sim P_X$
• OBSERVATION MATRIX: $A \in \mathbb{R}^{M \times N}$, with i.i.d. entries $A_{\mu i} \sim \mathcal{N}(0, 1/N)$
• OBSERVED SIGNAL: $\mathbf{y} \in \mathbb{R}^{M}$, with $y_\mu \sim P_{\mathrm{out}}(y_\mu \mid \mathbf{a}_\mu \cdot \mathbf{x}^{*})$
High-dimensional limit: $M, N \to \infty$ with $\alpha = M/N$ of order one.
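
As a concrete reference point, here is a minimal Python sketch of such a GLM instance; the Gaussian prior, the $1/N$ variance convention for $A$, and the helper name `generate_glm` are illustrative assumptions, not code from the talk.

```python
# Minimal sketch of a GLM instance as defined above (assumed conventions:
# Gaussian prior, Gaussian sensing matrix with 1/N entry variance).
import numpy as np

def generate_glm(N=1000, alpha=2.0, channel=lambda z, rng: z, seed=0):
    """Draw (x_star, A, y) with M = alpha * N measurements."""
    rng = np.random.default_rng(seed)
    M = int(alpha * N)
    x_star = rng.standard_normal(N)               # true signal, P_X = N(0, 1)
    A = rng.standard_normal((M, N)) / np.sqrt(N)  # i.i.d. N(0, 1/N) entries
    y = channel(A @ x_star, rng)                  # observations through P_out
    return x_star, A, y

# Example channel: real-valued phase retrieval, y = |A x*| + small noise
x_star, A, y = generate_glm(
    channel=lambda z, rng: np.abs(z) + 0.01 * rng.standard_normal(z.shape))
```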

7. An example: Real-valued Phase Retrieval
$y_\mu = |\mathbf{a}_\mu \cdot \mathbf{x}^{*}|$ ( + noise )
Fun facts about phase retrieval:
• Physically meaningful!
• $\mathbf{x} \to -\mathbf{x}$ symmetry in the signal space.
• $\alpha > 1$ should provide enough information for a perfect reconstruction.
• Gradient descent struggles to reconstruct the signal until $\alpha$ is well above this threshold.
• Rigorous results show convexification of the landscape in a strongly oversampled regime.
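
To see the gradient-descent difficulty first-hand, one can run plain gradient descent on the squared-intensity loss; the choice of this particular loss is an assumption for illustration, since the slide does not specify which objective GD uses.

```python
# Plain gradient descent on a smooth phase-retrieval loss (noiseless case).
# The squared-intensity objective (1/4) * sum((z^2 - y^2)^2) is assumed here.
import numpy as np

rng = np.random.default_rng(1)
N, alpha = 500, 3.0
M = int(alpha * N)
x_star = rng.standard_normal(N)
A = rng.standard_normal((M, N)) / np.sqrt(N)
y2 = (A @ x_star) ** 2                  # observed intensities |a . x*|^2

x = rng.standard_normal(N)              # random (uninformed) initialization
lr = 0.01
for t in range(3000):
    z = A @ x
    grad = A.T @ ((z**2 - y2) * z)      # gradient of (1/4) * sum((z^2 - y2)^2)
    x -= lr * grad

# Overlap is measured up to the global sign symmetry x -> -x
m = abs(x @ x_star) / (np.linalg.norm(x) * np.linalg.norm(x_star))
print(f"overlap |m| = {m:.3f}")         # often stays far from 1 at moderate alpha
```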

13. Inference Model
GRAPHICAL MODEL: a posterior over $\mathbf{x}$ given $\mathbf{y}$, built from an assumed prior and an assumed output channel.
Sensible choice: MATCHED (assumed prior and channel equal the true ones) / MISMATCHED (they differ).
Bayesian-optimal estimator: the posterior mean (MMSE).
Maximum a posteriori: the posterior mode.
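
For reference, the standard forms of the posterior and of the two estimators named on the slide, with notation carried over from the GLM slides:

```latex
% Posterior measure and the two estimators (standard definitions).
P(\mathbf{x} \mid \mathbf{y})
  = \frac{1}{Z(\mathbf{y})} \prod_{i=1}^{N} P_X(x_i)
    \prod_{\mu=1}^{M} P_{\mathrm{out}}\!\left(y_\mu \mid \mathbf{a}_\mu \cdot \mathbf{x}\right)
\\
\hat{x}_i^{\,\mathrm{MMSE}} = \mathbb{E}\!\left[x_i \mid \mathbf{y}\right]
\qquad \text{(Bayesian-optimal estimator)}
\\
\hat{\mathbf{x}}^{\,\mathrm{MAP}} = \arg\max_{\mathbf{x}} P(\mathbf{x} \mid \mathbf{y})
\qquad \text{(maximum a posteriori)}
```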

16. Approximate Message-Passing
How do we obtain the estimator? Easy (if everything is i.i.d.):
BP → rBP → AMP (TAP): close the belief-propagation equations on single-site quantities under a Gaussian ansatz.
DEFINE 2 SCALAR INFERENCE CHANNELS:
• Input channel: encoding the prior dependence.
• Output channel: encoding the data dependence.
When do we get a good estimator?
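
A minimal sketch of the resulting GAMP iteration, for the simplest matched case (Gaussian prior, Gaussian noise channel); the scalar-variance simplification and the specific denoisers below are standard for this case but are filled in here, not taken from the slides.

```python
# Minimal GAMP sketch: Gaussian prior N(0,1), Gaussian channel y = z + noise.
import numpy as np

rng = np.random.default_rng(0)
N, alpha, Delta = 1000, 2.0, 0.01           # signal size, M/N, noise variance
M = int(alpha * N)

x_star = rng.standard_normal(N)             # true signal
A = rng.standard_normal((M, N)) / np.sqrt(N)
y = A @ x_star + np.sqrt(Delta) * rng.standard_normal(M)

x_hat = np.zeros(N)                         # posterior means
v = 1.0                                     # average posterior variance
g = np.zeros(M)                             # output-channel messages

for t in range(50):
    # Output step: Gaussian cavity estimate of z = A x, with Onsager correction
    omega = A @ x_hat - v * g
    g = (y - omega) / (Delta + v)           # g_out for the Gaussian channel
    # Input step: scalar Gaussian denoising of each coordinate x_i
    Sigma = (Delta + v) / alpha             # cavity variance: -d(g_out)/d(omega) = 1/(Delta+v)
    R = x_hat + Sigma * (A.T @ g)           # cavity mean
    x_hat = R / (1.0 + Sigma)               # posterior mean under the N(0,1) prior
    v = Sigma / (1.0 + Sigma)               # posterior variance

m = x_hat @ x_star / (np.linalg.norm(x_hat) * np.linalg.norm(x_star))
print(f"overlap m = {m:.3f}")
```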

22. Effective Landscapes and Competition (possible scenario)
[Figure: effective energy as a function of the overlap ρ with the signal; GD in this effective landscape, shown in four stages as the SNR grows.]
Stationary points of the effective landscape correspond to fixed points of the message passing.
Low-overlap and high-overlap minima compete as the SNR ($\alpha$) increases, producing three phases:
• IMPOSSIBLE: only the low-overlap state exists.
• HARD: a high-overlap state exists, but uninformed dynamics stay trapped at low overlap.
• EASY: the high-overlap state is reached from an uninformed start.
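
Schematically, the picture above corresponds to a state-evolution recursion for the overlap; the update function $F$ depends on the prior and the channel and is not spelled out on the slide.

```latex
% Schematic state evolution behind the phase diagram (standard picture).
m^{t+1} = F\!\left(m^{t}; \alpha\right), \qquad
m^{t} = \lim_{N \to \infty} \frac{1}{N}\, \mathbf{x}^{*} \cdot \hat{\mathbf{x}}^{t}
% EASY:       from m^0 close to 0 the recursion reaches the high-overlap fixed point.
% HARD:       a high-overlap fixed point exists, but m^0 close to 0 flows to low overlap.
% IMPOSSIBLE: no fixed point is better correlated with the signal than chance.
```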

26. Breaking the symmetry: GAMP vs GASP(s)
• GAMP: replica-symmetric (RS) assumption; RS input scalar channel.
• GASP(s): 1RSB assumption; 1RSB input scalar channel with SYMMETRY BREAKING PARAMETER s.
• Same computational complexity.
• (Potentially) more expensive element-wise operations.
• How to set the symmetry breaking parameter s?
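
Schematically, and with the parametrization assumed here rather than taken from the slide (the exact form is in the GASP paper), the 1RSB input channel reweights an extra level of Gaussian fluctuations by the parameter $s$:

```latex
% Schematic 1RSB input channel (parametrization is an assumption). The RS
% scalar channel
%   Z(B, A) = \int dx \, P_X(x) \, e^{B x - A x^2 / 2},
%   f(B, A) = \partial_B \log Z(B, A),
% is replaced by an s-reweighted average over state-to-state fluctuations:
\mathcal{Z}_s(B, A_1, A_0)
  = \int \mathcal{D}\xi \, \big[ Z(B + \sqrt{A_0}\,\xi,\; A_1) \big]^{s},
\qquad
f_s = \partial_B \log \mathcal{Z}_s .
% Here \xi models inter-state fluctuations, and s tilts the measure across
% clusters of solutions; this is what lets GASP(s) reshape the landscape.
```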

29. Message-Passing Equations: GAMP vs GASP(s)
[Slide shows the two sets of update equations side by side.]
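
For the GAMP column, the standard scalar-variance updates read as follows (this is the textbook form of GAMP, matching the Python sketch above; the GASP(s) column replaces the scalar channels with their 1RSB counterparts and is given in the paper):

```latex
% Standard scalar-variance GAMP updates; overlines denote averages over mu.
V^{t} = \frac{1}{N} \sum_{i} \sigma_i^{t}, \qquad
\omega_\mu^{t} = \sum_{i} A_{\mu i} \hat{x}_i^{t} - V^{t} g_\mu^{t-1}
\\
g_\mu^{t} = g_{\mathrm{out}}\!\left(y_\mu, \omega_\mu^{t}, V^{t}\right), \qquad
\left(\Sigma^{t}\right)^{-1} = -\,\alpha\, \overline{\partial_\omega g^{t}}
\\
R_i^{t} = \hat{x}_i^{t} + \Sigma^{t} \sum_{\mu} A_{\mu i} g_\mu^{t}, \qquad
\hat{x}_i^{t+1} = f_{\mathrm{in}}\!\left(R_i^{t}, \Sigma^{t}\right), \quad
\sigma_i^{t+1} = \Sigma^{t}\, \partial_R f_{\mathrm{in}}\!\left(R_i^{t}, \Sigma^{t}\right)
```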

30. Changing the Effective Landscape
Phase retrieval, noiseless case. No regularizer.
1RSB: explore minima at different energy/complexity levels.
[Figure: energy levels vs. α, with the RS GROUND STATE energy marked.]
RS: hard below a threshold $\alpha_{\mathrm{RS}}$.
1RSB: hard below a smaller threshold $\alpha_{\mathrm{1RSB}} < \alpha_{\mathrm{RS}}$.
Perfect recovery!

33. Conclusions
• In mismatched inference settings the RS assumption can be wrong.
• GASP can improve over GAMP, at the same O(N^2) complexity.
• A simple continuation strategy can push GASP down to the BO (Bayes-optimal) algorithmic threshold.
• For more details, please check my poster this evening!
THANK YOU FOR YOUR ATTENTION!
