Quantum Chebyshev's Inequality and Applications

  1. Quantum Chebyshev's Inequality and Applications. Yassine Hamoudi, Frédéric Magniez. IRIF, Université Paris Diderot, CNRS. QUDATA 2019. arXiv: 1807.06456

  2. Buffon's needle. A needle dropped randomly on a floor with equally spaced parallel lines will cross one of the lines with probability 2/π. Buffon, G., Essai d'arithmétique morale, 1777.
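
A minimal Monte Carlo sketch of this experiment in Python, assuming (as in the classic setting) that the needle length equals the line spacing, which is the case where the crossing probability is 2/π:

```python
import math
import random

def buffon_crossing_fraction(num_drops=1_000_000, needle_len=1.0, spacing=1.0):
    """Estimate the crossing probability by dropping random needles."""
    crossings = 0
    for _ in range(num_drops):
        # Distance from the needle's center to the nearest line, and its angle.
        dist = random.uniform(0.0, spacing / 2)
        angle = random.uniform(0.0, math.pi / 2)
        if dist <= (needle_len / 2) * math.sin(angle):
            crossings += 1
    return crossings / num_drops

print(buffon_crossing_fraction(), 2 / math.pi)  # empirical fraction vs. 2/pi ~ 0.6366
```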

  3. Monte Carlo algorithms: use repeated random sampling and statistical analysis to estimate parameters of interest.

  4. Monte Carlo algorithms: use repeated random sampling and statistical analysis to estimate parameters of interest. Empirical mean: 1/ Repeat the experiment n times: n i.i.d. samples x_1, …, x_n ~ X. 2/ Output: (x_1 + … + x_n)/n.

  5. Monte Carlo algorithms: use repeated random sampling and statistical analysis to estimate parameters of interest. Empirical mean: 1/ Repeat the experiment n times: n i.i.d. samples x_1, …, x_n ~ X. 2/ Output: (x_1 + … + x_n)/n. Law of large numbers: (x_1 + … + x_n)/n → E(X) as n → ∞.
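
A short Python illustration of the two steps above, with a hypothetical choice of X (an exponential variable, so E(X) = 2 is known exactly):

```python
import random

def empirical_mean(sample, n):
    """1/ draw n i.i.d. samples x_1, ..., x_n, 2/ output their average."""
    xs = [sample() for _ in range(n)]
    return sum(xs) / n

def sample_X():
    # Hypothetical X: Exponential with rate 1/2, so E(X) = 2.
    return random.expovariate(0.5)

for n in (10, 1_000, 100_000):
    print(n, empirical_mean(sample_X, n))  # approaches E(X) = 2 as n grows
```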

  6. Empirical mean: μ̃ = (x_1 + … + x_n)/n with x_1, …, x_n ∼ X. How fast does it converge to E(X)?

  7. Empirical mean: μ̃ = (x_1 + … + x_n)/n with x_1, …, x_n ∼ X. How fast does it converge to E(X)? Chebyshev's Inequality: multiplicative error 0 < ε < 1. Objective: |μ̃ − E(X)| ≤ ε·E(X) with high probability (E(X), Var(X) finite and ≠ 0).

  8. Empirical mean: μ̃ = (x_1 + … + x_n)/n with x_1, …, x_n ∼ X. How fast does it converge to E(X)? Chebyshev's Inequality: multiplicative error 0 < ε < 1. Objective: |μ̃ − E(X)| ≤ ε·E(X) with high probability (E(X), Var(X) finite and ≠ 0). Number of samples needed: O(E(X²)/(ε²·E(X)²)) (in fact O(Var(X)/(ε²·E(X)²)) = O((1/ε²)·(E(X²)/E(X)² − 1))).

  9. Empirical mean: μ̃ = (x_1 + … + x_n)/n with x_1, …, x_n ∼ X. How fast does it converge to E(X)? Chebyshev's Inequality: multiplicative error 0 < ε < 1. Objective: |μ̃ − E(X)| ≤ ε·E(X) with high probability (E(X), Var(X) finite and ≠ 0). Number of samples needed: O(E(X²)/(ε²·E(X)²)), where E(X²)/E(X)² is the relative second moment (in fact O(Var(X)/(ε²·E(X)²)) = O((1/ε²)·(E(X²)/E(X)² − 1))).

  10. Empirical mean: μ̃ = (x_1 + … + x_n)/n with x_1, …, x_n ∼ X. How fast does it converge to E(X)? Chebyshev's Inequality: multiplicative error 0 < ε < 1. Objective: |μ̃ − E(X)| ≤ ε·E(X) with high probability (E(X), Var(X) finite and ≠ 0). Number of samples needed: O(E(X²)/(ε²·E(X)²)), where E(X²)/E(X)² is the relative second moment (in fact O(Var(X)/(ε²·E(X)²)) = O((1/ε²)·(E(X²)/E(X)² − 1))). In practice: given an upper bound Δ² ≥ E(X²)/E(X)², take n = Ω(Δ²/ε²) samples.
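
A small Python sketch of this recipe on a hypothetical two-point distribution where E(X) and E(X²) are known exactly, so Δ² can be set to the true relative second moment (with an extra constant factor on n to make the guarantee comfortable):

```python
import random

# Hypothetical X: value 10 with probability 0.1, value 1 otherwise.
outcomes, probs = [1.0, 10.0], [0.9, 0.1]
EX  = sum(x * p for x, p in zip(outcomes, probs))      # 1.9
EX2 = sum(x * x * p for x, p in zip(outcomes, probs))  # 10.9

eps    = 0.05
delta2 = EX2 / EX**2                 # upper bound on the relative second moment (tight here)
n      = int(10 * delta2 / eps**2)   # a constant times Delta^2/eps^2; Chebyshev bounds the
                                     # failure probability by Var(X)/(n * eps^2 * E(X)^2)

def empirical_mean(n):
    xs = random.choices(outcomes, weights=probs, k=n)
    return sum(xs) / n

trials = 200
good = sum(abs(empirical_mean(n) - EX) <= eps * EX for _ in range(trials))
print(f"n = {n}: relative error <= {eps} in {good}/{trials} trials")
```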

  11. Applications. Counting with Markov chain Monte Carlo methods: counting vs. sampling [Jerrum, Sinclair'96] [Štefankovič et al.'09], volume of convex bodies [Dyer, Frieze'91], permanent [Jerrum, Sinclair, Vigoda'04]. Data stream model: frequency moments, collision probability [Alon, Matias, Szegedy'99] [Monemizadeh, Woodruff] [Andoni et al.'11] [Crouch et al.'16]. Testing properties of distributions: closeness [Goldreich, Ron'11] [Batu et al.'13] [Chan et al.'14], conditional independence [Canonne et al.'18]. Estimating graph parameters: number of connected components, minimum spanning tree weight [Chazelle, Rubinfeld, Trevisan'05], average distance [Goldreich, Ron'08], number of triangles [Eden et al.'17], etc.

  12. Random variable X over sample space Ω ⊂ R+. Classical sample: one value x ∈ Ω, sampled with probability p_x.

  13. Random variable X over sample space Ω ⊂ R+. Classical sample: one value x ∈ Ω, sampled with probability p_x. Quantum sample: one (controlled-) execution of a quantum sampler S_X or S_X⁻¹, where S_X |0⟩ = ∑_{x∈Ω} √p_x |ψ_x⟩ |x⟩ with ψ_x = arbitrary unit vector.
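
A toy state-vector sketch of such a sampler in Python/NumPy, with a hypothetical four-outcome distribution; the registers |ψ_x⟩ are taken to be trivial, so only the amplitudes √p_x appear:

```python
import numpy as np

# Hypothetical finite sample space and distribution p_x.
omega = np.array([1.0, 2.0, 5.0, 10.0])
p = np.array([0.4, 0.3, 0.2, 0.1])

# State prepared by one call to the sampler S_X: sum over x of sqrt(p_x) |x>
# (the arbitrary unit vectors |psi_x> carry no probability and are dropped here).
state = np.sqrt(p)

# Measuring the |x> register yields x with probability |amplitude|^2 = p_x.
measured_probs = np.abs(state) ** 2
print(dict(zip(omega.tolist(), measured_probs.tolist())))

# Measuring a quantum sample therefore reproduces a classical sample:
print(np.random.choice(omega, p=measured_probs))
```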

  14. Can we use quadratically fewer samples in the quantum setting?

  15. Can we use quadratically fewer samples in the quantum setting? Number of samples, with conditions:
      Classical samples (Chebyshev's inequality): Δ²/ε² samples, assuming Δ² ≥ E(X²)/E(X)².
      [Brassard et al.'11] [Wocjan et al.'09] [Montanaro'15]: √(B/E(X))/ε samples, assuming sample space Ω ⊂ [0,B].
      [Montanaro'15]: Δ²/ε samples, assuming Δ² ≥ E(X²)/E(X)².
      [Li, Wu'17]: (Δ/ε)·log³(H/L) samples, assuming Δ² ≥ E(X²)/E(X)² and L ≤ E(X) ≤ H.
      Our result: (Δ/ε)·log³(H/E(X)) samples, assuming Δ² ≥ E(X²)/E(X)² and E(X) ≤ H.
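
For a sense of scale, a tiny snippet instantiating the Δ-based rows above with hypothetical values (constants and log factors dropped):

```python
# Hypothetical values for the promised bound Delta and the relative error eps.
Delta, eps = 20.0, 0.01
rows = {
    "Classical samples (Chebyshev), ~Delta^2/eps^2": Delta**2 / eps**2,
    "[Montanaro'15], ~Delta^2/eps":                  Delta**2 / eps,
    "Delta/eps-type bounds, up to log factors":      Delta / eps,
}
for name, count in rows.items():
    print(f"{name}: about {count:,.0f} samples")
```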


  20. Our Approach

  21. Amplitude Estimation Algorithm [Brassard et al.'02] [Brassard et al.'11] [Wocjan et al.'09] [Montanaro'15]. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X).

  22. Amplitude Estimation Algorithm [Brassard et al.'02] [Brassard et al.'11] [Wocjan et al.'09] [Montanaro'15]. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X). If B ≤ E(X²)/E(X): the number of samples is O(√(E(X²))/(ε·E(X))).

  23. Amplitude Estimation Algorithm [Brassard et al.'02] [Brassard et al.'11] [Wocjan et al.'09] [Montanaro'15]. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X). If B ≤ E(X²)/E(X): the number of samples is O(√(E(X²))/(ε·E(X))). What if B ≫ E(X²)/E(X)?
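
A numeric sketch of this regime with a hypothetical heavy-tailed distribution: a single very rare, very large outcome makes B huge while E(X²)/E(X) stays small, so the √(B/E(X))/ε bound is far from the hoped-for √(E(X²))/(ε·E(X)):

```python
import math

# Hypothetical X: outcome 1 almost surely, outcome B with a tiny probability.
B, p_big = 1e9, 1e-15
EX  = (1 - p_big) * 1.0 + p_big * B       # ~ 1.000001
EX2 = (1 - p_big) * 1.0 + p_big * B**2    # ~ 1001, so E(X^2)/E(X) ~ 1001 << B

eps = 0.01
n_from_B       = math.sqrt(B / EX) / eps       # bound driven by the sample space [0, B]
n_from_2nd_mom = math.sqrt(EX2) / (eps * EX)   # what replacing B by ~E(X^2)/E(X) would give
print(f"sqrt(B/E(X))/eps        ~ {n_from_B:,.0f} samples")
print(f"sqrt(E(X^2))/(eps E(X)) ~ {n_from_2nd_mom:,.0f} samples")
```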

  24. Random variable X. [Figure: the probabilities p_x (vertical axis, 0 to 1) plotted against the outcomes x (horizontal axis, 0 to B); B is the largest outcome.]

  25. Random variable X_b. [Figure: the same plot with a threshold b ≈ E(X²)/E(X) marked between 0 and B; b is the new largest outcome.]

  26. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X). If B ≤ E(X²)/E(X): the number of samples is O(√(E(X²))/(ε·E(X))). If B ≫ E(X²)/E(X): map the outcomes larger than E(X²)/E(X) to 0.

  27. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X). If B ≤ E(X²)/E(X): the number of samples is O(√(E(X²))/(ε·E(X))). If B ≫ E(X²)/E(X): map the outcomes larger than E(X²)/E(X) to 0. Lemma: If b ≥ E(X²)/(ε·E(X)) then (1 − ε)·E(X) ≤ E(X_b) ≤ E(X).
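
A quick numerical check of the Lemma on a hypothetical three-point distribution: truncating at the threshold b = E(X²)/(ε·E(X)) (sending larger outcomes to 0) removes the huge rare outcome yet loses at most an ε fraction of the mean:

```python
import numpy as np

# Hypothetical distribution: small typical outcomes plus one rare huge outcome.
xs = np.array([1.0, 10.0, 1e6])
ps = np.array([0.95, 0.05 - 1e-8, 1e-8])

EX, EX2 = (ps * xs).sum(), (ps * xs**2).sum()
eps = 0.1
b = EX2 / (eps * EX)                 # smallest threshold allowed by the Lemma

xb = np.where(xs > b, 0.0, xs)       # X_b: outcomes larger than b are mapped to 0
EXb = (ps * xb).sum()

print(f"E(X) = {EX:.4f}, b = {b:.0f}, E(X_b) = {EXb:.4f}")
assert (1 - eps) * EX <= EXb <= EX   # (1 - eps) E(X) <= E(X_b) <= E(X)
```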

  28. Input: Random variable X on sample space Ω ⊂ [0,B]. Ampl-Est: O(√(B/E(X))/ε) quantum samples to obtain |μ̃ − E(X)| ≤ ε·E(X). If B ≤ E(X²)/E(X): the number of samples is O(√(E(X²))/(ε·E(X))). If B ≫ E(X²)/E(X): map the outcomes larger than E(X²)/E(X) to 0. Lemma: If b ≥ E(X²)/(ε·E(X)) then (1 − ε)·E(X) ≤ E(X_b) ≤ E(X). Problem: given Δ² ≥ E(X²)/E(X)², how to find a threshold b ≈ E(X)·Δ²?
