Quantum Chebyshev's Inequality and Applications

Yassine Hamoudi, Frédéric Magniez
IRIF, Université de Paris, CNRS
Mean Estimation Problem

How many i.i.d. samples x_1, x_2, … from some unknown bounded r.v. X ∈ [0, B] do we need to compute μ̃ such that |μ̃ − E(X)| ≤ ε·E(X) with probability 2/3?

Sample mean: μ̃ = (x_1 + … + x_n) / n

Chernoff's bound:       B / (ε² E(X)) samples
Bernstein's inequality: Var(X) / (ε² E(X)²) + B / (ε E(X)) samples   (using Var(X) ≤ B·E(X))
Chebyshev's inequality: Var(X) / (ε² E(X)²) samples

In practice we often know a bound Δ² ≥ E(X²)/E(X)² = Var(X)/E(X)² + 1, and take Δ²/ε² samples.
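These classical bounds can be sanity-checked numerically. A minimal sketch (the Bernoulli-type distribution, the constants, and the 10× oversampling factor are illustrative choices, not from the talk):

```python
import random

def sample_mean(draw, n):
    """The sample-mean estimator: average n i.i.d. samples from draw()."""
    return sum(draw() for _ in range(n)) / n

# Illustrative bounded r.v. X in [0, B]: value B with probability q, else 0.
B, q = 100.0, 0.02
mean, var = q * B, q * (1 - q) * B**2        # E(X) = 2, Var(X) = 196

eps = 0.5
# Sample counts suggested by two of the bounds (up to constants):
n_chernoff  = int(B / (eps**2 * mean)) + 1           # B / (eps^2 E(X))
n_chebyshev = int(var / (eps**2 * mean**2)) + 1      # Var(X) / (eps^2 E(X)^2)

random.seed(0)
draw = lambda: B if random.random() < q else 0.0
est = sample_mean(draw, 10 * n_chebyshev)            # oversample for the demo
```

Here n_chebyshev ≈ Var(X)/(ε² E(X)²) = 197, and the averaged estimate lands within relative error ε of E(X) with high probability.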
Applications

Counting with Markov chain Monte Carlo methods: Counting vs. sampling [Jerrum, Sinclair'96] [Štefankovič et al.'09], Volume of convex bodies [Dyer, Frieze'91], Permanent [Jerrum, Sinclair, Vigoda'04]

Data stream model: Frequency moments, Collision probability [Alon, Matias, Szegedy'99] [Monemizadeh, Woodruff] [Andoni et al.'11] [Crouch et al.'16]

Testing properties of distributions: Closeness [Goldreich, Ron'11] [Batu et al.'13] [Chan et al.'14], Conditional independence [Canonne et al.'18]

Estimating graph parameters: Number of connected components, Minimum spanning tree weight [Chazelle, Rubinfeld, Trevisan'05], Average distance [Goldreich, Ron'08], Number of triangles [Eden et al.'17], etc.
Quantum Mean Estimation Problem

Random variable X on finite sample space Ω ⊂ [0, B].

Classical sample: one value x ∈ Ω, sampled with probability p_x.

Quantum sample: one use of a unitary operator S_X or S_X⁻¹ satisfying

    S_X |0⟩ = Σ_{x ∈ Ω} √(p_x) |ψ_x⟩ |x⟩,   where each |ψ_x⟩ is an arbitrary unit vector.

Question: can we estimate E(X) with fewer samples in the quantum setting?
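In a classical simulation, a quantum sample state can be represented by its amplitude vector: measuring S_X |0⟩ in the computational basis returns x with probability p_x. A minimal sketch (the distribution is illustrative, and the arbitrary unit vectors |ψ_x⟩ are omitted since they do not affect measurement probabilities on the |x⟩ register):

```python
import math

# Illustrative distribution p over a finite sample space Omega ⊂ [0, B].
omega = [0.0, 1.0, 3.0, 8.0]
p     = [0.5, 0.25, 0.15, 0.1]

# Amplitudes of S_X |0> = sum_x sqrt(p_x) |x>.
amplitudes = [math.sqrt(px) for px in p]

# The state is a unit vector, and squaring the amplitudes recovers p.
norm_sq   = sum(a * a for a in amplitudes)
recovered = [a * a for a in amplitudes]
```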
Quantum Mean Estimation Problem

                          Classical samples           Quantum samples
                          B / (ε² E(X))               B / (ε E(X))
                          (Chernoff)                  (Amplitude Estimation)

given Δ² ≥ E(X²)/E(X)²:   Δ² / ε²                     Δ² / ε
                          (Chebyshev)                 [Montanaro'15]

Our contribution: (Δ / ε) · log³(B / (ε E(X))) quantum samples
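Dropping constants and log factors, the quadratic gap between the classical Δ²/ε² count and the quantum Δ/ε count can be made concrete (the instance below is illustrative; Δ² is taken tight at E(X²)/E(X)²):

```python
import math

B, eps = 10_000.0, 0.1
mean, second_moment = 2.0, 500.0             # E(X), E(X^2) for some X in [0, B]
delta2 = second_moment / mean**2             # Δ² ≥ E(X²)/E(X)², tight here

n_chernoff  = B / (eps**2 * mean)            # classical, Chernoff
n_chebyshev = delta2 / eps**2                # classical, Chebyshev
n_amp_est   = B / (eps * mean)               # quantum, Amplitude Estimation
n_quantum   = math.sqrt(delta2) / eps        # quantum, this work (log factors dropped)
```

On this instance the quantum count √Δ²/ε ≈ 112 is the square root of the classical Chebyshev count Δ²/ε² = 12 500.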
Our Approach
Amplitude Estimation: B / (ε E(X)) quantum samples to estimate E(X).

If B ≤ E(X²)/E(X): the number of samples is O(E(X²) / (ε E(X)²)).

If B ≫ E(X²)/E(X): map the outcomes larger than E(X²)/E(X) to 0.

[Figure: probability distribution p_x of X over [0, B]; after truncation the new largest outcome is b ≈ E(X²)/E(X).]
Lemma: If b ≥ E(X²) / (ε E(X)), then (1 − ε) E(X) ≤ E(X_b) ≤ E(X).

⇒ We can equivalently estimate the mean of X_b for any b ≥ E(X²) / (ε E(X)).

Problem: E(X²)/E(X) is unknown… but we know Δ² ≥ E(X²)/E(X)². Can we take b ≈ E(X)·Δ²?
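The truncation lemma can be checked numerically on a small finite distribution (a sketch; the outcomes, probabilities, and ε below are illustrative):

```python
# Illustrative finite r.v. with a rare large outcome.
outcomes = [0.0, 1.0, 2.0, 1000.0]
probs    = [0.5, 0.3, 0.1999, 0.0001]

mean = sum(p * x for p, x in zip(probs, outcomes))       # E(X)
m2   = sum(p * x * x for p, x in zip(probs, outcomes))   # E(X^2)

eps = 0.25
b   = m2 / (eps * mean)               # threshold from the lemma

# X_b: outcomes larger than b are mapped to 0.
mean_b = sum(p * (x if x <= b else 0.0) for p, x in zip(probs, outcomes))
```

Here b ≈ 506 < 1000, so the rare large outcome really is truncated away, yet E(X_b) stays within the (1 − ε) window around E(X).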
Objective: given Δ² ≥ E(X²)/E(X)², how do we find a threshold b ≈ E(X)·Δ²?

Solution: use the Amplitude Estimation algorithm (again) to do a logarithmic search on b.

    Threshold          Input r.v.    Number of samples    Amplitude Estimation
    b_0 = B·Δ²         X_{b_0}       Δ                    μ̃_0
    b_1 = (B/2)·Δ²     X_{b_1}       Δ                    μ̃_1
    b_2 = (B/4)·Δ²     X_{b_2}       Δ                    μ̃_2
    …                  …             …                    …

Stopping rule: output b_i for the first i with μ̃_i ≠ 0.
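The halving search can be mimicked classically as a control-flow sketch; a Monte Carlo average stands in for Amplitude Estimation (so the per-round sample count Δ and the detection behavior are only loosely analogous), and the distribution is illustrative:

```python
import random

def truncate(x, b):
    """X_b: outcomes larger than the threshold b are mapped to 0."""
    return x if x <= b else 0.0

def estimate(draw, b, n):
    """Monte Carlo stand-in for Amplitude Estimation applied to X_b."""
    return sum(truncate(draw(), b) for _ in range(n)) / n

def halving_search(draw, B, delta2, n, max_rounds=60):
    """Try thresholds b_i = (B / 2^i) * delta2 and output the first one
    whose estimate is nonzero, as in the stopping rule above."""
    b = B * delta2
    for _ in range(max_rounds):
        est = estimate(draw, b, n)
        if est != 0:
            return b, est
        b /= 2
    return b, 0.0

random.seed(1)
B, q = 1000.0, 0.05                    # X = B with probability q, else 0; E(X) = 50
draw = lambda: B if random.random() < q else 0.0
b, est = halving_search(draw, B, delta2=20.0, n=200)
```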