  1. Augmented Index and Quantum Streaming Algorithms for DYCK(2). Ashwin Nayak and Dave Touchette, University of Waterloo and Perimeter Institute. CCC 2017, Riga, Latvia, 8 July 2017

  2. Communication Complexity
  ◦ Communication Complexity setting: Alice holds input X and private randomness R_A; Bob holds input Y and private randomness R_B. They exchange messages C_1 = f_1(X, R_A), C_2 = f_2(Y, R_B, C_1), …, C_M = f_M(Y, R_B, C_{<M}). Output: f(X, Y)
  ◦ How much communication to compute f on X, Y?
  ◦ Take information-theoretic view: Information Complexity
  ◦ How much information to compute f on (X, Y) ∼ μ?
  ◦ Information content of interactive protocols?
  ◦ Classical vs. Quantum?

  3. Communication Complexity (animation frame)
  ◦ Same setting, with the players' local memories now drawn as quantum registers: in round j Alice holds a state on register A_j and Bob a state on register B_j, while the messages C_1 = f_1(X, A, R_A), C_2 = f_2(Y, R_B, C_1), …, C_M = f_M(Y, R_B, C_{<M}) stay classical. Output: f(X, Y). (Remaining bullets as on slide 2.)

  4. Communication Complexity (animation frame)
  ◦ Now the messages are quantum as well: in round j a quantum state is sent in message register C_j, and each player keeps a local quantum register (A_j for Alice, B_j for Bob). Output: f(X, Y). (Remaining bullets as on slide 2.)

  5. Communication Complexity (animation frame)
  ◦ Finally, shared randomness is replaced by a prior shared entangled state |ψ⟩; local and message registers are quantum as on the previous frame. Output: f(X, Y). (Remaining bullets as on slide 2.)

  6. Quantum Communication Complexity
  [Figure: a quantum protocol Π. Starting from inputs (X, Y) ∼ μ and a shared entangled state |ψ⟩, Alice (holding X, with work registers A_0, XA_1, …, XA_N) and Bob (holding Y, with work registers B_0, YB_2, …, YB_{N−1}) alternately apply unitaries U_1, U_2, …, U_N and exchange message registers C_1, C_2, …, C_N. Output: f(X, Y).]

  7. Th.1: Streaming Algorithms for DYCK(2)
  [Figure: a streaming algorithm reads the input x_1, x_2, …, x_N one symbol at a time, starting from the state |0^{s(N)}⟩ and applying an operation O_{x_j} per symbol, with pre- and post-processing; repeated for T passes.]
  ◦ Streaming algorithms: attractive model for early quantum computers
  ◦ Some exponential advantages possible for specially crafted problems [LeG06, GKKRdW07]
  ◦ DYCK(2) = ε + ( DYCK(2) ) + [ DYCK(2) ] + DYCK(2) · DYCK(2), on inputs of length N
  ◦ Classical bound: s(N) ∈ Ω(√N / T) [MMN10, JN10, CCKM10]
  ◦ Two-way classical algorithm: s(N) ∈ O(polylog(N))
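To make the language concrete (this illustrates DYCK(2) itself, not the streaming lower bound): the obvious classical checker keeps a stack of unmatched opening brackets, so its space can grow linearly with N, which is exactly what low-space streaming algorithms must avoid. A minimal sketch, assuming the alphabet {(, ), [, ]} for the two bracket types:

```python
def dyck2(w: str) -> bool:
    """Check membership in DYCK(2): balanced strings over two bracket types."""
    pairs = {')': '(', ']': '['}
    stack = []
    for c in w:
        if c in '([':
            stack.append(c)              # remember the unmatched opening bracket
        elif c in ')]':
            if not stack or stack.pop() != pairs[c]:
                return False             # closing bracket with no matching opener
        else:
            raise ValueError(f"unexpected symbol: {c!r}")
    return not stack                     # accept iff every bracket was matched
```

For example, `dyck2("([])[]")` accepts while `dyck2("([)]")` rejects; the empty string ε is accepted, matching the grammar's first production.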

  8. Th.1: Streaming Algorithms for DYCK(2)
  ◦ Th. 1: Any T-pass 1-way quantum streaming algorithm for DYCK(2) needs space s(N) ∈ Ω(√N / T³) on length-N inputs
  ◦ Even holds for non-unitary streaming operations O
  ◦ Reduction from multi-party QCC to streaming algorithms for DYCK(2) [MMN10]
  ◦ Multi-party problem consists of an OR of multiple instances of a two-party problem
  ◦ Space s(N) in the algorithm corresponds to communication between the parties
  ◦ Consider T-pass, one-way quantum streaming algorithms
  ◦ Direct sum argument allows reducing from a two-party problem, Augmented Index
  ◦ Multi-party QCC lower bound requires a two-party QIC lower bound on an "easy distribution"
  ◦ Subtlety for non-unitary streaming operations O

  9. Th.2: Augmented Index
  ◦ Index: Index_n(x_1 … x_n, i) = x_i
  ◦ Augmented Index: AI_n(x_1 … x_n, i, x_1 … x_{<i}, b) = x_i ⊕ b
  [Figure: Alice holds x = x_1 … x_n; Bob holds the index i ∈ [n], the prefix x_1 … x_{i−1}, and a bit b ∈ {0, 1}; the output is x_i ⊕ b.]
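The function itself is trivial to evaluate; what makes it interesting is how the input is split between the two players. A small sketch of AI_n as a plain function (the 1-based index and the prefix promise follow the definition above):

```python
from typing import Sequence

def augmented_index(x: Sequence[int], i: int, prefix: Sequence[int], b: int) -> int:
    """AI_n(x_1 .. x_n, i, x_1 .. x_{i-1}, b) = x_i XOR b.

    Alice holds the string x; Bob holds the index i (1-based),
    the prefix x_1 .. x_{i-1}, and a bit b.  The promise is that
    Bob's prefix agrees with Alice's string.
    """
    assert list(prefix) == list(x[: i - 1]), "promise violated: prefix mismatch"
    return x[i - 1] ^ b
```

For instance, with x = 1011, i = 3, and b = 1 the answer is x_3 ⊕ 1 = 1 ⊕ 1 = 0.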

  10. Th.2: Augmented Index
  ◦ Th. 2.2: For any r-round protocol Π for AI_n, either
    ◦ QIC^{A→B}(Π, μ_0) ∈ Ω(n / r²), or
    ◦ QIC^{B→A}(Π, μ_0) ∈ Ω(1 / r²)
  ◦ μ_0: the uniform distribution on the zeros of AI_n (the "easy distribution")
  ◦ Classical bounds: either IC^{A→B}(Π, μ_0) ∈ Ω(n) or IC^{B→A}(Π, μ_0) ∈ Ω(1) [MMN10, JN10, CCKM10, CK11]
  ◦ We build on the direct sum approach of [JN10]
  ◦ General approach uses two main tools (Superposition-Average Encoding Theorem, Quantum Cut-and-Paste)
  ◦ More specialized approach uses one more tool (Information Flow Lemma)

  11. Warm-up: Disjointness
  ◦ Disj_n(x, y) = ¬ ⋁_{i∈[n]} (x_i ∧ y_i)
  ◦ CC(Disj_n) ∈ Ω(n)
  [Figure: Alice holds x = x_1 … x_n and Bob holds y = y_1 … y_n; each pair (x_i, y_i) is combined by AND, and the results are combined by an OR over all i.]
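Reading x and y as characteristic vectors of sets, Disj_n outputs 1 exactly when the sets share no element. A one-line sketch of the function as defined above:

```python
from typing import Sequence

def disj(x: Sequence[int], y: Sequence[int]) -> int:
    """Disj_n(x, y) = NOT OR_i (x_i AND y_i): 1 iff no coordinate has x_i = y_i = 1."""
    assert len(x) == len(y)
    return int(not any(xi & yi for xi, yi in zip(x, y)))
```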

  12. Warm-up: Disjointness
  ◦ CC(Disj_n) ≥ IC_0(Disj_n) = n · IC_0(AND) [BJKS02]
  ◦ IC ≤ CC; IC satisfies a direct sum property (needs private and public randomness)
  ◦ IC_0(AND) = (2/3) I(X:M | Y=0) + (2/3) I(Y:M | X=0)
  ◦ Comparing the message transcript M on 01, 00, 10 inputs: ?
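To make the decomposition concrete: under μ_0 (uniform on the three zero-inputs 00, 01, 10 of AND), consider the trivial one-message protocol in which Alice just sends her bit, so M = X. This protocol and the arithmetic below are an illustrative worked example, not from the slides:

```python
from math import log2

# mu_0: uniform over the zeros of AND, i.e. the inputs 00, 01, 10.
zeros = [(0, 0), (0, 1), (1, 0)]
p_y0 = sum(1 for (x, y) in zeros if y == 0) / len(zeros)   # Pr[Y = 0] = 2/3
p_x0 = sum(1 for (x, y) in zeros if x == 0) / len(zeros)   # Pr[X = 0] = 2/3

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Given Y = 0, X is uniform on {0,1} and M = X, so I(X:M|Y=0) = H(X) = 1 bit.
I_X_M_given_Y0 = H([0.5, 0.5])
# Given X = 0, the message M = 0 is constant, so I(Y:M|X=0) = 0.
I_Y_M_given_X0 = 0.0

IC0 = p_y0 * I_X_M_given_Y0 + p_x0 * I_Y_M_given_X0
print(IC0)  # 2/3 of a bit
```

So even this trivial protocol has information cost 2/3 under the easy distribution; the lower-bound argument on the following slides shows every correct protocol pays a constant.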

  13. Tool: Average Encoding Theorem
  ◦ Average Encoding Theorem [KNTZ01]: E_X h²(M_X, M*) ≤ I(X:M)
  ◦ M* = E_X[M_X], the average message
  ◦ h(M_1, M_2): Hellinger distance
  ◦ Follows from Pinsker's inequality
  ◦ Low-information messages are close to the average message
  ◦ For AND, Y = 0: (1/2) h²(M_{00}, M_{*0}) + (1/2) h²(M_{10}, M_{*0}) ≤ I(X:M | Y=0)
  ◦ Using Jensen and the triangle inequality: (1/4) h²(M_{00}, M_{10}) ≤ I(X:M | Y=0)
  ◦ Similarly, for X = 0: (1/4) h²(M_{00}, M_{01}) ≤ I(Y:M | X=0)
  ◦ Comparing 01, 10 inputs: (1/8) h²(M_{10}, M_{01}) ≤ I(X:M | Y=0) + I(Y:M | X=0) = (3/2) IC_0(AND)
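The flavor of the theorem can be sanity-checked numerically. Below, the message M is Alice's uniform bit X sent through a binary symmetric channel with flip probability 0.1 (an illustrative choice, not from the slides); since M* is then the marginal of M, E_X D(M_X ‖ M*) equals I(X:M) in nats, and the Hellinger bound follows from Pinsker-type inequalities. A minimal sketch:

```python
from math import log, sqrt

def hellinger_sq(p, q):
    """Squared Hellinger distance: h^2(P, Q) = 1 - sum_i sqrt(p_i q_i)."""
    return 1.0 - sum(sqrt(pi * qi) for pi, qi in zip(p, q))

def kl(p, q):
    """KL divergence D(P || Q) in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

eps = 0.1                        # channel flip probability (illustrative)
M0 = [1 - eps, eps]              # message distribution when X = 0
M1 = [eps, 1 - eps]              # message distribution when X = 1
Mstar = [0.5, 0.5]               # average message M* for uniform X

lhs = 0.5 * hellinger_sq(M0, Mstar) + 0.5 * hellinger_sq(M1, Mstar)
rhs = 0.5 * kl(M0, Mstar) + 0.5 * kl(M1, Mstar)   # = I(X:M) in nats
assert lhs <= rhs                # low information => close to the average message
```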

  14. Warm-up: Disjointness
  ◦ CC(Disj_n) ≥ IC_0(Disj_n) = n · IC_0(AND) [BJKS02]
  ◦ IC ≤ CC; IC satisfies a direct sum property (needs private and public randomness)
  ◦ IC_0(AND) = (2/3) I(X:M | Y=0) + (2/3) I(Y:M | X=0)
  ◦ Comparing the message transcript M on 01, 00, 10 inputs: (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)

  15. Warm-up: Disjointness
  ◦ CC(Disj_n) ≥ IC_0(Disj_n) = n · IC_0(AND) [BJKS02]
  ◦ IC ≤ CC; IC satisfies a direct sum property (needs private and public randomness)
  ◦ IC_0(AND) = (2/3) I(X:M | Y=0) + (2/3) I(Y:M | X=0)
  ◦ Comparing the message transcript M on 01, 00, 10 inputs: (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)
  ◦ Comparing M on 00, 11 inputs: ?

  16. Tool: Cut-and-Paste Lemma
  ◦ Consider the input subset {x_1, x_2} × {y_1, y_2}
  ◦ The triangle inequality implies, for M on (x_1, y_2) and (x_2, y_1):
    ◦ h(M_{x_1 y_2}, M_{x_2 y_1}) ≤ h(M_{x_1 y_1}, M_{x_1 y_2}) + h(M_{x_1 y_1}, M_{x_2 y_1})
  ◦ What about M on (x_1, y_1) and (x_2, y_2)?
  ◦ Cut-and-Paste Lemma [BJKS02]: h(M_{x_1 y_1}, M_{x_2 y_2}) = h(M_{x_1 y_2}, M_{x_2 y_1})
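For classical private-coin protocols the lemma rests on the rectangle property: the transcript probability factorizes as p(t | x, y) = A(t, x) · B(t, y), which makes the Hellinger affinity symmetric under swapping y_1 and y_2. A numeric check on a randomly generated two-message protocol (all distributions below are illustrative, generated from a fixed seed):

```python
import random
from math import sqrt

random.seed(0)

def rand_dist(k):
    """Random probability distribution on k outcomes."""
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [wi / s for wi in w]

# Two-message private-coin protocol: Alice sends m1 ~ a(.|x),
# then Bob sends m2 ~ b(.|m1, y).  The transcript is t = (m1, m2).
K = 3
a = {x: rand_dist(K) for x in (0, 1)}
b = {(m1, y): rand_dist(K) for m1 in range(K) for y in (0, 1)}

def transcript_dist(x, y):
    """Distribution of the full transcript given inputs (x, y)."""
    return {(m1, m2): a[x][m1] * b[(m1, y)][m2]
            for m1 in range(K) for m2 in range(K)}

def hellinger(p, q):
    return sqrt(1.0 - sum(sqrt(p[t] * q[t]) for t in p))

# Cut-and-paste: h(M_{x1 y1}, M_{x2 y2}) = h(M_{x1 y2}, M_{x2 y1})
lhs = hellinger(transcript_dist(0, 0), transcript_dist(1, 1))
rhs = hellinger(transcript_dist(0, 1), transcript_dist(1, 0))
assert abs(lhs - rhs) < 1e-9
```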

  17. Warm-up: Disjointness
  ◦ CC(Disj_n) ≥ IC_0(Disj_n) = n · IC_0(AND) [BJKS02]
  ◦ IC ≤ CC; IC satisfies a direct sum property (needs private and public randomness)
  ◦ IC_0(AND) = (2/3) I(X:M | Y=0) + (2/3) I(Y:M | X=0)
  ◦ Comparing the message transcript M on 01, 00, 10 inputs: (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)
  ◦ Comparing M on 00, 11 inputs: (1/12) h²(M_{00}, M_{11}) = (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)

  18. Warm-up: Disjointness
  ◦ CC(Disj_n) ≥ IC_0(Disj_n) = n · IC_0(AND) [BJKS02]
  ◦ IC ≤ CC; IC satisfies a direct sum property (needs private and public randomness)
  ◦ IC_0(AND) = (2/3) I(X:M | Y=0) + (2/3) I(Y:M | X=0)
  ◦ Comparing the message transcript M on 01, 00, 10 inputs: (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)
  ◦ Comparing M on 00, 11 inputs: (1/12) h²(M_{00}, M_{11}) = (1/12) h²(M_{10}, M_{01}) ≤ IC_0(AND)
  ◦ Statistical interpretation: h(M_1, M_2) ≥ (1/4) ‖M_1 − M_2‖_TV
  ◦ ‖M_{00} − M_{11}‖_TV ∈ Ω(1), since for AND the protocol must output 0 on input 00 and 1 on input 11
  ◦ Hence CC(Disj_n) ∈ Ω(n). Quantum?
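The statistical interpretation used here follows from the standard relation ‖P − Q‖_TV ≤ √2 · h(P, Q) (taking TV as half the ℓ1 distance), which is stronger than the stated factor 1/4. A quick randomized numeric check, with illustrative sampling:

```python
import random
from math import sqrt

random.seed(1)

def rand_dist(k):
    """Random probability distribution on k outcomes."""
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [wi / s for wi in w]

def hellinger(p, q):
    """Hellinger distance h(P, Q) = sqrt(1 - sum_i sqrt(p_i q_i))."""
    return sqrt(1.0 - sum(sqrt(pi * qi) for pi, qi in zip(p, q)))

def tv(p, q):
    """Total variation distance: half the l1 distance."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

for _ in range(1000):
    p, q = rand_dist(5), rand_dist(5)
    # h >= TV / sqrt(2) >= TV / 4: transcripts that must differ
    # statistically also differ in Hellinger distance.
    assert hellinger(p, q) >= tv(p, q) / 4
```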

  19. Warm-up: Disjointness
  ◦ QCC(Disj_n) ∈ Θ(√n)
  ◦ QCC_r(Disj_n) ∈ O(n/r) for r-round protocols
  ◦ QCC_r(Disj_n) ≥ (1/r) · Q̃IC_0(Disj_n) = (n/r) · Q̃IC_0(AND) [JRS03]
  ◦ Q̃IC ≤ r · QCC; Q̃IC satisfies a direct sum property, requires private and public "randomness"
  ◦ Comparing M on 01, 00, 10 inputs: ?
