Probabilistic Computation, Lecture 13: BPP vs. PH
  1. Probabilistic Computation, Lecture 13: BPP vs. PH

  2. Recap: Probabilistic computation. An NTM (on "random certificates") for L,
     and the acceptance probability Pr[M(x)=yes] for x ∈ L vs. x ∉ L; likewise
     Pr[yes] for a PTM, a BPTM, and an RTM deciding L. (The threshold values
     appear in figures on the slide.)
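The one-sided-error model (the RTM/RP row above) has a classic concrete instance, not from the slides: Freivalds' algorithm for verifying a matrix product. A minimal sketch:

```python
import random

def freivalds_verify(A, B, C, rounds=20, seed=0):
    """Randomized check that A*B == C for n x n integer matrices (lists of lists).

    One-sided error, in the style of a co-RP algorithm: if A*B == C the answer
    is always True; if A*B != C, each round detects the mismatch with
    probability >= 1/2, so a wrong True answer after `rounds` rounds occurs
    with probability at most 2**-rounds.
    """
    rng = random.Random(seed)
    n = len(A)
    for _ in range(rounds):
        x = [rng.randint(0, 1) for _ in range(n)]   # random 0/1 vector
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:          # A(Bx) != Cx certifies A*B != C
            return False
    return True

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(freivalds_verify(A, B, [[19, 22], [43, 50]]))   # correct product
print(freivalds_verify(A, B, [[19, 22], [43, 51]]))   # off by one in one entry
```

Multiplying by a vector costs O(n²) per round instead of the O(n³) of naive matrix multiplication, which is the point of verifying rather than recomputing.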

  3. Recap: PP, RP, co-RP, BPP. PP is too powerful: NP ⊆ PP. RP and BPP are
     defined with a bounded gap, and the gap can be boosted from 1/poly to
     1 - 1/exp; a realistic/useful computational model. Today: NP ⊈ BPP unless
     PH collapses, and BPP ⊆ Σ2P ∩ Π2P.
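The boosting step (gap from 1/poly to 1 - 1/exp) works by taking the majority vote of independent runs. A small sketch, with illustrative parameters not from the slides, computing the exact majority-vote error from the binomial tail:

```python
from math import comb

def majority_error(p, t):
    """Exact probability that the majority vote of t independent runs errs,
    when each run errs independently with probability p < 1/2."""
    return sum(comb(t, i) * p**i * (1 - p)**(t - i)
               for i in range(t // 2 + 1, t + 1))

# A 1/poly gap (base error 0.4, i.e. gap 0.1) shrinks exponentially in t.
print(majority_error(0.4, 1))      # single run: the base error itself
print(majority_error(0.4, 11))     # 11 repetitions
print(majority_error(0.4, 1001))   # 1001 repetitions
```

By a Chernoff bound the error after t runs is at most exp(-Ω(t)·gap²), which is how 1/poly becomes 1 - 1/exp with polynomially many repetitions.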

  4. BPP vs. NP. Can randomized algorithms efficiently decide all NP problems?
     Unlikely: NP ⊆ BPP ⇒ PH = Σ2P. We will show BPP ⊆ P/poly; then
     NP ⊆ BPP ⇒ NP ⊆ P/poly ⇒ PH = Σ2P.

  5. BPP ⊆ P/poly. If the error probability is sufficiently small, there must
     be at least one random tape that works for all 2^n inputs of length n;
     that random tape can then be given as advice. One such random tape exists
     if the average (over x) error probability is less than 2^-n, and for BPP
     the worst-case error probability can indeed be made < 2^-n. [Figure: a
     table of inputs x vs. random tapes r, marking each pair as correct or
     erroneous.]
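The existence claim is a union bound over inputs. A toy sketch, with synthetic "bad tape" sets standing in for the machine's actual behavior (all parameters are illustrative):

```python
import random
from itertools import product

# Toy version of the counting argument: n-bit inputs, m-bit random tapes,
# and for every input fewer than a 2**-n fraction of the tapes err.
n, m = 3, 8
inputs = list(product([0, 1], repeat=n))          # all 2**n = 8 inputs
num_tapes = 2 ** m                                 # 2**m = 256 tapes

rng = random.Random(1)
max_bad = num_tapes // 2 ** n - 1                  # strictly < 2**-n fraction
bad = {x: set(rng.sample(range(num_tapes), max_bad)) for x in inputs}

# Union bound: at most 2**n * max_bad < 2**m tapes are bad for some input,
# so at least one tape is correct on every input; it can serve as advice.
good = [r for r in range(num_tapes)
        if all(r not in bad[x] for x in inputs)]
print(len(good))    # guaranteed >= 2**m - 2**n * max_bad = 8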

  6. BPP vs. PH. BPP ⊆ Σ2P; since BPP is closed under complement, it follows
     that BPP ⊆ Σ2P ∩ Π2P.

  7. BPP ⊆ Σ2P. x ∈ L: M(x,r)=yes "for almost all" r. x ∉ L: M(x,r)=yes for
     very few r. So L = { x | for almost all r, M(x,r)=yes }; if it were "for
     all", L would be in coNP. Instead, L = { x | ∃ a small "neighborhood",
     ∀ r', for some r "near" r', M(x,r)=yes }. Note: the neighborhood of r' is
     small (polynomially large), so all of it can be checked in polynomial
     time.

  8. BPP ⊆ Σ2P. The space of random tapes is {0,1}^m; let
     Yes_x = { r | M(x,r)=yes }. x ∈ L: |Yes_x| > (1 - 2^-n)·2^m.
     x ∉ L: |Yes_x| < 2^-n·2^m. For x ∈ L, we will show there exists a small
     set of shifts of Yes_x that covers all r's; for x ∉ L, Yes_x is very
     small, so its few shifts cover only a small region.

  9. BPP ⊆ Σ2P. "A small set of shifts": P = {u_1, u_2, ..., u_k}, and
     P(r) = { r⊕u_1, r⊕u_2, ..., r⊕u_k }, where r and the u_i are m-bit
     strings and k is "small" (poly(n)). For each x ∈ L, does there exist a P
     such that P(Yes_x) := ∪_{r ∈ Yes_x} P(r) = {0,1}^m? Yes! For every
     sufficiently large S (like Yes_x) one can find a P with P(S) = {0,1}^m;
     in fact, most P work (if k is big enough)!
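The claim that most P work can be checked directly at toy scale. The parameters below (m = 10, k = 10, |S| = (15/16)·2^m) are illustrative, not from the slides; with these choices the chance that the shifted copies miss any point is at most 2^m · (1/16)^k = 2^-30:

```python
import random

# Toy check: S is a large subset of {0,1}^m (playing the role of Yes_x),
# and P = {u_1,...,u_k} is a set of k uniformly random shifts.
m, k = 10, 10
universe = set(range(2 ** m))

rng = random.Random(0)
missing = set(rng.sample(sorted(universe), 2 ** m // 16))   # the 1/16 outside S
S = universe - missing

shifts = [rng.randrange(2 ** m) for _ in range(k)]          # P = {u_1,...,u_k}
covered = {r ^ u for r in S for u in shifts}                # P(S) = union of shifted copies
print(covered == universe)
```

The Σ2 predicate for x ∈ L is exactly this check: ∃ u_1,...,u_k ∀ r' some r'⊕u_i lands in Yes_x, verifiable in polynomial time since k is poly(n).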

  10. BPP ⊆ Σ2P. The Probabilistic Method (finding hay in a haystack): to
      prove that a P with some property exists, show that a random P has the
      property with nonzero probability.
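The probabilistic-method step can be made quantitative: a fixed point r' is uncovered by k uniformly random shifts with probability at most (2^-n)^k, so by linearity of expectation the expected number of uncovered points is below 1 once 2^m · 2^(-nk) < 1, i.e. k > m/n. A sketch (parameters illustrative):

```python
def shifts_needed(n, m):
    """Smallest k with 2**m * (2**-n)**k < 1, i.e. n*k > m: the expected
    number of points of {0,1}^m left uncovered by k random shifts is then
    below 1, so some choice of k shifts covers everything."""
    return m // n + 1

# m (the random-tape length) is poly(n), so k is poly(n) too.
n, m = 10, 1000       # illustrative: tape length m = n**3
k = shifts_needed(n, m)
print(k)              # 101
assert 2 ** m < 2 ** (n * k)
```

Since k is polynomial, guessing P existentially and checking all k shifts of each r' universally stays within Σ2P, completing the argument sketched on the slides.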
