Probabilistic Computation
Lecture 13: BPP vs. PH
Recap: Probabilistic computation
- NTM (on “random certificates”) for L: Pr[M(x) = yes] for x ∉ L vs. x ∈ L
- PTM for L: Pr[yes] for x ∉ L vs. x ∈ L
- BPTM for L: Pr[yes] for x ∉ L vs. x ∈ L
- RTM for L: Pr[yes] for x ∉ L vs. x ∈ L
[Figure: acceptance-probability ranges for x ∉ L and x ∈ L under each machine model]
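For reference, one standard way to tabulate these acceptance-probability conditions (the usual textbook constants; the lecture's exact thresholds for PTM/BPTM/RTM may differ, since the gap is later boosted anyway):

```latex
% Standard acceptance-probability conditions (usual conventions;
% the lecture's exact constants may differ).
\begin{tabular}{lll}
Model & $x \in L$ & $x \notin L$ \\ \hline
NTM (NP)   & $\Pr[M(x)=\text{yes}] > 0$   & $\Pr[M(x)=\text{yes}] = 0$ \\
PTM (PP)   & $\Pr[\text{yes}] > 1/2$      & $\Pr[\text{yes}] \le 1/2$ \\
BPTM (BPP) & $\Pr[\text{yes}] \ge 2/3$    & $\Pr[\text{yes}] \le 1/3$ \\
RTM (RP)   & $\Pr[\text{yes}] \ge 1/2$    & $\Pr[\text{yes}] = 0$ \\
\end{tabular}
```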
Recap: PP, RP, co-RP, BPP
- PP too powerful: NP ⊆ PP
- RP, BPP: defined with a bounded gap
- Gap can be boosted from 1/poly to 1 − 1/exp
- A realistic/useful computational model
Today:
- NP ⊈ BPP, unless PH collapses
- BPP ⊆ Σ₂P ∩ Π₂P
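As a quick illustration of the boosting step (a minimal simulation, not from the lecture: it assumes a hypothetical BPTM that answers correctly with probability 1/2 + ε on every input, with ε = 0.05 and the trial counts chosen arbitrarily), majority voting over t independent runs is wrong with probability at most exp(−2tε²) by a Chernoff bound:

```python
# Minimal simulation (illustrative only) of gap amplification by majority vote.
import random

def one_run(correct_prob: float) -> bool:
    """One run of the hypothetical BPTM; True means it answered correctly."""
    return random.random() < correct_prob

def majority_correct(t: int, correct_prob: float) -> bool:
    """Run t independent trials; report whether the majority answer is correct."""
    correct = sum(one_run(correct_prob) for _ in range(t))
    return 2 * correct > t

def empirical_error(t: int, correct_prob: float = 0.55, trials: int = 10_000) -> float:
    """Estimate the error probability of the t-fold majority vote."""
    wrong = sum(not majority_correct(t, correct_prob) for _ in range(trials))
    return wrong / trials

if __name__ == "__main__":
    # The error drops rapidly as t grows (Chernoff: at most exp(-2*t*eps^2)).
    for t in (1, 11, 101, 1001):
        print(f"t = {t:5d}  empirical error ~ {empirical_error(t):.4f}")
```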
BPP vs. NP
- Can randomized algorithms efficiently decide all NP problems?
- Unlikely: NP ⊆ BPP ⇒ PH = Σ₂P
- Will show BPP ⊆ P/poly
- Then NP ⊆ BPP ⇒ NP ⊆ P/poly ⇒ PH = Σ₂P (the last implication is the Karp-Lipton theorem)
BPP ⊆ P/poly
- If the error probability is sufficiently small, we will show there is at least one random tape that works for all 2^n inputs of length n
- Then that random tape can be given as the advice string
- One such random tape exists if the average (over x) error probability is less than 2^-n
- BPP: can make the worst-case error probability < 2^-n
[Figure: table with rows indexed by inputs x and columns by random tapes r; ☑ marks pairs where M(x,r) answers correctly, ☒ where it errs]
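A worked version of the counting step (a standard union bound; the name r_n for the chosen advice string is mine):

```latex
% If every input x of length n has error probability (over r) below 2^{-n}:
\[
\Pr_{r}\bigl[\exists\, x \in \{0,1\}^{n}:\ M(x,r)\ \text{errs}\bigr]
\;\le\; \sum_{x \in \{0,1\}^{n}} \Pr_{r}\bigl[M(x,r)\ \text{errs}\bigr]
\;<\; 2^{n} \cdot 2^{-n} \;=\; 1 .
\]
% So some fixed random tape r_n is correct on all 2^n inputs of length n;
% hard-wiring r_n as the advice string yields a P/poly machine.
```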
BPP vs. PH
- BPP ⊆ Σ₂P (shown next)
- Since BPP is closed under complement, it follows that BPP ⊆ Π₂P as well, so BPP ⊆ Σ₂P ∩ Π₂P
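In symbols, the complementation step is:

```latex
\[
L \in \mathrm{BPP}
\;\Longrightarrow\; \overline{L} \in \mathrm{BPP} \subseteq \Sigma_2^{P}
\;\Longrightarrow\; L \in \Pi_2^{P},
\qquad\text{hence}\quad \mathrm{BPP} \subseteq \Sigma_2^{P} \cap \Pi_2^{P}.
\]
```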
BPP ⊆ Σ₂P
- x ∈ L: “for almost all” r, M(x,r) = yes
- x ∉ L: M(x,r) = yes for very few r
- L = { x | for almost all r, M(x,r) = yes }
- If it were “for all r”, L would be in coNP
- L = { x | ∃ a small “neighborhood”, ∀ r', for some r “near” r', M(x,r) = yes }
- Note: the neighborhood of r' is small (polynomially many points), so we can go through all of it in polynomial time
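Written out (anticipating the shift notation u_1, ..., u_k and ⊕ made precise on the next slides), the characterization is a Σ₂ predicate:

```latex
\[
x \in L
\iff
\exists\, u_1,\dots,u_k \in \{0,1\}^{m}\;
\forall\, r' \in \{0,1\}^{m}\;
\bigvee_{i=1}^{k} M\bigl(x,\, r' \oplus u_i\bigr) = \text{yes},
\]
% with k = poly(n), so the inner disjunction is checkable in polynomial time;
% this is exactly the shape of a Sigma_2^P predicate.
```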
BPP ⊆ Σ₂P
- Space of random tapes = {0,1}^m; Yes_x = { r | M(x,r) = yes }
- x ∈ L: |Yes_x| > (1 − 2^-n)·2^m
- x ∉ L: |Yes_x| < 2^-n·2^m
- x ∈ L: will show that there exists a small set of shifts of Yes_x that covers all r's
- x ∉ L: Yes_x is very small, so its few shifts cover only a small region
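For the x ∉ L direction, the covering already fails by a counting argument (here u_1, ..., u_k are the shifts introduced on the next slide, and Yes_x ⊕ u_i denotes { r ⊕ u_i : r ∈ Yes_x }):

```latex
\[
\Bigl|\,\bigcup_{i=1}^{k} \bigl(\mathrm{Yes}_x \oplus u_i\bigr)\Bigr|
\;\le\; k \cdot |\mathrm{Yes}_x|
\;<\; k \cdot 2^{-n} \cdot 2^{m}
\;<\; 2^{m}
\qquad\text{whenever } k < 2^{n},
\]
% so for k = poly(n) no choice of shifts can cover all of {0,1}^m,
% and the "forall r'" clause of the Sigma_2 predicate fails, as required.
```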
BPP ⊆ Σ₂P
- “A small set of shifts”: P = {u_1, u_2, ..., u_k}; P(r) = { r ⊕ u_1, r ⊕ u_2, ..., r ⊕ u_k }, where r and the u_i are m-bit strings, and k is “small” (poly(n))
- For each x ∈ L, does there exist a P s.t. P(Yes_x) := ∪_{r ∈ Yes_x} P(r) = {0,1}^m?
- Yes! For all large S (like Yes_x) we can indeed find a P s.t. P(S) = {0,1}^m
- In fact, most P work (if k is big enough)!
BPP ⊆ Σ₂P
- Probabilistic Method (finding hay in a haystack)
- To prove ∃ P with some property: show that a P chosen at random has the property with positive probability
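Here is how that calculation goes in this setting (a standard sketch of the shift-covering argument; the choice k = ⌈m/n⌉ + 1 is one sufficient value, not necessarily the lecture's):

```latex
% Pick u_1, ..., u_k uniformly and independently from {0,1}^m, and fix any r'.
\[
\Pr\bigl[r' \notin P(\mathrm{Yes}_x)\bigr]
= \prod_{i=1}^{k}\Pr\bigl[r' \oplus u_i \notin \mathrm{Yes}_x\bigr]
< \bigl(2^{-n}\bigr)^{k},
\]
% using |Yes_x| > (1 - 2^{-n}) 2^m. A union bound over all 2^m choices of r' gives
\[
\Pr\bigl[\exists\, r':\ r' \notin P(\mathrm{Yes}_x)\bigr]
< 2^{m} \cdot 2^{-nk} < 1
\qquad\text{for } k = \lceil m/n \rceil + 1,
\]
% so some (in fact most) choices of P = {u_1, ..., u_k} cover all of {0,1}^m.
```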