Typically-Correct Derandomization for Small Time and Space
William M. Hoza¹
University of Texas at Austin
July 18, CCC 2019
¹Supported by the NSF GRFP under Grant No. DGE1610403 and by a Harrington fellowship from UT Austin

Time, space, and randomness

Derandomization
◮ Suppose L ∈ BPTISP(T, S)
  ◮ T = T(n) ≥ n
  ◮ S = S(n) ≥ log n
◮ Theorem [Klivans, van Melkebeek '02]:
  ◮ Assume some language in DSPACE(O(n)) has exponential circuit complexity
  ◮ Then L ∈ DTISP(poly(T), S)
◮ Theorem [Nisan, Zuckerman '96]:
  ◮ Suppose S ≥ T^Ω(1)
  ◮ Then L ∈ DSPACE(S) (runtime 2^Θ(S))

Main result
◮ Suppose L ∈ BPTISP(T, S)
  ◮ T = T(n) ≥ n
  ◮ S = S(n) ≥ log n
◮ Theorem:
  ◮ Suppose T ≤ n · poly(S)
  ◮ Then there is a DSPACE(S) algorithm for L...
  ◮ ...that succeeds on the vast majority of inputs of each length.
◮ Think T = Õ(n), S = O(log n)
◮ [Saks, Zhou '95]: Space Θ(log^1.5 n)

Typically-correct derandomizations
◮ Is 110100001101001111001010110111011010100011100 ∈ L?
◮ If only we had some randomness...
◮ Let A be a randomized algorithm
◮ Naïve derandomization: Run A(x, x)
◮ Might fail on all x because of correlations between the input and the coins
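As a toy illustration (not from the talk; `randomized_A` and its sampling rule are invented stand-ins for an arbitrary randomized algorithm), the naïve derandomization A(x, x) can be sketched in Python. The point is that feeding the input back in as the coin string lets the coins correlate with the input, which can bias the answer on every x:

```python
def randomized_A(x: str, coins: str) -> bool:
    """Toy randomized algorithm: uses each coin bit to decide whether to
    sample the matching input bit, then estimates whether 1s are a majority.
    Stands in for an arbitrary A consuming |x| random bits."""
    sampled = [int(b) for b, c in zip(x, coins) if c == "1"]
    return sum(sampled) * 2 > len(sampled) if sampled else False

def naive_derandomize(x: str) -> bool:
    # Naive derandomization: reuse the input itself as the coin string.
    # Here the "coins" equal 1 exactly where x has a 1, so the sample
    # contains only 1s -- the estimate is biased on every nonzero x.
    return randomized_A(x, x)
```

With honest coins `randomized_A` can answer correctly, but `naive_derandomize` accepts any input containing a 1, illustrating how input/coin correlations can defeat the naïve approach.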

Prior techniques for dealing with correlations
1. Find algorithm A where most random strings are good for all inputs simultaneously
  ◮ [Goldreich, Wigderson '02]: Undirected s-t connectivity
  ◮ [Arvind, Torán '04]: Solvable group isomorphism
2. Extract randomness from input using specialized extractor
  ◮ [Zimand '08]: Sublinear-time algorithms
  ◮ [Shaltiel '11]: Two-party communication protocols, streaming algorithms, BPAC^0
3. Plug input into seed-extending pseudorandom generator
  ◮ [Kinne, van Melkebeek, Shaltiel '12]: Multiparty communication protocols, BPAC^0 with symmetric gates

Our technique: "Out of sight, out of mind"
◮ Use part of the input as a source of randomness while A is processing the rest of the input
◮ [Figure: the input string 110100001101001111001010110111011010100011100, with the bits surrounding A's current position read as coin tosses (T/H)]
◮ (Additional ideas needed to make this work...)
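A minimal sketch of the idea, under stated assumptions (the function names, the fixed split point k, and the toy algorithm are all my own illustration, and the talk stresses that additional ideas are needed to make the real construction work): the coins handed to A are drawn from a part of the input that A never reads, so they cannot correlate with the bits A actually sees:

```python
def out_of_sight(x: str, A, k: int) -> bool:
    """Run A on the first k bits of x, using the remaining bits of x as
    the coin string. Since A never reads the part of the input supplying
    its coins, those coins can't correlate with what A sees."""
    visible, hidden = x[:k], x[k:]
    return A(visible, coins=hidden)

# Toy A: accept iff the visible part has an even number of 1s.
parity_A = lambda v, coins: v.count("1") % 2 == 0
```

This only handles one fixed split; the talk's construction must also cover the hidden part of the input, which is where the additional ideas come in.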

Restriction of algorithm
110100001101001111001010110111011010100011100
◮ Let I ⊆ [n]
◮ Algorithm A|_{[n]\I}:
  1. Run A like normal...
  2. ...except, if A is about to query x_i for some i ∈ I, halt immediately.
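The two steps above can be sketched concretely. Here A is modeled (an assumption made for illustration, not the talk's formalism) as a function that reads its input only through a query callback, so "halt immediately" is just an exception:

```python
class QueryInI(Exception):
    """Raised when A is about to query x_i for some i in I."""

def restrict(A, I):
    """Build the restriction A|_{[n] \\ I}: run A like normal, except halt
    immediately if A tries to read an input position whose index is in I."""
    def restricted(x):
        def query(i):
            if i in I:
                raise QueryInI(i)   # A is about to query x_i, i in I: halt
            return x[i]
        try:
            return A(query, len(x))
        except QueryInI:
            return None             # halted without producing an answer
    return restricted

# Toy A: parity of all input bits, read via the query callback.
def parity(query, n):
    return sum(int(query(i)) for i in range(n)) % 2
```

For example, `restrict(parity, I={3})` behaves exactly like `parity` on length-3 inputs but halts on length-4 inputs, since `parity` eventually queries index 3.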

Main Lemma: Reducing randomness to polylog n
◮ Main Lemma:
  ◮ Suppose L ∈ BPTISP(Õ(n), log n)
  ◮ There is a BPL algorithm for L that uses just polylog n random bits (one-way access)...
  ◮ ...that succeeds on the vast majority of inputs of each length.
