The Prokofiev-Svistunov-Ising process is rapidly mixing
Tim Garoni, School of Mathematical Sciences, Monash University
Collaborators: Andrea Collevecchio, Hyndman, Tokarev
Outline: Introduction · Main Theorem · Proof · Discussion


Exact solutions
◮ G_n = Z_n solved by Ising (1925)
◮ G_n = K_n solved by Husimi (1953) and Temperley (1954)
◮ G_n = Z_n^2 solved by Peierls (1936), Kramers & Wannier (1941), Onsager (1944), Yang (1952), Kasteleyn (1963), Fisher (1966)
◮ Smirnov (2006) proved that critical interfaces between +/− components have a conformally invariant limit: SLE(3), the Schramm-Loewner Evolution
◮ Kasteleyn-Fisher related the Ising partition function to perfect matchings: an elegant solution on planar graphs in terms of Pfaffians, but only tractable on graphs of bounded (small) genus; the method is not tractable for G_n = Z_n^d with d ≥ 3

Computational Complexity
◮ Partition: Input: a finite graph G = (V, E) and parameters β, h. Output: the Ising partition function
◮ Correlation: Input: a finite graph G = (V, E), a pair u, v ∈ V, and parameters β, h. Output: the Ising two-point correlation function
◮ Susceptibility: Input: a finite graph G = (V, E) and parameters β, h. Output: the Ising susceptibility

Proposition (Jerrum-Sinclair 1993; Sinclair-Srivastava 2014)
Partition, Susceptibility and Correlation are #P-hard.

Markov-chain Monte Carlo
◮ Construct a transition matrix P on Ω which is ergodic and has stationary distribution π(·)
◮ Generate random samples with (approximate) distribution π
◮ Estimate π-expectations using sample means
◮ Convergence is geometric:
    d(t) := max_{x ∈ Ω} ‖P^t(x, ·) − π(·)‖ ≤ C α^t  for some α ∈ (0, 1)
◮ The mixing time quantifies the rate of convergence:
    t_mix(δ) := min{ t : d(t) ≤ δ }
◮ How does t_mix depend on the size of Ω?
◮ For Ising, the size of a problem instance is n = |V|
◮ If t_mix = O(poly(n)) we have rapid mixing
◮ |Ω| = 2^n, so rapid mixing implies only logarithmically-many states need be visited to reach approximate stationarity
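The definitions above can be checked exactly on a toy chain. The sketch below (my own illustration, not from the talk) takes a lazy random walk on the 3-vertex path, computes d(t) by powering P, and reads off t_mix(δ):

```python
# Toy illustration (not from the talk): worst-case total-variation distance
# d(t) and mixing time t_mix(delta) for a lazy random walk on the path 0-1-2.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]  # stationary distribution (proportional to vertex degrees)

def step(rows, P):
    """One matrix multiplication: rows @ P."""
    return [[sum(r[k] * P[k][j] for k in range(len(P))) for j in range(len(P))]
            for r in rows]

def tv(p, q):
    """Total variation distance between two distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def d(t):
    """d(t) = max_x ||P^t(x, .) - pi||, the worst-case distance to stationarity."""
    rows = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]  # P^0 = I
    for _ in range(t):
        rows = step(rows, P)
    return max(tv(r, pi) for r in rows)

def t_mix(delta):
    """Smallest t with d(t) <= delta."""
    t = 0
    while d(t) > delta:
        t += 1
    return t

print(d(0), d(1), t_mix(0.25))  # prints: 0.75 0.25 1
```

This tiny chain already mixes at t = 1 for δ = 1/4; the point of the talk is how the analogous t_mix scales when Ω grows exponentially in n.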

Markov chains for the Ising model
◮ Glauber process (arbitrary field), 1963
    ◮ Lubetzky & Sly (2012): rapidly mixing on boxes in Z^2 at h = 0 iff β ≤ β_c
    ◮ Levin, Luczak & Peres (2010): precise asymptotics on K_n for h = 0
    ◮ Not used by computational physicists
◮ Swendsen-Wang process (zero field), 1987
    ◮ Simulates a coupling of the Ising and Fortuin-Kasteleyn models
    ◮ Long, Nachmias, Ding & Peres (2014): precise asymptotics on K_n
    ◮ Ullrich (2014): rapidly mixing on boxes in Z^2 at all β ≠ β_c
    ◮ Empirically fast; state of the art from 1987 until recently
◮ Jerrum-Sinclair process (positive field), 1993
    ◮ Simulates high-temperature graphs for h > 0
    ◮ Proved rapidly mixing on all graphs at all temperatures for all h > 0
    ◮ No empirical results; not used by computational physicists
◮ Prokofiev-Svistunov worm process (zero field), 2001
    ◮ No rigorous results currently known
    ◮ Empirically, the best method known for susceptibility (Deng, G., Sokal)
    ◮ Widely used by computational physicists

Mixing time bound for PS process
Theorem (Collevecchio, G., Hyndman, Tokarev 2014+)
For any temperature, the mixing time of the PS process on a graph G = (V, E) satisfies t_mix(δ) = O(Δ(G) m^2 n^5), with n = |V|, m = |E| and Δ(G) the maximum degree.

This is the only Markov chain for the Ising model currently known to be rapidly mixing at the critical point for boxes in Z^d.

High-temperature expansions and the PS measure
◮ Let C_k = { A ⊆ E : (V, A) has k odd vertices }
◮ Let C_W = { A ⊆ E : W is the set of odd vertices in (V, A) }
◮ The PS measure is defined on the configuration space C_0 ∪ C_2 by
    π(A) ∝ x^|A| · { n,  A ∈ C_0;   2,  A ∈ C_2 }
◮ If x = tanh β then:
    ◮ Ising susceptibility: χ = 1 / π(C_0)
    ◮ Ising two-point correlation function: E(σ_u σ_v) = n π(C_{uv}) / (2 π(C_0))
◮ The PS measure is the stationary distribution of the PS process
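Both identities can be verified by brute-force enumeration on a small graph. The sketch below (my own check, not from the talk) compares χ and E(σ_0 σ_1) computed from spin sums against the high-temperature-graph formulas on a 4-cycle:

```python
import itertools
import math

# Brute-force check (not from the talk) of chi = 1/pi(C0) and
# E[s_u s_v] = n pi(C_uv) / (2 pi(C0)) on the 4-cycle, at h = 0.
V = range(4)
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
beta = 0.7
x = math.tanh(beta)
n = len(V)

def odd_set(A):
    """Vertices of odd degree in the subgraph (V, A)."""
    deg = [0] * n
    for u, v in A:
        deg[u] += 1
        deg[v] += 1
    return frozenset(v for v in V if deg[v] % 2 == 1)

# High-temperature sums Z_W = sum over A with odd set W of x^|A|
Z = {}
for r in range(len(E) + 1):
    for A in itertools.combinations(E, r):
        W = odd_set(A)
        Z[W] = Z.get(W, 0.0) + x ** len(A)
Z0 = Z[frozenset()]
Z2 = sum(z for W, z in Z.items() if len(W) == 2)
total = n * Z0 + 2 * Z2                      # PS-measure normalization
pi_C0 = n * Z0 / total
pi_C01 = 2 * Z[frozenset({0, 1})] / total    # pi(C_uv) for u, v = 0, 1

# Direct Ising averages over all 2^n spin configurations
def boltz(s):
    return math.exp(beta * sum(s[u] * s[v] for u, v in E))
Zspin = sum(boltz(s) for s in itertools.product([-1, 1], repeat=n))
def corr(u, v):
    return sum(s[u] * s[v] * boltz(s)
               for s in itertools.product([-1, 1], repeat=n)) / Zspin
chi = sum(corr(u, v) for u in V for v in V) / n

assert abs(chi - 1 / pi_C0) < 1e-12
assert abs(corr(0, 1) - n * pi_C01 / (2 * pi_C0)) < 1e-12
```

The asserts pass because ⟨σ_u σ_v⟩ = Z_{uv}/Z_0 at x = tanh β, from which both slide identities follow by summing over pairs.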

Fully-polynomial randomized approximation schemes
Definition
An fpras for an Ising property f is a randomized algorithm such that for given G, T, and ξ, η ∈ (0, 1) the output Y satisfies P[(1 − ξ) f ≤ Y ≤ (1 + ξ) f] ≥ 1 − η, and the running time is bounded by a polynomial in n, ξ^{-1}, η^{-1}.

Combine rapid mixing of the PS process with the general fpras construction of Jerrum-Sinclair (1993):
◮ Let A ⊆ C_0 ∪ C_2 with π(A) ≥ 1/S(n) for some polynomial S(n)
◮ The following defines an fpras for π(A):
    ◮ Let R(G) be our upper bound for t_mix(δ) with δ = ξ/[8 S(n)]
    ◮ Let Y = 1_A
    ◮ Run the PS process T = R(G) time steps and record Y_T
    ◮ Independently generate 72 ξ^{-2} S(n) such samples and take the sample mean
    ◮ Repeat 6 lg⌈1/η⌉ + 1 such experiments and take the median
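In code, the estimator scaffold reads roughly as follows. This is a sketch under the slide's sample counts (with the repetition count rounded to an integer); `sample_indicator` is a hypothetical stand-in for running the PS process R(G) steps and reporting whether the final state lies in A:

```python
import math
import random
import statistics

def fpras_estimate(sample_indicator, S_n, xi, eta, rng):
    """Median-of-sample-means estimator for pi(A), following the scheme above.

    sample_indicator(rng) is a hypothetical stand-in: it should run the PS
    process for R(G) steps with delta = xi / (8 * S_n) and return 1 if the
    final state lies in A, else 0.
    """
    samples_per_experiment = math.ceil(72 * xi ** -2 * S_n)
    # The slide's "6 lg(1/eta) + 1" repetitions, rounded up to an integer count.
    experiments = 6 * math.ceil(math.log2(1 / eta)) + 1
    means = []
    for _ in range(experiments):
        total = sum(sample_indicator(rng) for _ in range(samples_per_experiment))
        means.append(total / samples_per_experiment)
    return statistics.median(means)

# Sanity check with a known Bernoulli(0.3) "indicator" in place of the PS process.
rng = random.Random(0)
est = fpras_estimate(lambda r: 1 if r.random() < 0.3 else 0,
                     S_n=2, xi=0.5, eta=0.25, rng=rng)
```

The sample mean handles the relative-error guarantee; the median of independent experiments boosts the success probability from a constant to 1 − η.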

Prokofiev-Svistunov process
PS proposals:
◮ If A ∈ C_0:
    ◮ Pick uniformly random u ∈ V
    ◮ Pick uniformly random v ∼ u
    ◮ Propose A → A △ uv
◮ If A ∈ C_2:
    ◮ Pick random odd u ∈ V
    ◮ Pick uniformly random v ∼ u
    ◮ Propose A → A △ uv
Metropolize the proposals with respect to the PS measure π(·)
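As a concrete check, the chain can be built explicitly on a tiny graph and its stationarity verified exactly. The sketch below (my own illustration, not the talk's code) enumerates all states on the triangle K3, Metropolizes the proposals above in exact rational arithmetic, and checks πP = π:

```python
import itertools
from fractions import Fraction

# Exact check (not from the talk) that Metropolizing the PS proposals
# leaves the PS measure stationary, on the triangle K3.
V = [0, 1, 2]
E = [(0, 1), (0, 2), (1, 2)]
x = Fraction(1, 3)

def nbrs(v):
    return [b if a == v else a for (a, b) in E if v in (a, b)]

def odd_set(A):
    deg = {v: 0 for v in V}
    for (u, v) in A:
        deg[u] += 1
        deg[v] += 1
    return frozenset(v for v in V if deg[v] % 2 == 1)

def weight(A):
    """Unnormalized PS measure: n x^|A| on C0, 2 x^|A| on C2."""
    return (len(V) if not odd_set(A) else 2) * x ** len(A)

def propose_prob(A, e):
    """Probability that one PS step proposes flipping edge e from state A."""
    O = odd_set(A)
    p = Fraction(0)
    for w in e:
        if not O:                     # A in C0: w picked uniformly from V
            p += Fraction(1, len(V)) * Fraction(1, len(nbrs(w)))
        elif w in O:                  # A in C2: w picked uniformly from the 2 odd vertices
            p += Fraction(1, 2) * Fraction(1, len(nbrs(w)))
    return p

states = [frozenset(A) for r in range(len(E) + 1)
          for A in itertools.combinations(E, r)]   # on K3, every subset is in C0 u C2
P = {A: {} for A in states}
for A in states:
    stay = Fraction(1)
    for e in E:
        q = propose_prob(A, e)
        if q == 0:
            continue
        B = A ^ {e}
        accept = min(Fraction(1),
                     (weight(B) * propose_prob(B, e)) / (weight(A) * q))
        P[A][B] = P[A].get(B, Fraction(0)) + q * accept
        stay -= q * accept
    P[A][A] = P[A].get(A, Fraction(0)) + stay

total = sum(weight(A) for A in states)
pi = {A: weight(A) / total for A in states}
for B in states:  # exact stationarity: (pi P)(B) = pi(B)
    assert sum(pi[A] * P[A].get(B, Fraction(0)) for A in states) == pi[B]
```

Using Fractions makes the stationarity check exact rather than up to floating-point error; the Metropolis-Hastings acceptance ratio includes the proposal probabilities in both directions because the proposal kernel is not symmetric between C_0 and C_2.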

Proof of rapid mixing
◮ We use the path method
◮ Consider the transition graph G = (V, E) of the PS process, with V = C_0 ∪ C_2 and E = { AA′ : P(A, A′) > 0 }
◮ Specify paths γ_{I,F} in G between pairs of states I, F
◮ If no transition is used too often, the process is rapidly mixing
◮ In the most general case, one specifies paths between each pair I, F ∈ C_0 ∪ C_2
◮ For the PS process, it is convenient to specify paths only from C_2 to C_0

Lemma (Jerrum-Sinclair-Vigoda 2004)
Consider a Markov chain with state space Ω and stationary distribution π. Let S ⊂ Ω, and specify paths Γ = { γ_{I,F} : I ∈ S, F ∈ S^c }. Then

    t_mix(δ) ≤ log( 1 / (δ min_{A ∈ Ω} π(A)) ) · [ 2 + 4 ( π(S)/π(S^c) + π(S^c)/π(S) ) ϕ(Γ) ]

where

    ϕ(Γ) := max_{(I,F) ∈ S × S^c} |γ_{I,F}| · max_{AA′ ∈ E} [ 1/(π(A) P(A, A′)) · Σ_{(I,F) ∈ S × S^c : γ_{I,F} ∋ AA′} π(I) π(F) ]

◮ We choose S = C_2.

◮ Elementary to show: 2mx/n ≤ π(C_2)/π(C_0) ≤ n − 1, and π(A) ≥ (x/8)^m for all A

◮ Therefore: t_mix(δ) ≤ ( m log(8/x) − log δ ) · [ 2 + 4 ( n − 1 + n/(2mx) ) ϕ(Γ) ]

Choice of Canonical Paths
◮ To transition from I to F, flip each e ∈ I △ F
◮ If (I, F) ∈ C_2 × C_0 then I △ F ∈ C_2
◮ Decompose I △ F = A_0 ∪ ( ⋃_{i ≥ 1} A_i ), where A_0 is a path and the A_i are disjoint cycles
◮ γ_{I,F} is defined by traversing A_0, then A_1, then A_2, . . .
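The decomposition step can be made concrete. The sketch below (my own illustration, not the talk's code) walks I △ F from one odd vertex, splitting off a cycle A_i whenever the walk revisits a vertex; what remains of the walk is the path A_0, and leftover all-even components are peeled into cycles the same way:

```python
from collections import defaultdict

def decompose(edges):
    """Split an edge set with 0 or 2 odd vertices into (path_edges, cycles).

    Walk from an odd vertex (if any); whenever the walk revisits a vertex,
    the enclosed loop is split off as a cycle. Remaining all-even components
    are then peeled into cycles. Illustration only, not the talk's code.
    """
    adj = defaultdict(set)
    for e in edges:
        u, v = tuple(e)
        adj[u].add(v)
        adj[v].add(u)
    odd = [v for v in adj if len(adj[v]) % 2 == 1]
    assert len(odd) in (0, 2)

    def walk_from(start):
        trail, index, cycles = [start], {start: 0}, []
        while adj[trail[-1]]:
            u = trail[-1]
            v = adj[u].pop()
            adj[v].discard(u)          # consume edge uv
            if v in index:             # closed a loop: split it off as a cycle
                i = index[v]
                loop = trail[i:] + [v]
                cycles.append([frozenset((loop[j], loop[j + 1]))
                               for j in range(len(loop) - 1)])
                for w in trail[i + 1:]:
                    del index[w]
                trail = trail[:i + 1]
            else:
                index[v] = len(trail)
                trail.append(v)
        path = [frozenset((trail[j], trail[j + 1])) for j in range(len(trail) - 1)]
        return path, cycles

    path, cycles = walk_from(odd[0]) if odd else ([], [])
    while True:  # peel cycles in any remaining (all-even) components
        rest = [v for v in adj if adj[v]]
        if not rest:
            break
        extra_path, extra_cycles = walk_from(rest[0])
        assert not extra_path          # even components contain no path part
        cycles += extra_cycles
    return path, cycles

# Example: I delta F is a pendant edge {0,1} plus the triangle {1,2,3}.
D = {frozenset((0, 1)), frozenset((1, 2)), frozenset((2, 3)), frozenset((1, 3))}
A0, cycles = decompose(D)
assert A0 == [frozenset((0, 1))]       # path part, between the two odd vertices
assert len(cycles) == 1                # one 3-cycle on {1, 2, 3}
```

Traversing A_0 first moves the chain's two odd vertices from I's defect pair onto intermediate vertices, and each cycle A_i is then a closed excursion, which is exactly why this ordering yields valid states of C_0 ∪ C_2 along the whole canonical path.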
