  1. Reconfiguration: From statistical physics to graph theory. Nicolas Bousquet. Joint work with Marthe Bonamy, Carl Feghali, Matthew Johnson, Guillem Perarnau. Journées Structures Discrètes, ENS Lyon.

  2. Spin systems. Spin is one of two types of angular momentum in quantum mechanics. [...] In some ways, spin is like a vector quantity; it has a definite magnitude, and it has a "direction". Usually, spins take their values in {+, −}, but sometimes the range is larger...
  A spin configuration is a function σ : S → {1, . . . , k}.
  • Interactions between the spins in S are modeled via an interaction matrix.
  • If the coefficients are 0-1: representation with a graph, where 0 = no interaction = no link and 1 = interaction = link.
  Spin configuration ⇒ (not necessarily proper) graph coloring.
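
To make the representation concrete, here is a minimal sketch (not from the talk; the toy 4-cycle, the spin values and the helper name `is_proper` are all invented for illustration) of a 0-1 interaction matrix stored as an edge list and a spin configuration read as a graph coloring:

```python
# Toy system: 4 sites on a cycle; the edge list encodes the 0-1 interaction
# matrix (a 1 entry = the two sites interact = a link in the graph).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

# A spin configuration sigma : S -> {1, ..., k}, here with k = 3.
sigma = {0: 1, 1: 2, 2: 1, 3: 3}

def is_proper(edges, sigma):
    """True iff no edge is monochromatic, i.e. sigma is a proper coloring."""
    return all(sigma[u] != sigma[v] for u, v in edges)

print(is_proper(edges, sigma))  # True: this configuration happens to be proper
```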

  3. Antiferromagnetic Potts model. H(σ): number of monochromatic edges = edges with both endpoints of the same color. [Figure: example configurations with H(σ) = 0, 2 and 4.]
  Gibbs measure at fixed temperature T (illustrated for T = 5, 1, 0.2, 0.05): ν_T(σ) = exp(−H(σ)/T).
  Important points to notice:
  • We are free to rescale, so ν_T gives a probability distribution P on the colorings.
  • The probability decreases as the number of monochromatic edges increases.
  • As T decreases, P(c) decreases for every c with at least one monochromatic edge.
  Definition (Glauber dynamics). Limit of the k-state Potts model when T → 0 ⇔ only proper colorings have positive measure.
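
As an illustration of these definitions, the following sketch (my own toy example on the same 4-cycle, not the speaker's code; `H` and `gibbs_distribution` are invented names) computes H(σ) and the normalized Gibbs measure by brute force, and shows how the mass of the proper colorings grows as T decreases:

```python
import math
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # same toy 4-cycle as above
n, k = 4, 3

def H(sigma):
    """Number of monochromatic edges of the configuration sigma."""
    return sum(sigma[u] == sigma[v] for u, v in edges)

def gibbs_distribution(T):
    """Gibbs measure nu_T(sigma) proportional to exp(-H(sigma)/T),
    normalized over all k^n configurations (only feasible for tiny graphs)."""
    configs = [dict(enumerate(c)) for c in product(range(1, k + 1), repeat=n)]
    weights = [math.exp(-H(s) / T) for s in configs]
    Z = sum(weights)                        # normalizing constant
    return configs, [w / Z for w in weights]

# As T decreases, configurations with monochromatic edges lose almost all mass.
for T in (5, 1, 0.2, 0.05):
    configs, probs = gibbs_distribution(T)
    mass_proper = sum(p for s, p in zip(configs, probs) if H(s) == 0)
    print(f"T = {T}: probability of a proper coloring = {mass_proper:.3f}")
```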

  4. Sampling spin configurations. In the statistical physics community, the following Markov chain Monte Carlo was proposed to sample a configuration:
  • Start with an initial coloring c;
  • Choose a vertex v at random and a color a;
  • Recolor v with color a if the resulting coloring is proper, otherwise do nothing;
  • Repeat.
  Questions:
  • Can we generate any solution?
  • How much time do we need to "sample a solution almost at random"?
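
A direct transcription of this chain into code might look as follows (a hedged sketch: the toy graph, k = 3 and the helper name `glauber_step` are assumptions, not part of the talk):

```python
import random

neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}   # the 4-cycle
k = 3

def glauber_step(sigma):
    """One step of the chain above: pick a random vertex and a random color,
    recolor only if the result is still a proper coloring."""
    v = random.choice(list(sigma))
    a = random.randint(1, k)
    if all(sigma[u] != a for u in neighbours[v]):   # move keeps the coloring proper
        sigma[v] = a
    # otherwise do nothing
    return sigma

# Start from an initial proper coloring and run the chain.
sigma = {0: 1, 1: 2, 2: 1, 3: 2}
for _ in range(10_000):
    sigma = glauber_step(sigma)
print(sigma)   # close to a uniformly random proper 3-coloring once the chain has mixed
```

The rejected moves (the "otherwise do nothing" case) are what give P(X_{t+1} = X_t) > 0, which is used later to deduce aperiodicity from irreducibility.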

  5. Reconfiguration graph.
  Definition (k-reconfiguration graph C_k(G) of G):
  • Vertices: the proper k-colorings of G.
  • Create an edge between any two k-colorings which differ on exactly one vertex.
  Throughout the talk, k denotes the number of colors.
  Remark 1. Two colorings that are equivalent up to a permutation of the colors are distinct vertices (≠).
  Remark 2. All the k-colorings can be generated ⇔ the k-reconfiguration graph is connected.
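
For tiny graphs the reconfiguration graph can be built explicitly; the sketch below (illustrative only, with an invented `reconfiguration_graph` helper and the same toy 4-cycle) enumerates the proper k-colorings and links those differing on exactly one vertex:

```python
from itertools import product

def reconfiguration_graph(edges, n, k):
    """Build C_k(G): vertices are the proper k-colorings of G (as tuples),
    with an edge between two colorings that differ on exactly one vertex."""
    proper = [c for c in product(range(1, k + 1), repeat=n)
              if all(c[u] != c[v] for u, v in edges)]
    adj = {c: [] for c in proper}
    for c in proper:
        for d in proper:
            if sum(x != y for x, y in zip(c, d)) == 1:
                adj[c].append(d)
    return adj

# Colorings equal up to a color permutation are distinct vertices here.
adj = reconfiguration_graph([(0, 1), (1, 2), (2, 3), (3, 0)], n=4, k=3)
print(len(adj))   # 18 proper 3-colorings of the 4-cycle
```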

  6. Convergence of Markov chains. A Markov chain is irreducible if any solution can be reached from any other ⇔ the reconfiguration graph is connected. A chain is aperiodic if there exists t_0 such that Pr(X_t = a) is positive for every t > t_0 and every state a.
  Theorem. Every ergodic (aperiodic and irreducible) Markov chain converges to a unique stationary distribution.
  In our case:
  • P(X_{t+1} = X_t) > 0 ⇒ irreducibility implies aperiodicity.
  • All the transitions have the same probability ⇒ the stationary distribution is uniform.
  Question: How much time do we need to converge?
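
Since irreducibility is exactly connectedness of the reconfiguration graph, it can be checked by a BFS on small instances. The sketch below rebuilds the toy graph C_3(C_4) from the previous example and tests connectivity (again an illustration under my own assumptions, not the talk's material):

```python
from collections import deque
from itertools import product

# Rebuild C_3(C_4) as in the previous sketch: proper colorings as tuples,
# adjacent iff they differ on exactly one vertex.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n, k = 4, 3
proper = [c for c in product(range(1, k + 1), repeat=n)
          if all(c[u] != c[v] for u, v in edges)]
adj = {c: [d for d in proper if sum(x != y for x, y in zip(c, d)) == 1]
       for c in proper}

def is_connected(adj):
    """BFS over the reconfiguration graph: the Glauber chain is irreducible
    iff this graph is connected."""
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        for d in adj[queue.popleft()]:
            if d not in seen:
                seen.add(d)
                queue.append(d)
    return len(seen) == len(adj)

print(is_connected(adj))   # True here: every 3-coloring of C_4 can be reached
```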

  7. Mixing time. Mixing time = the number of steps needed to be sure that we are "close" to the stationary distribution ⇔ the number of steps needed to guarantee that the solution is sampled "almost" at random.
  A chain is rapidly mixing if its mixing time is polynomial (and even better, O(n log n)).
  Mixing time and the reconfiguration graph?
  • Diameter of the reconfiguration graph = D ⇒ mixing time ≥ 2 · D.
  • Better lower bounds? Look at the connectivity of the reconfiguration graph.
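
The diameter D appearing in the lower bound can likewise be computed by BFS on small instances; here is a sketch reusing the same toy reconfiguration graph (illustrative assumptions as before, `eccentricity` is an invented helper):

```python
from collections import deque
from itertools import product

# Same toy reconfiguration graph C_3(C_4) as in the previous sketches.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n, k = 4, 3
proper = [c for c in product(range(1, k + 1), repeat=n)
          if all(c[u] != c[v] for u, v in edges)]
adj = {c: [d for d in proper if sum(x != y for x, y in zip(c, d)) == 1]
       for c in proper}

def eccentricity(adj, src):
    """Largest BFS distance from src to any other proper coloring."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        c = queue.popleft()
        for d in adj[c]:
            if d not in dist:
                dist[d] = dist[c] + 1
                queue.append(d)
    return max(dist.values())

# Diameter D of the reconfiguration graph: the quantity the slide's lower
# bound on the mixing time is expressed in.
D = max(eccentricity(adj, c) for c in adj)
print(D)
```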
