Democracy and the role of minorities in Markov chain models - PowerPoint PPT Presentation



  1. Democracy and the role of minorities in Markov chain models
Non-reversible perturbations of Markov chain models
Fabio Fagnani, Politecnico di Torino
Joint work with Giacomo Como, Lund University
Lund, 19-10-2012

  2. Perturbation of dynamical networks
The stability of a complex large-scale dynamical network under localized perturbations is one of the paradigmatic problems of recent decades. Key issues:
◮ Correlation: understand how local perturbations affect the overall behavior.
◮ Resilience: find bounds on the perturbation 'size' that the network can tolerate.
◮ Phase transitions.

  3. Perturbation of dynamical networks
State of the art:
◮ Most of the results available in the literature concern connectivity issues.
◮ Analysis of how the perturbation alters the degree distribution of the network.
◮ Degrees are in general not sufficient to study dynamics.
◮ Example: non-reversible Markov chain models.

  4. Perturbation of dynamical networks
What type of perturbations:
◮ Failures of nodes or links in sensor or computer networks; sensors with different technical properties.
◮ Heterogeneity in opinion dynamics models: minorities, leaders exhibiting a different behavior.
◮ A subset of control nodes in the network...
In this talk:
◮ Non-reversible perturbations of Markov chain models.
◮ Applications to consensus dynamics.

  5. Outline
◮ Perturbation of consensus dynamics.
◮ The general setting: perturbation of Markov chain models.
◮ An example: a heterogeneous gossip model.
◮ Results on how the perturbation affects the asymptotics.
◮ Conclusions and open issues.

  6. Consensus dynamics
G = (V, E) connected graph; y_v initial state (opinion) of node v.
Dynamics: y(t + 1) = P y(t), y(0) = y, with P ∈ R^{V×V} a stochastic matrix on G (P_uv > 0 ⇔ (u, v) ∈ E).
Consensus: lim_{t→+∞} (P^t y)_u = π* y for all u (* means transpose),
where π ∈ R^V_+, π* P = π*, Σ_u π_u = 1 (invariant probability).
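
A minimal Python sketch of this statement (not from the slides; it assumes numpy and an arbitrary 5-node connected graph): iterating y(t+1) = P y(t) drives every entry of y to the same value π* y(0).

import numpy as np

# Adjacency matrix of a small connected graph on 5 nodes (assumed toy example).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)        # row-stochastic: P_uv = 1/d_u on edges

# Invariant probability: left eigenvector of P for the eigenvalue 1.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

y0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])    # initial opinions y(0)
y = y0.copy()
for _ in range(500):                        # y(t+1) = P y(t)
    y = P @ y

print(y)          # every entry is (approximately) the same number...
print(pi @ y0)    # ...namely the consensus value pi* y(0) predicted on the slide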

  7. Consensus dynamics
G = (V, E) connected graph.
Example (SRW): P_uv = 1/d_u, with d_u the degree of node u.
Explicit expression for π: π_u = d_u / (2|E|).
π essentially depends on local properties of G. This holds true for general reversible Markov chains.
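
A quick numerical check of the formula π_u = d_u / (2|E|) (again a sketch assuming numpy; the 4-node graph is an arbitrary choice):

import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # adjacency, 4 nodes, |E| = 5
P = A / A.sum(axis=1, keepdims=True)        # SRW: P_uv = 1/d_u
deg = A.sum(axis=1)                         # degrees d_u; deg.sum() = 2|E|

pi = deg / deg.sum()                        # claimed invariant probability d_u/(2|E|)
print(np.allclose(pi @ P, pi))              # True: pi* P = pi*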

  8. Consensus dynamics
G = (V, E) connected graph.
Dynamics: y(t + 1) = P y(t), y(0) = y.
Consensus: lim_{t→+∞} (P^t y)_u = π* y for all u.
Two important parameters:
◮ the invariant probability π, responsible for the asymptotics;
◮ the mixing time τ, responsible for the transient behavior (speed of convergence).

  9. Perturbation of consensus dynamics
G = (V, E) connected graph; perturbed nodes w_1, w_2, w_3 (see figure).
◮ P ∈ R^{V×V} stochastic matrix on G.
◮ Perturb P in a small set of nodes: P̃_uv = P_uv if u ∉ W = {w_1, w_2, w_3}.

  10. Perturbation of consensus dynamics
G = (V, E) connected graph; perturbed nodes w_1, w_2, w_3 (see figure).
◮ P ∈ R^{V×V} stochastic matrix on G.
◮ Perturb P in a small set of nodes: P̃_uv = P_uv if u ∉ W = {w_1, w_2, w_3}.
◮ Cut edges.

  11. Perturbation of consensus dynamics
G = (V, E) connected graph; perturbed nodes w_1, w_2, w_3 (see figure).
◮ P ∈ R^{V×V} stochastic matrix on G.
◮ Perturb P in a small set of nodes: P̃_uv = P_uv if u ∉ W = {w_1, w_2, w_3}.
◮ Cut edges. Add new edges.

  12. A heterogeneous gossip model (Acemoglu et al. 2009)
G = (V, E); W ⊂ V a minority of influential (stubborn) agents.
◮ At each time t choose an edge {u, v} at random.
◮ If u, v ∈ V \ W: y_u(t + 1) = y_v(t + 1) = (y_u(t) + y_v(t)) / 2 (regular interaction).

  13. A heterogeneous gossip model (Acemoglu et al. 2009)
G = (V, E); W ⊂ V a minority of influential (stubborn) agents.
◮ At each time t choose an edge {u, v} at random.
◮ If u ∈ W, v ∈ V \ W:
  ◮ y_u(t + 1) = y_u(t), y_v(t + 1) = ε y_v(t) + (1 − ε) y_u(t) with probability p (forceful interaction);
  ◮ y_u(t + 1) = y_v(t + 1) = (y_u(t) + y_v(t)) / 2 with probability 1 − p (regular interaction).

  14. A heterogeneous gossip model (Acemoglu et al. 2009)
G = (V, E); W ⊂ V a minority of influential (stubborn) agents.
◮ At each time t choose an edge {u, v} at random.
◮ If u, v ∈ W, nothing happens.
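
The following Python sketch simulates the dynamics of slides 12-14 (it is not the authors' code; the random edge list, the values of N, p, ε and the choice W = {0, 1} are assumptions made only for illustration):

import numpy as np

rng = np.random.default_rng(0)
N, p, eps, T = 50, 0.5, 0.8, 200_000
W = {0, 1}                                   # assumed minority of forceful agents

# An assumed random graph, given as an edge list (almost surely connected here).
edges = [(u, v) for u in range(N) for v in range(u + 1, N) if rng.random() < 0.2]

y = rng.random(N)                            # initial opinions y(0)
for _ in range(T):
    u, v = edges[rng.integers(len(edges))]   # pick an edge uniformly at random
    if u in W and v in W:
        continue                             # both endpoints in W: nothing happens
    if v in W:                               # orient the pair so u is the forceful one
        u, v = v, u
    if u in W and rng.random() < p:          # forceful interaction
        y[v] = eps * y[v] + (1 - eps) * y[u] # y_u stays unchanged
    else:                                    # regular interaction
        y[u] = y[v] = (y[u] + y[v]) / 2

print(y.std())     # small: the opinions are close to a consensus
print(y.mean())    # ...but not, in general, the average of the initial opinions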

  15. A heterogeneous gossip model
◮ y(t + 1) = P(t) y(t).
◮ y(t) converges to a consensus almost surely if p ∈ [0, 1). But what type of consensus?
◮ If no forceful interaction is present (p = 0), y_u(t) → N^{-1} Σ_v y_v(0) for every u.
◮ E(P(t)) = P_p.
◮ P_p and P_0 only differ in the rows with index in W ∪ ∂W.

  16. Perturbation of Markov chain models
The abstract setting:
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
◮ Study ||π − π̃||_TV := (1/2) Σ_v |π_v − π̃_v| (as a function of N).
Notice that |π̃* y − π* y| ≤ ||π − π̃||_TV ||y||_∞.
The ideal result: π(W) → 0 ⇒ ||π̃ − π||_TV → 0.

  17. A counterexample
Simple random walk on the n-cycle with nodes 1, 2, ..., n (see figure):
P_{u,u+1} = P_{u,u−1} = 1/2, π uniform.

  18. A counterexample
Simple random walk on the n-cycle: P_{u,u+1} = P_{u,u−1} = 1/2, π uniform.
Perturbation: P̃_{1,2} = 1, P̃_{1,n} = 0. Then
π̃_1 = 1/n, π̃_j = 2(n − j + 1)/n² for j ≥ 2,
so ||π − π̃||_TV ≍ const.
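
A numerical check of the counterexample (a sketch assuming numpy; the helper stationary() and the values of n are illustrative choices): the perturbed stationary distribution matches the formula above and the total variation distance stays of constant order.

import numpy as np

def stationary(P):
    """Left eigenvector of a stochastic matrix for eigenvalue 1, normalised to sum 1."""
    w, V = np.linalg.eig(P.T)
    v = np.real(V[:, np.argmin(np.abs(w - 1))])
    return v / v.sum()

for n in (20, 100, 500):
    P = np.zeros((n, n))
    for u in range(n):                        # symmetric walk on the n-cycle
        P[u, (u + 1) % n] = P[u, (u - 1) % n] = 0.5
    Pt = P.copy()                             # perturb only the row of node 1
    Pt[0, 1], Pt[0, n - 1] = 1.0, 0.0         # node "1" of the slide is index 0 here

    pi, pit = np.ones(n) / n, stationary(Pt)
    predicted = np.array([1 / n] + [2 * (n - j + 1) / n**2 for j in range(2, n + 1)])
    tv = 0.5 * np.abs(pi - pit).sum()         # ||pi - pi~||_TV
    print(n, np.allclose(pit, predicted), round(tv, 3))   # TV stays near 1/4, it does not vanish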

  19. Perturbation of Markov chain models
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
If the chain mixes slowly, the process will pass many times through the perturbed set W before reaching equilibrium, so π̃ will be largely influenced by the perturbed part.
Consequence: ||π − π̃||_TV ↛ 0.

  20. Perturbation of Markov chain models
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
A more realistic result: P mixes sufficiently fast, π(W) → 0 ⇒ π̃ − π → 0.
Recall: mixing time τ := min{ t : ||μ* P^t − π*||_TV ≤ 1/e ∀ μ }.
SRW on the d-dimensional grid with N nodes: τ ≍ N^{2/d} ln N.
SRW on Erdős–Rényi, small-world, configuration-model graphs: τ ≍ ln N.
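
A sketch of how τ can be computed numerically from this definition (assuming numpy; the lazy walk on the cycle is an assumed toy example, with the laziness added only to avoid periodicity). Since the maximum over μ is attained at point masses, it suffices to check the rows of P^t:

import numpy as np

def mixing_time(P, pi, t_max=100_000):
    Pt = np.eye(len(P))
    for t in range(1, t_max + 1):
        Pt = Pt @ P                                         # Pt = P^t
        if 0.5 * np.abs(Pt - pi).sum(axis=1).max() <= 1 / np.e:
            return t                                        # worst row within TV 1/e of pi
    return None

for n in (20, 40, 80):
    P = np.zeros((n, n))
    for u in range(n):                                      # lazy SRW on the n-cycle
        P[u, u] = 0.5
        P[u, (u + 1) % n] = P[u, (u - 1) % n] = 0.25
    pi = np.ones(n) / n
    print(n, mixing_time(P, pi))                            # grows on the order of n^2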

  21. Perturbation of Markov chain models: the literature
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
||π̃ − π||_TV ≤ C τ ||P̃ − P||_1   (Mitrophanov, 2003)
The 1-norm is not suited to measuring localized perturbations of P: if P and P̃ differ in just one row u, with |P_uv − P̃_uv| = δ, then ||P − P̃||_1 ≥ δ, which does not go to 0 as N → ∞. In our context, the bound always blows up.

  22. Perturbation of Markov chain models
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
A more realistic result: P mixes sufficiently fast, π(W) → 0 ⇒ π̃ − π → 0.
There is another problem: even if P mixes rapidly, nothing guarantees that P̃ will do the same...

  23. Perturbation of Markov chain models: first result
◮ G = (V, E) family of connected graphs, N = |V| → +∞.
◮ W ⊆ V perturbation set.
◮ P, P̃ stochastic matrices on G, with P̃_uv = P_uv if u ∉ W.
◮ π* P = π*, π̃* P̃ = π̃*.
Theorem.
||π̃ − π||_TV ≤ τ π̃(W) log( e² / (τ π̃(W)) )   (1)
or, symmetrically,
||π̃ − π||_TV ≤ τ̃ π(W) log( e² / (τ̃ π(W)) )   (2)
Proof: coupling technique.

  24. Perturbation of Markov chain models: first result
Corollary.
τ π(W) → 0, τ̃ = O(τ) ⇒ ||π̃ − π||_TV → 0
τ π(W) → 0, π̃(W) = O(π(W)) ⇒ ||π̃ − π||_TV → 0
The perturbation, in order to modify the invariant probability, necessarily has to
◮ slow down the chain;
◮ increase the probability of the perturbation subset W.
π and τ are intimately connected to each other!
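
Both implications rely on the elementary fact that the right-hand side x log(e²/x) vanishes as x → 0+ (with x standing for τ π̃(W) or τ̃ π(W)); a quick illustration in Python:

import numpy as np

for x in (1e-1, 1e-2, 1e-4, 1e-8):
    print(x, x * np.log(np.e**2 / x))   # -> 0, although more slowly than x itself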

  25. Perturbation of Markov chain models: first result
Slowing down the chain and putting weight on W look quite connected to each other, and essentially amount to decreasing the probability of exiting W.
Example: W = {w}, with P̃_ww = 1 − 1/N (the remaining entries of row w rescaled accordingly).
1/π̃_w = Ẽ_w(T_w^+) = (1 − N^{-1}) + N^{-1} E_w(T_w^+) = (1 − N^{-1}) + N^{-1}/π_w
π_w ∼ k/N ⇒ π̃_w ∼ k/(k + 1)
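
A numerical check of this computation (a sketch assuming numpy; the cycle graph and the way the rest of row w is rescaled are illustrative assumptions): on the N-cycle π_w = 1/N, i.e. k = 1, so π̃_w should approach k/(k + 1) = 1/2.

import numpy as np

def stationary(P):
    w, V = np.linalg.eig(P.T)
    v = np.real(V[:, np.argmin(np.abs(w - 1))])
    return v / v.sum()

for N in (10, 100, 500):
    P = np.zeros((N, N))
    for u in range(N):                   # SRW on the N-cycle: pi_w = 1/N
        P[u, (u + 1) % N] = P[u, (u - 1) % N] = 0.5
    Pt = P.copy()
    Pt[0] = P[0] / N                     # keep the old exit probabilities, scaled by 1/N...
    Pt[0, 0] = 1 - 1 / N                 # ...and put the remaining mass on the self-loop
    print(N, stationary(Pt)[0])          # -> 1/2  (here exactly N / (2N - 1))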

  26. A deeper analysis
Lemma.
π̃(W) ≤ 1 / (1 + φ̃*_W τ*_W),
where
τ*_W := min{ E_v[T_W] : v ∈ V \ W }   (minimum entrance time to W; it depends on P),
φ̃*_W := ( Σ_{w ∈ W} Σ_{v ∈ V \ W} π̃_w P̃_wv ) / π̃(W)   (bottleneck ratio of W; it depends on P̃).
Proof. From Kac's lemma,
π̃(W)^{-1} = Ẽ_{π̃_W}[T_W^+] = 1 + Σ_w Σ_v (π̃_w / π̃(W)) P̃_wv E_v[T_W] ≥ 1 + φ̃*_W τ*_W.
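
A numerical check of the Lemma on the counterexample chain of slide 18 (a sketch assuming numpy; W = {1} and n = 30 are choices made only for illustration). Here φ̃*_W = 1, since node 1 exits W in one step, and τ*_W = n − 1, so the bound 1/(1 + φ̃*_W τ*_W) = 1/n is attained exactly.

import numpy as np

n = 30
P = np.zeros((n, n))
for u in range(n):                           # symmetric walk on the n-cycle
    P[u, (u + 1) % n] = P[u, (u - 1) % n] = 0.5
Pt = P.copy()
Pt[0, 1], Pt[0, n - 1] = 1.0, 0.0            # perturbation; W = {node 0}
W = [0]
rest = [v for v in range(n) if v not in W]

# pi~ : stationary distribution of the perturbed chain
w, V = np.linalg.eig(Pt.T)
pit = np.real(V[:, np.argmin(np.abs(w - 1))]); pit /= pit.sum()

# tau*_W : minimum expected hitting time of W under P, via (I - Q) h = 1
Q = P[np.ix_(rest, rest)]
h = np.linalg.solve(np.eye(len(rest)) - Q, np.ones(len(rest)))
tau_star = h.min()

# phi~*_W : bottleneck ratio of W under the perturbed chain
phi_tilde = (pit[W][:, None] * Pt[np.ix_(W, rest)]).sum() / pit[W].sum()

print(pit[W].sum(), 1 / (1 + phi_tilde * tau_star))   # both equal 1/n (up to numerical error)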

  27. A deeper analysis
◮ Bottleneck ratio ←→ exit probability from W:
P̃_w(T_{V\W} ≤ d) ≥ α for all w ∈ W ⇒ φ̃*_W ≥ α/d.
If P̃_w(T_{V\W} ≤ d) ≥ α for fixed d, α > 0 and for every w ∈ W, then
π̃(W) ≤ 1 / (1 + φ̃*_W τ*_W) ≍ (τ*_W)^{-1},   so τ π̃(W) = O(τ / τ*_W).
Hence, τ / τ*_W → 0 ⇒ ||π̃ − π||_TV → 0.
