
Markov Chains, Mixing Time, and Gerrymandering - PowerPoint PPT Presentation



  1. Markov Chains, Mixing Time, and Gerrymandering

  2. What is a Markov chain? Definition (Event Space) An event space Ω is a collection of events Σ. Definition (Random Variable) A random variable X is a mapping X : Ω → R, where X(ω ∈ Ω) represents the value of the outcome ω. Example Say we have a fair coin c. Then, we can define X(ω) = 0 if ω = heads, and X(ω) = 1 if ω = tails.
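The coin example can be sketched in a few lines of Python; the function name `X` mirrors the slide's notation:

```python
import random

# Event space Omega for a fair coin: the two possible outcomes.
omega = ["heads", "tails"]

def X(outcome):
    """The random variable X : Omega -> R from the slide."""
    return 0 if outcome == "heads" else 1

# Draw one outcome uniformly at random and evaluate X on it.
sample = random.choice(omega)
print(sample, X(sample))
```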

  3. What is a Markov chain? Example (Weather)

  4. What is a Markov chain? Definition (Markov property) P(X_{n+1} = σ | X_1 = σ_1, . . . , X_n = σ_n) = P(X_{n+1} = σ | X_n = σ_n) Definition (Markov chain) Suppose we have some random process R = (X_0, X_1, . . . , X_n). Then, a Markov chain is R equipped with the Markov property.

  5. What is a Markov chain? Definition (Transition Matrix) Given some Markov chain M = (X_0, X_1, . . . , X_n), its transition matrix P can be defined as P_{i,j} = P(X_{n+1} = j | X_n = i) Definition (Reversibility) A Markov chain M = (X_0, X_1, . . . , X_n) is considered reversible with respect to a probability distribution π if, for all states i and j, the following holds: π_i · P(X_{n+1} = j | X_n = i) = π_j · P(X_{n+1} = i | X_n = j)
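The reversibility condition (detailed balance) is easy to check numerically. A minimal sketch, where the two-state matrix and distribution are illustrative choices of mine, not from the talk:

```python
def is_reversible(P, pi, tol=1e-12):
    """Check detailed balance: pi_i * P[i][j] == pi_j * P[j][i] for all i, j."""
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) <= tol
               for i in range(n) for j in range(n))

# A symmetric transition matrix is reversible w.r.t. the uniform distribution.
P = [[0.75, 0.25],
     [0.25, 0.75]]
pi = [0.5, 0.5]
print(is_reversible(P, pi))  # prints True
```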

  6. What is a Markov chain? Definition (Stationary Distribution) A Markov chain M has reached a stationary distribution π if, for transition matrix P , π = π · P
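The fixed-point condition π = π · P can be verified directly; the two-state chain below is an illustrative example of mine, not one from the slides:

```python
P = [[0.6, 0.4],
     [0.2, 0.8]]
pi = [1/3, 2/3]   # claimed stationary distribution for this chain

# Compute pi * P (row vector times matrix) and compare with pi.
new_pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print(new_pi)  # equals pi (up to floating-point error), so pi is stationary
```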

  7. What is a Markov chain? Example (Weather) P = [ 0.9 0.1 ; 0.5 0.5 ]
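Iterating any starting distribution through the weather chain's transition matrix converges to its stationary distribution, which for this matrix works out to π = (5/6, 1/6):

```python
P = [[0.9, 0.1],
     [0.5, 0.5]]   # the weather chain's transition matrix

pi = [0.5, 0.5]    # an arbitrary starting distribution
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

print(pi)  # converges to (5/6, 1/6), i.e. about (0.833, 0.167)
```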

  8. Pegden et al. Theorem (1.1) Let M = (X_0, X_1, . . . ) be a reversible Markov chain with stationary distribution π, and suppose the states of M have real-valued labels. If X_0 ∼ π, then for any fixed k, the probability that the label of X_0 is an ε-outlier from among the list of labels observed in the trajectory X_0, X_1, X_2, . . . , X_k is, at most, √(2ε).

  9. Pegden et al. A bit of clarity... Assume that M has a stationary distribution π, and that the starting state X_0 is drawn from it, denoted X_0 ∼ π. Then, pick some ε. If we have a labeling function ω : Ω → R, such that each state in M has a real-valued label, then the probability that X_0's label ω(X_0) is among the ⌈ε(k + 1)⌉ most extreme of the labels ω(X_0), ω(X_1), . . . , ω(X_k) is at most √(2ε).

  10. Pegden et al. A few conclusions arise: (1) If ω(X_0) is an ε-outlier, then the hypothesis that X_0 was drawn from the stationary distribution π can be rejected with p ≤ √(2ε). (2) Given (1), consider Ω to be a set of districting plans. Then, define some scoring metric for these plans, called R : Ω → R, which acts as our labeling function. Next, consider some districting plan D = X_0. If R(D) is an outlier among the plans visited by the chain, then D is unlikely to be a typical plan drawn from π. As such, D may be considered a "gerrymander."

  11. Conclusion References: (1) Assessing significance in a Markov chain without mixing, Pegden et al. (2) VRDI Intro. (3) Finite Markov Chains and Algorithmic Applications, Olle Häggström (London Mathematical Society).
