1. Evolutionary Graph Theory. J. Díaz, LSI-UPC. Nice, May 2014.

2. Population Genetics Models. Population genetics models describe the forces that produce and maintain genetic variation within a population. Mutation is the process by which an individual (gene) changes. We want to study the drift of the population: how the frequency of mutants in the total population evolves over time. The Moran process (P. Moran, Random processes in genetics, Proc. Cambridge Philos. Soc., 1958): • Start with n individuals and randomly select one to mutate. • At each step, select an individual x uniformly at random to replicate, • select another individual y uniformly at random to die, • and replace y by a clone of x. This defines a stochastic process in which, at each time step t, the number of mutants changes by −1, 0 or +1.
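A minimal simulation sketch of this well-mixed process (not the authors' code; the function name and the neutral-drift setting, with no fitness advantage, are illustrative):

```python
import random

def moran_well_mixed(n, seed=None):
    """Neutral Moran process on a well-mixed population of n individuals.

    Starts with one mutant (True) among n-1 non-mutants (False) and returns
    True if the mutant lineage fixates, False if it goes extinct.
    """
    rng = random.Random(seed)
    pop = [False] * n
    pop[rng.randrange(n)] = True          # one randomly chosen individual mutates
    while 0 < sum(pop) < n:
        x, y = rng.sample(range(n), 2)    # x replicates, another individual y dies
        pop[y] = pop[x]                   # y is replaced by a clone of x
    return all(pop)

# A single neutral mutant fixates with probability 1/n (about 0.05 for n = 20).
runs, n = 10_000, 20
print(sum(moran_well_mixed(n) for _ in range(runs)) / runs)
```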

3. Evolutionary Graph Theory (EGT). Lieberman, Hauert, Nowak: Evolutionary dynamics on graphs, Nature 2005 (LHN). EGT studies how the topology of interactions within the population affects evolution. The vertices of the graph are of two types: mutants and non-mutants. The fitness r of an individual denotes its reproductive rate: mutants have fitness r ∈ Θ(1), non-mutants have fitness 1. Mutants and non-mutants spread by replacing one of their neighbours with a clone of themselves.

4. Moran process on evolutionary graphs. Given a graph G = (V, E) with |V| = n and a fitness r > 0, we start with all vertices non-mutant. • At t = 0, choose a vertex of V uniformly at random and make it a mutant. At any time t > 0, assume there are k mutant and (n − k) non-mutant vertices, and define the total fitness at time t by W_t = kr + (n − k). Then: • choose u with probability r/W_t if u is a mutant and 1/W_t otherwise, • choose v uniformly at random from N(u), and replace v with a clone of u. The process is Markovian and, depending on r, it tends to one of two absorbing states: extinction or fixation.
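A minimal sketch of one run of this process on a graph (assuming the graph is given as an adjacency list; names are illustrative, not from the talk). Averaging the returned indicator over many runs gives the Monte Carlo estimate of f_G(r) used on slide 22:

```python
import random

def moran_on_graph(adj, r, seed=None):
    """One run of the Moran process on a graph with fitness r.

    adj: adjacency list, adj[u] = list of neighbours of u (vertices 0..n-1).
    Returns True on fixation (all mutants), False on extinction.
    """
    rng = random.Random(seed)
    n = len(adj)
    mutant = [False] * n
    mutant[rng.randrange(n)] = True              # initial mutant, uniformly at random
    k = 1
    while 0 < k < n:
        # Pick u with probability proportional to its fitness (r if mutant, 1 otherwise).
        weights = [r if mutant[u] else 1.0 for u in range(n)]
        u = rng.choices(range(n), weights=weights)[0]
        v = rng.choice(adj[u])                   # uniform neighbour of u
        if mutant[v] != mutant[u]:
            k += 1 if mutant[u] else -1
        mutant[v] = mutant[u]                    # v becomes a clone of u
    return k == n
```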

5. Example of the Moran process. [Figure: a small four-vertex example graph with vertices a, b, c, d, annotated with the transition probabilities p and q of adding or removing a mutant in each configuration.]

6. Moran Process. This random process defines a discrete-time Markov chain on the states {0, 1, …, n − 1, n} (the number of mutants), with two absorbing states: n (fixation, all mutants) and 0 (extinction, all non-mutants). [Figure: a birth-death chain on states 0, 1, 2, 3, 4 with forward probabilities p_i, backward probabilities q_i and self-loop probabilities s_i; states 0 and 4 are absorbing.] The fixation probability f_G(r) of G is the probability that a single mutant takes over the whole of G. The extinction probability of G is 1 − f_G(r).
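For such a birth-death chain, the probability of reaching state n from state 1 before state 0 has a standard closed form (a classical fact, added here for reference rather than taken from the slide):

```latex
% Absorption at n, starting from a single mutant, with forward/backward
% probabilities p_k, q_k (0 < k < n); the self-loop probabilities s_k only
% slow the chain down and do not affect where it absorbs.
\[
  f \;=\; \Bigl( 1 + \sum_{k=1}^{n-1} \prod_{j=1}^{k} \frac{q_j}{p_j} \Bigr)^{-1}.
\]
```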

7. The Markov chain of configurations. A configuration is a set S ⊆ V of mutants. [Figure: the configuration graph for a four-vertex example on vertices a, b, c, d, whose 2⁴ = 16 states range from ∅ to V, grouped by the number of mutants 0, 1, 2, 3, 4.]

8. Properties of f_G(r). Given a connected G = (V, E) and a fitness r > 0, for any S ⊆ V let f_{G,r}(S) denote the fixation probability when starting with the set S of mutants. Notice that f_G(r) = (1/n) Σ_{v∈V} f_{G,r}({v}). The case r = 1 is called neutral drift. (Shakarian, Roos, Johnson, Biosystems 2012) For any r ≥ 1, f_G(r) ≥ f_G(1). (Díaz, Goldberg, Mertzios, Richerby, Serna, Spirakis, SODA 2012 (DGMRSS)) For any undirected G = (V, E), f_G(1) = 1/n.

9. Bounding f_G(r). Let G = (V, E) be any undirected connected graph with |V| = n. (DGMRSS) For any r ≥ 1, 1/n ≤ f_G(r) ≤ 1 − 1/(n + r). (Mertzios, Spirakis, arXiv 2014) For any ε > 0, f_G(r) ≤ 1 − 1/n^{3/4 + ε}. Open problem: no upper bound independent of n is known. Conjecture: f_G(r) ≤ 1 − 1/r.

10. Questions to study. Given a connected graph G = (V, E) (strongly connected in the case of digraphs) and a fitness r: 1. Is it possible to compute the fixation probability f_G(r) exactly? This is difficult for general graphs: the number of variables and constraints in the natural linear system equals the number of possible mutant/non-mutant configurations of G, i.e. 2^n (see the sketch below). 2. Given G, is it possible to compute the expected number of steps until absorption?
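A sketch of the exact computation for very small graphs, with one linear equation per configuration S (the unknown x_S is the fixation probability when starting from S); the function name and the dense solver are illustrative, and the 2^n × 2^n system limits this to roughly n ≤ 12:

```python
import itertools
import numpy as np

def exact_fixation_probability(adj, r):
    """Exact f_G(r) for a small graph by solving the 2^n-state linear system.

    adj: adjacency list of a connected graph on vertices 0..n-1.
    """
    n = len(adj)
    states = list(itertools.product([0, 1], repeat=n))   # all mutant/non-mutant configurations
    index = {s: i for i, s in enumerate(states)}
    A = np.zeros((len(states), len(states)))
    b = np.zeros(len(states))
    for s in states:
        i, k = index[s], sum(s)
        if k == 0 or k == n:                 # absorbing: x = 0 (extinction) or x = 1 (fixation)
            A[i, i], b[i] = 1.0, float(k == n)
            continue
        A[i, i] = 1.0                        # equation: x_S - sum_S' P(S -> S') x_S' = 0
        W = k * r + (n - k)
        for u in range(n):
            w_u = (r if s[u] else 1.0) / W   # probability that u is chosen to replicate
            for v in adj[u]:
                t = list(s)
                t[v] = s[u]                  # v becomes a clone of u
                A[i, index[tuple(t)]] -= w_u / len(adj[u])
    x = np.linalg.solve(A, b)
    # f_G(r): average over the n starting configurations with a single mutant.
    singles = [index[tuple(1 if v == u else 0 for v in range(n))] for u in range(n)]
    return float(np.mean(x[singles]))
```

On K_3 with r = 2, for instance, this returns (1 − 1/2)/(1 − 1/2³) ≈ 0.571, as the isothermal theorem on slide 12 predicts.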

11. Isothermal graphs (LHN). Given a directed graph G = (V, E), for every i ∈ V let deg⁺(i) be its out-degree. Define the stochastic matrix W = [w_ij], where w_ij = 1/deg⁺(i) if (i, j) ∈ E and w_ij = 0 otherwise. The same definition applies to an undirected G, with w_ij = 1/deg(i). The temperature of i ∈ V is T_i = Σ_{j∈V} w_ji. A graph G is isothermal if T_i = T_j for all i, j ∈ V. Example on vertices a, b, c, d:
W =
  0    1    0    0
  1/3  0    1/3  1/3
  0    1/2  0    1/2
  1/2  1/2  0    0
Here T_b = 1 + 1/2 + 1/2 = 2 and T_c = 1/3, so this graph is not isothermal.
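A small sketch that builds W from an adjacency list, computes the temperatures as column sums and checks the isothermal condition (names are illustrative):

```python
import numpy as np

def temperatures(adj):
    """Row-stochastic matrix W and temperatures T_i = sum_j w_ji for a digraph.

    adj: adjacency list, adj[i] = list of out-neighbours of i.
    """
    n = len(adj)
    W = np.zeros((n, n))
    for i, out in enumerate(adj):
        for j in out:
            W[i, j] = 1.0 / len(out)       # w_ij = 1 / deg+(i)
    return W, W.sum(axis=0)                # temperatures are the column sums

def is_isothermal(adj, tol=1e-12):
    _, T = temperatures(adj)
    return bool(np.ptp(T) <= tol)          # all temperatures (essentially) equal

# The four-vertex example from the slide (vertices a=0, b=1, c=2, d=3):
adj = [[1], [0, 2, 3], [1, 3], [0, 1]]
W, T = temperatures(adj)
print(T)                   # [0.833..., 2.0, 0.333..., 0.833...] -> T_b = 2, T_c = 1/3
print(is_isothermal(adj))  # False
```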

12. Computing the fixation probability. If G is a digraph with a single source, then f_G(r) = 1/n. [Figure: example digraphs with a single source.] Isothermal Theorem (LHN). For a strongly connected graph G such that T_i = T_j for all i, j ∈ V (i.e. W is bi-stochastic),
f_G(r) = (1 − 1/r) / (1 − 1/r^n) ≡ ρ.
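A sketch of why the theorem holds (standard reasoning, not copied from the talk): when W is bi-stochastic, the total w-weight of edges from mutants to non-mutants equals the weight in the opposite direction, so in each effective step the number of mutants increases with probability proportional to r and decreases with probability proportional to 1, i.e. q_k/p_k = 1/r for every k. Plugging this into the birth-death formula after slide 6 gives:

```latex
\[
  f_{G}(r) \;=\; \Bigl( 1 + \sum_{k=1}^{n-1} r^{-k} \Bigr)^{-1}
           \;=\; \Bigl( \sum_{k=0}^{n-1} r^{-k} \Bigr)^{-1}
           \;=\; \frac{1 - 1/r}{1 - 1/r^{\,n}} \;=\; \rho .
\]
```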

13. Undirected graphs. The isothermal theorem also applies to undirected graphs: for G undirected and connected, G is Δ-regular iff W is bi-stochastic. Hence, if G is undirected and connected, then f_G(r) = ρ = (1 − 1/r)/(1 − 1/r^n) iff G is Δ-regular. For example, if G is C_n or K_n then f_G(r) = ρ. Notice: • if r > 1, then lim_{n→∞} f_G(r) = 1 − 1/r; • if r < 1, then f_G(r) = (r^n − r^{n−1})/(r^n − 1) ≈ r^{n−1}, which is exponentially small.

14. Amplifiers and suppressors. Given G (directed or undirected) and r, G is said to be an amplifier if f_G(r) > ρ, and a suppressor if f_G(r) < ρ. The star (LHN; Broom, Rychtář, Proc. R. Soc. A 2008): for r > 1, f_G(r) = (1 − 1/r²)/(1 − 1/r^{2n}) > ρ. The star is an amplifier.

15. Suppressors. The directed line and the burst have fixation probability 1/n < ρ, so they are examples of suppressors. What about undirected suppressors? (Mertzios, Nikoletseas, Raptopoulos, Spirakis, TCS 2013) The urchin: an n-clique together with n pendant vertices, one attached to each clique vertex. For 1 < r < 4/3, lim_{n→∞} f_G(r) = (1/2)(1 − 1/r) < ρ. The urchin is an undirected suppressor.

16. Absorption time for undirected graphs. Given an undirected connected G = (V, E) with |V| = n, run a Moran process {S_t}_{t≥0}, where S_t is the set of mutants at time t. Define the absorption time τ = min{t | S_t = ∅ ∨ S_t = V}. Theorem (DGMRSS). For the Moran process {S_t} on an undirected G, starting with |S_1| = 1:
1. if r < 1, then E[τ] ≤ (1/(1 − r)) n³;
2. if r > 1, then E[τ] ≤ (r/(r − 1)) n⁴;
3. if r = 1, then E[τ] ≤ n⁶.
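A small variant of the simulation sketch from slide 4 that records τ for one run; averaging it over many runs gives an empirical check of these polynomial bounds on small graphs (illustrative, not the authors' code):

```python
import random

def absorption_time(adj, r, seed=None):
    """Number of steps of one Moran run on a graph until absorption."""
    rng = random.Random(seed)
    n = len(adj)
    mutant = [False] * n
    mutant[rng.randrange(n)] = True
    k, steps = 1, 0
    while 0 < k < n:
        weights = [r if mutant[u] else 1.0 for u in range(n)]
        u = rng.choices(range(n), weights=weights)[0]
        v = rng.choice(adj[u])
        if mutant[v] != mutant[u]:
            k += 1 if mutant[u] else -1
        mutant[v] = mutant[u]
        steps += 1
    return steps
```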

17. Sketch of the proof. We bound E[τ] using a potential function that decreases in expectation until absorption. Define the potential function φ(S) = Σ_{v∈S} 1/deg(v). Notice that φ({v}) ≥ 1/n and 0 ≤ φ(S) ≤ φ(V) ≤ n. We use the following result on Markov chains (Hajek, Adv. Appl. Prob. 1983): if {X_t}_{t≥0} is a Markov chain with state space Ω and there exist constants k₁, k₂ > 0 and a function φ: Ω → R⁺ ∪ {0} such that (1) φ(S) = 0 for some S ∈ Ω, (2) φ(S) ≤ k₁ for all S ∈ Ω, and (3) E[φ(X_t) − φ(X_{t+1}) | X_t = S] ≥ k₂ for all S with φ(S) > 0, then E[τ] ≤ k₁/k₂, where τ = min{t | φ(X_t) = 0}.

18. Sketch of the proof. Compute the expected one-step change E[φ(S_{t+1}) − φ(S_t)]. To show that the potential decreases (respectively increases) monotonically in expectation for r < 1 (respectively r > 1), consider the contribution of each edge (u, v) in the cut between S_t and V \ S_t to the transitions S_{t+1} = S_t ∪ {v} and S_{t+1} = S_t \ {v}.
1. For r < 1, E[φ(S_{t+1}) − φ(S_t) | S_t] < (r − 1)/n³ < 0.
2. For r > 1, E[φ(S_{t+1}) − φ(S_t) | S_t] ≥ (1 − 1/r)(1/n³).
3. For r = 1, E[φ(S_{t+1}) − φ(S_t) | S_t] = 0.
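A short sketch that computes this one-step expected drift exactly for a given mutant set S, by summing the contributions of the cut edges (names are illustrative; the transition probabilities are those of slide 4):

```python
def expected_potential_drift(adj, S, r):
    """Exact E[phi(S_{t+1}) - phi(S_t) | S_t = S] for the Moran process on a graph.

    adj: adjacency list of an undirected graph; S: set of mutant vertices.
    Only cut edges change phi: a mutant u replicating onto v not in S adds
    1/deg(v); a non-mutant u replicating onto v in S removes 1/deg(v).
    """
    n = len(adj)
    k = len(S)
    W = k * r + (n - k)                      # total fitness
    drift = 0.0
    for u in range(n):
        w_u = (r if u in S else 1.0) / W     # probability that u is chosen to replicate
        for v in adj[u]:
            if (u in S) and (v not in S):
                drift += w_u / len(adj[u]) * (1.0 / len(adj[v]))
            elif (u not in S) and (v in S):
                drift -= w_u / len(adj[u]) * (1.0 / len(adj[v]))
    return drift
```

For r = 1 the positive and negative contributions of each cut edge cancel exactly, which gives the third case above.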

19. Domination argument for r < 1. For any fixed initial S ⊂ V, let {Y_i}_{i≥0} be a stochastic process defined as the Moran process, except that if it reaches the state V it chooses v ∈ V uniformly at random and moves to the state V \ {v}. Let τ′ = min{i | Y_i = ∅}. Then
E[τ | X_0 = S] ≤ E[τ′ | Y_0 = S] ≤ (1/(1 − r)) n³ φ(S),
and since φ(S) ≤ 1 for a single initial mutant, E[τ] ≤ (1/(1 − r)) n³.

20. Domination argument for r > 1. For any fixed initial S ⊂ V, define a process {Y_i}_{i≥0} as the Moran process, except that if it reaches the state ∅ it chooses v ∈ V uniformly at random and moves to the state {v}. Let τ′ = min{i | Y_i = V}. Then
E[τ | X_0 = S] ≤ E[τ′ | Y_0 = S] ≤ (r/(r − 1)) n³ (φ(V) − φ(S)),
and since φ(V) ≤ n, E[τ] ≤ (r/(r − 1)) n⁴.

21. Proof for r = 1. For an undirected G = (V, E) with r = 1, E[τ] ≤ φ(V)² n⁴ ≤ n⁶. In this case E[φ(S_t) − φ(S_{t−1})] = 0, so we use a martingale argument instead. At each step t, the probability that φ changes is at least 1/n², and when it changes the change has magnitude at least 1/n. We dominate the process by a process Z_t (a function of φ_t) that increases in expectation until the stopping time at which the process absorbs. Then E[Z_τ] ≥ E[Z_0], and this yields a bound on E[τ].

22. Approximating f_G(r). An FPRAS for a function f is a randomized algorithm A such that, given 0 < ε < 1, for any input x, Pr[(1 − ε) f(x) ≤ A(x) ≤ (1 + ε) f(x)] ≥ 3/4, with running time ≤ poly(|x|, 1/ε). Corollary to the absorption-time bounds: • there is an FPRAS for computing the fixation probability, for any fixed r ≥ 1; • there is an FPRAS for computing the extinction probability, for any fixed r < 1.
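The FPRAS is essentially Monte Carlo simulation: the absorption-time bounds ensure that each run can be truncated after polynomially many steps, and the 1/n lower bound on f_G(r) for r ≥ 1 controls how many runs are needed for a relative-error estimate. A minimal sketch of the estimator, reusing moran_on_graph from the sketch after slide 4 (the number of runs here is illustrative, not the bound from the paper):

```python
def estimate_fixation_probability(adj, r, runs=10_000, seed=0):
    """Monte Carlo estimate of f_G(r): the fraction of runs that reach fixation."""
    return sum(moran_on_graph(adj, r, seed=seed + i) for i in range(runs)) / runs

# Example: on the cycle C_5 the estimate should be close to
# rho = (1 - 1/r) / (1 - 1/r**n) ~ 0.516 for r = 2, n = 5.
cycle5 = [[4, 1], [0, 2], [1, 3], [2, 4], [3, 0]]
print(estimate_fixation_probability(cycle5, r=2.0))
```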
