

  1. APPROXIMATING THE PERMANENT. Mark Jerrum and Alistair Sinclair. Presented by Dzung Nguyen.

  2. PERMANENT PROBLEM. Perm(A) = sum over permutations sigma of prod_i a_{i,sigma(i)}. For a 0-1 matrix A, Perm(A) = |M(G)|, where M(G) is the set of perfect matchings of the bipartite graph G = (U, V, E) with (u_i, v_j) in E iff a_ij = 1: each permutation contributing 1 to the sum selects exactly the edge set {(u_i, v_sigma(i))}, which is a perfect matching. Example (rows indexed by U, columns by V):

         1 1 0 0
     A = 1 0 1 1
         0 0 0 1
         0 1 1 0
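A minimal brute-force sketch of this correspondence (function and variable names are my own, not the slides'): computing Perm(A) directly on the example matrix counts exactly the perfect matchings of G.

```python
# Illustrative sketch, not from the paper: verify Perm(A) = |M(G)|.
from itertools import permutations

def permanent(A):
    """Perm(A) = sum over sigma of prod_i A[i][sigma(i)]; O(n!), demo only."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= A[i][sigma[i]]
        total += prod
    return total

A = [[1, 1, 0, 0],
     [1, 0, 1, 1],
     [0, 0, 0, 1],
     [0, 1, 1, 0]]
# Each permutation with product 1 picks the perfect matching
# {(u_i, v_sigma(i))}, so this prints |M(G)| for the example graph.
print(permanent(A))
```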

  3. OUTLINE. 1. Reduction from the counting problem to the generation problem. 2. Markov chains and convergence speed. 3. Solution. 4. Conclusions.

  4. UNIFORM GENERATOR AND RANDOMIZED COUNTER. Counting the number of perfect matchings is self-reducible. Let e = (u, v) be an arbitrary edge of G. Then M(G) = M(G \ e) ∪ {M ∪ {e} : M ∈ M(G \ {u, v})}. Claim: if there exists a fully polynomial almost uniform generator of perfect matchings, then there exists a fully polynomial randomized algorithm to compute the number of perfect matchings.
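As a sanity check on the identity above, here is a hedged sketch (helper names and the adjacency representation are my assumptions, not the paper's) that counts perfect matchings exactly by recursing on one edge at a time:

```python
# |M(G)| = |M(G \ e)| + |M(G \ {u, v})| for any edge e = (u, v).
# adj maps each left vertex to the set of right vertices adjacent to it;
# left lists the left vertices that still need to be matched.

def count_matchings(adj, left):
    if not left:
        return 1          # all left vertices matched: one completion
    u = left[0]
    if not adj[u]:
        return 0          # u can no longer be matched
    v = next(iter(adj[u]))
    # Branch 1: e = (u, v) not in the matching -> recurse on G \ e.
    without_e = {w: (nbrs - {v} if w == u else nbrs) for w, nbrs in adj.items()}
    # Branch 2: e in the matching -> recurse on G \ {u, v}.
    with_e = {w: nbrs - {v} for w, nbrs in adj.items() if w != u}
    return count_matchings(without_e, left) + count_matchings(with_e, left[1:])

# The graph of the example matrix: expect the same count as permanent(A).
adj = {"u1": {"v1", "v2"}, "u2": {"v1", "v3", "v4"},
       "u3": {"v4"}, "u4": {"v2", "v3"}}
print(count_matchings(adj, ["u1", "u2", "u3", "u4"]))
```

Each call either deletes the edge e (first term of the identity) or commits to it and deletes both endpoints (second term).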

  5. UNIFORM GENERATOR AND RANDOMIZED COUNTER. Aim: estimate |M(G)|. The edge e splits M(G) into two parts: matchings that do not use e (these form M(G \ e)) and matchings that do (in bijection with M(G \ {u, v})). Let G1 be the restriction whose matchings form the larger part. Estimate the ratio |M(G1)|/|M(G)| by sampling t elements of M(G) and counting how many lie in M(G1); with t big enough, and always choosing the G1 that yields the maximum ratio (so each factor is at least 1/2), the estimate is accurate. Repeating on G1 telescopes down to a trivial graph and recovers |M(G)|. A sketch of this estimator follows below.
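A hedged sketch of the counting-from-sampling step; sample_matching stands in for the assumed almost uniform generator (not implemented here), and in_G1 tests membership in M(G1). Both names are illustrative assumptions.

```python
def estimate_ratio(sample_matching, in_G1, t):
    """Estimate |M(G1)|/|M(G)| as the fraction of t samples landing in M(G1)."""
    hits = sum(1 for _ in range(t) if in_G1(sample_matching()))
    return hits / t

# Repeating on smaller and smaller graphs, always taking the G1 with the
# larger ratio (so every factor is >= 1/2), telescopes to |M(G)| with
# polynomially many samples per level.
```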

  6. OUTLINE. 1. Reduction from the counting problem to the generation problem. 2. Markov chains and convergence speed. 3. Solution. 4. Conclusions.

  7. MARKOV CHAIN AND CONVERGENCE SPEED. Given a finite state space N, a Markov chain on N is a sequence of states (X_t)_{t>=0} with transition matrix P. If the chain is ergodic, denote by pi = (pi_i)_{i in N} the stationary distribution, the unique vector satisfying pi P = pi and sum_{i in N} pi_i = 1. The relative pointwise distance (r.p.d.) after t steps is Delta(t) = max_{i,j in N} |p^t_ij - pi_j| / pi_j. An ergodic Markov chain is said to be time-reversible if it satisfies the detailed balance condition: pi_i p_ij = pi_j p_ji for all i, j in N.
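A small numerical illustration (my own example chain, not from the paper): compute pi as the left eigenvector of P for eigenvalue 1 and evaluate the r.p.d. Delta(t) directly from powers of P.

```python
import numpy as np

# An ergodic, time-reversible 3-state chain (rows sum to 1).
P = np.array([[0.5,  0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0,  0.5, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

def rpd(P, pi, t):
    """Delta(t) = max_{i,j} |P^t[i][j] - pi[j]| / pi[j]."""
    Pt = np.linalg.matrix_power(P, t)
    return np.max(np.abs(Pt - pi) / pi)  # pi broadcasts over rows

for t in (1, 5, 20):
    print(t, rpd(P, pi, t))  # Delta(t) decays geometrically for an ergodic P
```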

  8. MARKOV CHAIN (CONTINUED). A time-reversible chain is represented by a graph H on |N| vertices with edge weights w_ij = pi_i p_ij = pi_j p_ji. The conductance of a time-reversible chain is defined by Phi = min over S ⊆ N with 0 < pi(S) <= 1/2 of (sum_{i in S, j not in S} w_ij) / pi(S).
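For intuition, a brute-force computation of Phi on a small chain (exponential in |N|, illustration only; the function name is my own):

```python
from itertools import combinations

def conductance(P, pi):
    """Phi = min over S, 0 < pi(S) <= 1/2, of cut weight / pi(S)."""
    n = len(pi)
    best = float("inf")
    for r in range(1, n):
        for S in combinations(range(n), r):
            piS = sum(pi[i] for i in S)
            if piS > 0.5:
                continue
            # Cut weight: sum of w_ij = pi_i * p_ij leaving S.
            flow = sum(pi[i] * P[i][j]
                       for i in S for j in range(n) if j not in S)
            best = min(best, flow / piS)
    return best

# e.g. conductance(P, pi) with the P and pi from the previous sketch.
```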

  9. OUTLINE. 1. Reduction from the counting problem to the generation problem. 2. Markov chains and convergence speed. 3. Solution. 4. Conclusions.

  10. UNIFORM GENERATOR AND MARKOV CHAIN. Suppose a Markov chain on a set of elements S is ergodic and converges to the uniform distribution as the number of steps goes to infinity. Then we can build a polynomial-time almost uniform generator by simulating the chain for a large enough number of steps t, provided t is bounded by a polynomial in the input size. What is the set of elements here? The set of perfect matchings? What is the transition matrix?

  11. SOLUTION. S is the set of perfect matchings and near-perfect matchings (matchings leaving exactly two vertices unmatched) of G. In any state M ∈ S, with probability 1/2 the state moves to itself; otherwise choose an edge e = (u, v) ∈ E uniformly at random and: (i) if M is a perfect matching and e ∈ M, move to M' = M − e; (ii) if M is a near-perfect matching and u, v are both unmatched in M, move to M' = M ∪ e; (iii) if M is a near-perfect matching, u is matched to w in M, and v is unmatched, move to M' = (M ∪ e) − (u, w); symmetrically, if v is matched to w and u is unmatched, move to M' = (M ∪ e) − (w, v); (iv) in all other cases, do nothing.
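A hedged sketch of one transition of this chain (the representation is my assumption: a matching M is a frozenset of (u, v) edges with u in U and v in V, and n = |U| = |V|; cases (i)-(iv) mirror the slide):

```python
import random

def step(M, edges, n):
    if random.random() < 0.5:                  # lazy self-loop w.p. 1/2
        return M
    u, v = random.choice(edges)
    if len(M) == n and (u, v) in M:            # (i) perfect, e in M: remove e
        return M - {(u, v)}
    if len(M) == n - 1:                        # near-perfect cases
        mate_u = next((e for e in M if e[0] == u), None)
        mate_v = next((e for e in M if e[1] == v), None)
        if mate_u is None and mate_v is None:  # (ii) u, v unmatched: add e
            return M | {(u, v)}
        if mate_u and mate_v is None:          # (iii) swap (u, w) for e
            return (M - {mate_u}) | {(u, v)}
        if mate_v and mate_u is None:          # (iii') symmetric swap
            return (M - {mate_v}) | {(u, v)}
    return M                                   # (iv) otherwise do nothing

# Example walk on the graph from slide 2:
edges = [("u1", "v1"), ("u1", "v2"), ("u2", "v1"), ("u2", "v3"),
         ("u2", "v4"), ("u3", "v4"), ("u4", "v2"), ("u4", "v3")]
M = frozenset({("u1", "v1"), ("u2", "v3"), ("u3", "v4"), ("u4", "v2")})
for _ in range(1000):
    M = step(M, edges, 4)
```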

  12. RESULTS. The process is ergodic and time-reversible with uniform stationary distribution. In dense graphs, i.e., graphs where the degree of each vertex is at least n/2, the process converges rapidly, and at least a 1/n^2 fraction of the elements of S are perfect matchings.

  13. CONCLUSIONS. The paper shows the relationship between generating and counting problems; the relationship between the conductance of the underlying graph and the convergence speed; and the technique for proving a lower bound on the conductance. Further work: compute the permanent of arbitrary 0-1 matrices and of matrices with non-negative entries.
