  1. Average-case complexity of Maximum Weighted Independent Sets
  D. Gamarnik, D. Goldberg, T. Weber (MIT)
  Physics of Algorithms ’09, Santa Fe, Wednesday, September 2, 2009

  2. Outline
  • Average-case analysis of computational complexity: Independent Sets
  • A ‘corrected’ BP algorithm: the cavity expansion
  • Results: sufficient condition, hardness results
  • Conclusion

  3. Combinatorial Optimization with Random Costs
  • Goal: study the relation between randomness and computational complexity
  • Problems of interest: combinatorial optimization on graphs; here, Maximum Weighted Independent Set
  • Rather than a random graph, random costs
  • Identify relations between graph structure, cost distribution, and complexity
  • Techniques used: ‘message-passing’ algorithms, correlation decay analysis

  4. Max-Weight Independent Sets
  • Graph $(V, E)$, weights $W \in \mathbb{R}_+^{|V|}$
  • Independent set $U$: $\forall\, u, v \in U$, $(u, v) \notin E$
  • Max-Weight Independent Set (MWIS): given weights $W$, find $U$ maximizing $\sum_{v \in U} W_v$ (a brute-force reference implementation is sketched below)
  • Our setting: the weights are i.i.d. random variables with distribution $F$
  • Arbitrary graph of bounded degree $\Delta$
  • Similar models in Gamarnik, Nowicki, Swirszcz [05], Sanghavi, Shah, Willsky [08]
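To fix notation, here is a tiny brute-force reference implementation (an illustrative sketch, not from the talk; the graph/weight dictionaries and the name `mwis_bruteforce` are my own choices). It enumerates all subsets, so it is only usable on very small instances, but it makes the objective precise and gives a ground truth to compare the later heuristics against.

```python
# Brute-force MWIS for tiny graphs (illustrative only: enumerates all subsets).
from itertools import combinations

def mwis_bruteforce(graph, weights):
    """graph: dict node -> set of neighbors; weights: dict node -> float.
    Returns (best_weight, best_set) over all independent sets."""
    nodes = list(graph)
    best_w, best_set = 0.0, set()
    for size in range(1, len(nodes) + 1):
        for subset in combinations(nodes, size):
            s = set(subset)
            # independence: no node in s is adjacent to another node in s
            if all(graph[u].isdisjoint(s - {u}) for u in s):
                w = sum(weights[u] for u in s)
                if w > best_w:
                    best_w, best_set = w, s
    return best_w, best_set
```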

  5. Hardness facts
  • NP-hard, even for $\Delta = 3$
  • Poly-time approximation algorithm of ratio $\alpha$: finds an IS $\tilde U$ such that $W(U^*) < \alpha\, W(\tilde U)$
  • Poly-time Approximation Scheme: for all $\alpha > 1$, there exists an approximation algorithm of ratio $\alpha$
  • Håstad [99]: NP-hard to approximate within $n^{\beta}$, $\beta < 1$
  • Trevisan [01]: NP-hard to approximate within $\Delta / 2^{O(\sqrt{\log \Delta})}$

  6. A first result
  Theorem: Assume $P(W > t) = \exp(-t)$ and $\Delta \le 3$. The problem can be approximated in polynomial time: for any $\epsilon > 0$ there exists an algorithm which, in time $O(|V|^2 \epsilon^{-2})$, finds an I.S. $I$ such that $P\big(W(I^*)/W(I) > 1 + \epsilon\big) < \epsilon$.
  • Linear in $|V|$ (with parallel computation, constant computation time)
  • Is the case $\Delta \le 3$ exceptional?
  • Is the case of exponential weights exceptional? Is it the only distribution that works? Is MWIS always easy with random weights?

  7. Message passing for MWIS
  [figure: small example graph]
  • Graphical model formulation of MWIS:
    $p(x) = \frac{1}{Z} \prod_{i \in V} \exp(w_i x_i) \prod_{(i,j) \in E} \mathbf{1}\{x_i + x_j \le 1\}$
  • Max-product (BP):
    $\mu_{i \to j}(0) = \max\Big(\prod_{k \in N_i,\, k \ne j} \mu_{k \to i}(0),\; e^{w_i} \prod_{k \in N_i,\, k \ne j} \mu_{k \to i}(1)\Big)$
    $\mu_{i \to j}(1) = \prod_{k \in N_i,\, k \ne j} \mu_{k \to i}(0)$
    Setting $M_{i \to j} = \log\big(\mu_{i \to j}(0) / \mu_{i \to j}(1)\big)$ gives
    $M_{i \to j} = \max\Big(0,\; W_i - \sum_{k \in N_i,\, k \ne j} M_{k \to i}\Big)$
    (a minimal implementation sketch follows below)
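For concreteness, a minimal synchronous implementation of the log-domain message update above (an illustrative sketch: the flat initialization, the fixed iteration count, and the decision rule in `bp_decisions` are my own choices, not from the talk). On graphs with cycles BP may fail to converge, which is the point of the next slide.

```python
# Synchronous max-product BP for MWIS in the log-domain message form
# M_{i->j} = max(0, W_i - sum_{k in N(i)\{j}} M_{k->i}).
def bp_messages(graph, weights, num_iters=50):
    """graph: dict node -> set of neighbors; weights: dict node -> float."""
    M = {(i, j): 0.0 for i in graph for j in graph[i]}   # message i -> j
    for _ in range(num_iters):
        M = {
            (i, j): max(0.0, weights[i] - sum(M[(k, i)] for k in graph[i] if k != j))
            for (i, j) in M
        }
    return M

def bp_decisions(graph, weights, M):
    """Declare node i 'in' if its weight beats the sum of incoming messages."""
    return {i for i in graph if weights[i] - sum(M[(k, i)] for k in graph[i]) > 0.0}
```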

  8. LP relaxation for MWIS - connection with BP
  • IP formulation of MWIS:
    $\max_x \sum_i W_i x_i$  s.t.  $\forall (i,j) \in E:\ x_i + x_j \le 1$,  $\forall i:\ x_i \in \{0, 1\}$
  • LP relaxation:
    $\max_x \sum_i W_i x_i$  s.t.  $\forall (i,j) \in E:\ x_i + x_j \le 1$,  $\forall i:\ 0 \le x_i \le 1$
  • The LP is tight at variable $i$ if $x_i \in \{0, 1\}$
  • Fact [Sanghavi, Shah, Willsky]: if BP converges at variable $i$, then the LP is tight at $i$
  • Converse: if the LP is not tight, then BP does not converge
  • Example: IP solution: one node, opt. cost 1; LP solution: $(1/2, 1/2, 1/2)$, opt. cost $3/2 > 1$: LP not tight (checked numerically in the sketch below)
  [figure: example graph on which the LP is not tight]
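The LP side of the example is easy to check numerically. The sketch below is my own illustration, using scipy; the 3-cycle with unit weights is an assumption consistent with the numbers on the slide. It recovers the fractional optimum (1/2, 1/2, 1/2) of value 3/2, strictly above the integral optimum 1.

```python
# LP relaxation of MWIS on a 3-cycle with unit weights. scipy's linprog minimizes,
# so the objective is negated.
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (1, 2), (0, 2)]
weights = np.ones(3)

# One row per edge: x_i + x_j <= 1; variable bounds 0 <= x_i <= 1.
A_ub = np.zeros((len(edges), 3))
for row, (i, j) in enumerate(edges):
    A_ub[row, i] = 1.0
    A_ub[row, j] = 1.0
b_ub = np.ones(len(edges))

res = linprog(c=-weights, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 3)
print(res.x, -res.fun)   # [0.5 0.5 0.5] 1.5  -> fractional, LP not tight
```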

  9. The Cavity Expansion: a corrected BP
  • We try to compute exactly $B_G(i) = W(I^*_G) - W(I^*_{G \setminus \{i\}})$
  • If $B_G(i) > 0$ then $i \in I^*_G$; otherwise $i \notin I^*_G$ (w.p. 1)
  • For a node $i$ with neighbors $j, k, l$:
    $W(I^*_G) = \max\big(W_i + W(I^*_{G \setminus \{i,j,k,l\}}),\; W(I^*_{G \setminus \{i\}})\big)$
  [figure: node $i$ with neighbors $j, k, l$; the subgraphs $G \setminus \{i,j,k,l\}$ and $G \setminus \{i\}$]

  10. The Cavity Expansion: a corrected BP (cont’d)
  • So: $B_G(i) = \max\Big(0,\; W_i - \big(W(I^*_{G \setminus \{i\}}) - W(I^*_{G \setminus \{i,j,k,l\}})\big)\Big)$
  [figure: the difference of the two subgraph optima]

  11. The Cavity Expansion: a corrected BP (cont’d)
  • Telescoping the difference:
    $W(I^*_{G \setminus \{i\}}) - W(I^*_{G \setminus \{i,j,k,l\}})$
    $= \big(W(I^*_{G \setminus \{i\}}) - W(I^*_{G \setminus \{i,j\}})\big) + \big(W(I^*_{G \setminus \{i,j\}}) - W(I^*_{G \setminus \{i,j,k\}})\big) + \big(W(I^*_{G \setminus \{i,j,k\}}) - W(I^*_{G \setminus \{i,j,k,l\}})\big)$
  [figure: the corresponding pairs of subgraphs]

  12. The Cavity Expansion: a corrected BP (cont’d)
  • Each difference is itself a cavity quantity on a smaller graph:
    $W(I^*_{G \setminus \{i\}}) - W(I^*_{G \setminus \{i,j\}}) = B_{G \setminus \{i\}}(j)$
    $W(I^*_{G \setminus \{i,j\}}) - W(I^*_{G \setminus \{i,j,k\}}) = B_{G \setminus \{i,j\}}(k)$
    $W(I^*_{G \setminus \{i,j,k\}}) - W(I^*_{G \setminus \{i,j,k,l\}}) = B_{G \setminus \{i,j,k\}}(l)$

  13. Cavity Expansion: Summary
  • Cavity Expansion (for IS): $B_G(i) = \max\big(0,\; W_i - B_{G \setminus \{i\}}(j) - B_{G \setminus \{i,j\}}(k) - B_{G \setminus \{i,j,k\}}(l)\big)$
  • BP (for IS): $M_G(i) = \max\big(0,\; W_i - M_G(j) - M_G(k) - M_G(l)\big)$
  • Generalization to arbitrary optimization
  • Similar approaches (for counting): Weitz (06), Bayati, Gamarnik, Katz, Nair, Tetali (07), Jung and Shah (07)
  • CE always converges, and is correct at termination
  • Caveat: running time $O(\Delta^{|V|})$
  • Fix: interrupt after a fixed number of iterations $t$ (sketched below)
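An illustrative sketch of the truncated recursion referred to in the last bullet (not the authors' reference code: the graph/weight representation, the name `cavity`, and the convention of returning 0 at the depth cutoff are my own choices).

```python
# r-step cavity expansion for MWIS.
# graph: dict node -> set of neighbors; weights: dict node -> float.
def cavity(graph, weights, i, removed, r):
    """r-step approximation of B_G(i) on the graph with `removed` deleted."""
    if r == 0:
        return 0.0                      # truncation convention at the horizon
    total = 0.0
    gone = {i}
    # Peel off the surviving neighbors of i one at a time, as in
    # B_G(i) = max(0, W_i - B_{G\{i}}(j) - B_{G\{i,j}}(k) - ...).
    for j in graph[i]:
        if j in removed:
            continue
        total += cavity(graph, weights, j, removed | gone, r - 1)
        gone = gone | {j}
    return max(0.0, weights[i] - total)
```

Run to full depth (r at least the number of nodes) this computes $B_G(i)$ exactly, at exponential cost; truncating at a fixed r is the interruption mentioned above.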

  14. Correlation Decay analysis
  • Let $B^r_G(i)$ be the $r$-step approximation of $B_G(i)$
  • Definition: the system exhibits correlation decay if $|B^r_G(i) - B_G(i)| \to 0$ exponentially fast (in $r$)
  • Implies: whether $u$ is in the MWIS is asymptotically independent of the graph beyond a certain boundary (a numerical illustration follows below)
  [figure: node $u$ and the boundary of a neighborhood around it]
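A small numerical illustration of the definition, reusing the hypothetical `cavity` sketch from the previous slide (the path graph, the random seed, and the horizon range are arbitrary choices of mine).

```python
# If correlation decay holds, B^r_G(i) at a fixed node stops changing once the
# horizon r is large enough, i.e. far-away parts of the graph become irrelevant.
import random

random.seed(0)
n = 15
path = {i: {j for j in (i - 1, i + 1) if 0 <= j < n} for i in range(n)}
w = {i: random.expovariate(1.0) for i in range(n)}   # exponential weights, as in the talk

mid = n // 2
for r in range(1, 8):
    print(r, round(cavity(path, w, mid, frozenset(), r), 4))
```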

  17. Correlation Decay analysis (cont’d)
  • Recall $I^* = \{i : B_G(i) > 0\}$
  • Candidate solution: $I^r = \{i : B^r_G(i) > 0\}$ (usage sketched below)
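Continuing the earlier sketch: the candidate solution is a one-liner on top of the hypothetical `cavity` helper (the 4-cycle and its weights below are made-up illustration data).

```python
# Candidate solution I^r = {i : B^r_G(i) > 0} on a 4-cycle, using cavity() from above.
graph = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}   # cycle 0-1-2-3-0
weights = {0: 2.0, 1: 0.5, 2: 1.8, 3: 0.7}

r = 4
I_r = {i for i in graph if cavity(graph, weights, i, frozenset(), r) > 0.0}
print(I_r)   # {0, 2}: the two heavy, non-adjacent nodes, which is the exact MWIS here
```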

  18. Proof sketch of near-optimality
  • Introduce the ‘Lyapunov’ function $L_G(i) = \mathbb{E}[\exp(-B_G(i))]$
  • From the CE and the exponential-weights assumption, find a recursion on the $L_G(i)$:
    $L_G(i) = 1 - \tfrac{1}{2}\, L_{G \setminus \{i\}}(j)\, L_{G \setminus \{i,j\}}(k)$
    (the one-step computation is sketched below)
  • This implies a non-expansion of the recursion for $L_G$
  • Prune a small fraction $\delta$ of the nodes
  • This implies a contraction of factor $(1 - \delta)$
  • After $r$ steps, the error is $(1 - \delta)^r + \delta$
  • Minimize $\delta$ as a function of $r$ ⇒ correlation decay
  • Final step: prove that if $B^r_G(i) \approx B_G(i)$, then $I^r \approx I^*$
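To make the recursion bullet concrete, here is my reconstruction of the one-step computation behind it (not taken from the slides). It uses only $P(W > t) = e^{-t}$; the factorization in the last step assumes the two cavity terms are independent, which is exactly the kind of statement the correlation-decay argument has to justify.

```latex
% One step of the Lyapunov recursion under exponential weights (sketch).
\begin{align*}
B_G(i) &= \max\bigl(0,\, W_i - X\bigr), \qquad
   X := B_{G\setminus\{i\}}(j) + B_{G\setminus\{i,j\}}(k), \quad W_i \perp X,\\
\mathbb{E}\bigl[e^{-\max(0,\,W_i - x)}\bigr]
  &= \Pr(W_i \le x) + \int_x^{\infty} e^{-(w-x)}\, e^{-w}\, dw
   = \bigl(1 - e^{-x}\bigr) + \tfrac12 e^{-x}
   = 1 - \tfrac12 e^{-x},\\
L_G(i) = \mathbb{E}\bigl[e^{-B_G(i)}\bigr]
  &= 1 - \tfrac12\, \mathbb{E}\bigl[e^{-X}\bigr]
   \;\approx\; 1 - \tfrac12\, L_{G\setminus\{i\}}(j)\, L_{G\setminus\{i,j\}}(k).
\end{align*}
```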
