Linear-in-Δ lower bounds in the LOCAL model Mika Göös, University of Toronto Juho Hirvonen, Aalto University & HIIT Jukka Suomela, Aalto University & HIIT PODC, 16.7.2014
This work The first linear-in- Δ lower bound for a natural graph problem in the LOCAL model Fractional maximal matching: - There is no o ( Δ )-algorithm, independent of n - There is an O ( Δ )-algorithm, independent of n ( Δ = maximum degree, n = number of vertices) 2
Matching A matching assigns weight 1 to matched edges and weight 0 to the rest 3
Fractional matching A fractional matching is a linear relaxation of matching: at each node, the weights of the incident edges sum up to at most 1 4
Maximal fractional matching A node is saturated if the sum of the weights of its incident edges equals one 5
Maximal fractional matching A fractional matching is maximal if no two unsaturated nodes are adjacent 6
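As a concrete illustration (not from the talk; the graph encoding and function names below are assumptions), the two conditions can be checked directly:

```python
# Minimal sketch (illustrative, not from the talk): verify the two conditions
# of a maximal fractional matching. `weights` lists every edge of the graph
# (weight 0 is allowed), keyed by an unordered pair of endpoints.

def is_maximal_fractional_matching(nodes, weights, eps=1e-9):
    def load(v):
        # Total weight of the edges incident to v.
        return sum(w for e, w in weights.items() if v in e)

    # Feasibility: the incident weights of every node sum to at most 1.
    if any(load(v) > 1 + eps for v in nodes):
        return False

    # Maximality: no edge joins two unsaturated nodes (load strictly below 1).
    return not any(load(u) < 1 - eps and load(v) < 1 - eps
                   for (u, v) in weights)

# Toy example: path a - b - c; node b is saturated, so the two unsaturated
# nodes a and c are not adjacent and the fractional matching is maximal.
print(is_maximal_fractional_matching({'a', 'b', 'c'},
                                     {('a', 'b'): 0.4, ('b', 'c'): 0.6}))  # True
```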
Standard LOCAL model - Synchronous communication - No bandwidth restrictions - Running time = number of communication rounds - Both deterministic and randomized algorithms 7
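For reference, one way to picture the model is a centralized round-by-round simulation; the following sketch is only illustrative and its interface is an assumption, not part of the talk.

```python
# Minimal sketch (illustrative assumption): a centralized simulation of the
# synchronous LOCAL model. In every round each node sends an unbounded-size
# message to each neighbour and updates its state; the complexity measure is
# only the number of rounds.

def run_local(adj, init, send, update, rounds):
    # adj[v] = list of neighbours of v (symmetric adjacency).
    state = {v: init(v) for v in adj}
    for _ in range(rounds):
        # Messages of a round are computed from the states at the start of the round.
        msgs = {v: {u: send(state[v], u) for u in adj[v]} for v in adj}
        state = {v: update(state[v], {u: msgs[u][v] for u in adj[v]})
                 for v in adj}
    return state
```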
This work The first linear-in- Δ lower bound for a natural graph problem in the LOCAL model Fractional maximal matching: - There is no o ( Δ )-algorithm, independent of n - There is an O ( Δ )-algorithm, independent of n ( Δ = maximum degree, n = number of vertices) 8
Prior work Coordination problems: - Maximal matching - Maximal independent set - ( Δ +1)-coloring Algorithms: O( Δ + log* n ), also O(polylog( n )) Lower bounds: Ω (log* n ) [Linial ’92] and Ω (log Δ / log log Δ ) [Kuhn et al. ’05] 9
The Proof
The Proof A short guide - Step 0: Introduce models EC, PO, OI and ID - Step 1: Ω ( Δ )-lower bound in the EC-model - Step 2: Simulation result EC ↝ PO ↝ OI ↝ ID - Step 3: ID ↝ Randomized algorithms 12
Edge coloring (EC) EC ↝ PO ↝ OI ↝ ID ↝ R 14
Port-numbering and orientation (PO) Each node numbers its incident edges with ports 1, 2, …, and each edge is oriented EC ↝ PO ↝ OI ↝ ID ↝ R 15
Port-numbering and orientation (PO) EC ↝ PO ↝ OI ↝ ID ↝ R 16
Unique Identifiers (ID) Each node is given a unique numerical identifier (e.g. 1, 8, 18, 19, 71) EC ↝ PO ↝ OI ↝ ID ↝ R 17
Order Invariant (OI) The algorithm sees unique identifiers, but its output may depend only on their relative order, not on their numerical values EC ↝ PO ↝ OI ↝ ID ↝ R 18
The Proof A short guide - Step 0: Introduce models EC, PO, OI and ID - Step 1: Ω ( Δ )-lower bound in the EC-model - Step 2: Simulation result EC ↝ PO ↝ OI ↝ ID - Step 3: ID ↝ Randomized algorithms 19
Loopy graphs A graph is k-loopy if it has at least k self-loops at each node (examples: k = 2 and k = 3) EC ↝ PO ↝ OI ↝ ID ↝ R 20
Loopy graphs Loopy graphs are a compact representation of simple graphs with lots of symmetry EC ↝ PO ↝ OI ↝ ID ↝ R 21
Loopy graphs A loopy graph can be unfolded to get a simple graph EC ↝ PO ↝ OI ↝ ID ↝ R 22
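A rough illustration of the unfolding idea follows (an assumption of this sketch: each self-loop is simply opened into an edge towards a fresh copy of the node, and the construction is truncated at a finite depth; the paper's exact unfolding may differ in such details).

```python
# Minimal sketch (illustrative; the exact unfolding used in the paper may
# differ): unfold a loopy graph around a root node into a simple tree,
# truncated at a given depth.

def unfold(loops, edges, root, depth):
    # loops[v]: number of self-loops at v; edges: list of simple edges (u, v).
    adj = {v: [] for v in loops}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    tree_edges = []
    fresh = 0

    def new_copy(v):
        nonlocal fresh
        fresh += 1
        return (v, fresh)            # a fresh copy of the original node v

    frontier = [(root, 0)]           # copies created on the previous level
    originals = {(root, 0): root}
    for _ in range(depth):
        next_frontier = []
        for copy in frontier:
            orig = originals[copy]
            # Each self-loop and each ordinary edge contributes one child copy.
            for nb in [orig] * loops[orig] + adj[orig]:
                child = new_copy(nb)
                originals[child] = nb
                tree_edges.append((copy, child))
                next_frontier.append(child)
        frontier = next_frontier
    return tree_edges

# Example: a single 2-loopy node unfolds into a binary tree.
print(len(unfold({'v': 2}, [], 'v', 2)))  # 6 edges
```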
Loopy graphs loopy graphs ≈ infinite trees EC ↝ PO ↝ OI ↝ ID ↝ R 24
Loopy graphs Key observation: a maximal fractional matching must saturate all nodes of a loopy graph! (A node with a self-loop is adjacent to itself, so an unsaturated node would immediately violate maximality.) EC ↝ PO ↝ OI ↝ ID ↝ R 25
EC lower bound G H EC ↝ PO ↝ OI ↝ ID ↝ R 26
EC lower bound GG GH HH EC ↝ PO ↝ OI ↝ ID ↝ R 27
EC lower bound EC ↝ PO ↝ OI ↝ ID ↝ R 28
The Proof A short guide to the proof - Step 0: Introduce models EC, PO, OI and ID - Step 1: Ω ( Δ )-lower bound in the EC-model - Step 2: Simulation result EC ↝ PO ↝ OI ↝ ID - Step 3: ID ↝ Randomized algorithms 29
EC ↝ PO
EC ↝ PO Assume we have an o( Δ )-time algorithm A for maximal fractional matching in the PO model EC ↝ PO ↝ OI ↝ ID ↝ R 31
EC ↝ PO Transform an EC graph into a PO graph by replacing each edge with two oriented edges EC ↝ PO ↝ OI ↝ ID ↝ R 32
EC ↝ PO Simulate the PO-algorithm A and combine the weights of the corresponding edges EC ↝ PO ↝ OI ↝ ID ↝ R 33
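One natural combining rule (an assumption of this sketch, not necessarily the exact rule of the talk) is to sum the weights of the two oriented copies of each edge; node loads are then unchanged, so feasibility, saturation and maximality carry over.

```python
# Minimal sketch (illustrative; assumes the natural combining rule of summing
# the weights of the two oriented copies of each edge).

def combine(doubled_weights):
    # doubled_weights: {(u, v, direction): weight} for the two oriented copies
    # of every original edge {u, v}, with direction in {0, 1}.
    combined = {}
    for (u, v, _d), w in doubled_weights.items():
        key = tuple(sorted((u, v)))
        combined[key] = combined.get(key, 0.0) + w
    return combined

# Example: the two copies of edge {a, b} carry weights 0.25 and 0.2,
# so the original edge gets weight 0.45.
print(combine({('a', 'b', 0): 0.25, ('b', 'a', 1): 0.2}))  # {('a', 'b'): 0.45}
```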
EC ↝ PO We get an o( Δ )-algorithm in the EC-model, which is a contradiction EC ↝ PO ↝ OI ↝ ID ↝ R 34
PO ↝ OI
PO ↝ OI - Similar technique as in Göös et al. (2012) - Now we do not need any approximation guarantees EC ↝ PO ↝ OI ↝ ID ↝ R 36
PO ↝ OI Assume we have an OI-algorithm A We use the port numbers and the orientation to get a local ordering EC ↝ PO ↝ OI ↝ ID ↝ R 37
PO ↝ OI Take the universal cover U(G) of G EC ↝ PO ↝ OI ↝ ID ↝ R 38
Canonically ordered tree (figure: an infinite tree whose nodes are numbered in a fixed canonical order) EC ↝ PO ↝ OI ↝ ID ↝ R
Embed U(G) (figure: the universal cover U(G), rooted at v, embedded into the canonically ordered tree) EC ↝ PO ↝ OI ↝ ID ↝ R
PO ↝ OI It is possible to turn a PO-graph into an OI-graph locally Use this to simulate A EC ↝ PO ↝ OI ↝ ID ↝ R 41
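As a toy illustration of how port numbers can induce an ordering (a simplified assumption; the actual embedding into the canonically ordered tree is more involved), identify every node of a rooted, port-numbered tree with the port sequence on its path from the root and sort these sequences:

```python
# Minimal sketch (illustrative assumption, not the talk's construction): in a
# rooted tree where each node numbers its children by ports 1, 2, ..., order
# the nodes by their port sequence from the root (shorter first, then
# lexicographically).

def canonical_order(children, root):
    # children[v] = {port: child}; returns nodes sorted by their port sequence.
    seqs = {root: ()}
    stack = [root]
    while stack:
        v = stack.pop()
        for port, child in children.get(v, {}).items():
            seqs[child] = seqs[v] + (port,)
            stack.append(child)
    return sorted(seqs, key=lambda v: (len(seqs[v]), seqs[v]))

# Example: root r with children behind ports 1 and 2; port 1 comes first.
print(canonical_order({'r': {1: 'a', 2: 'b'}}, 'r'))  # ['r', 'a', 'b']
```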
OI ↝ ID
OI ↝ ID Use the OI ↝ ID lemma of Naor and Stockmeyer (1995) (essentially Ramsey’s Theorem) The idea is to force any ID-algorithm A to behave like an OI-algorithm on some inputs EC ↝ PO ↝ OI ↝ ID ↝ R 43
OI ↝ ID Trick: to apply the lemma, consider an algorithm A* that simulates A and outputs 1 at saturated nodes and 0 elsewhere This forces all nodes to be saturated by A in loopy neighborhoods: any change in the output would have to propagate from outside A ’s running time EC ↝ PO ↝ OI ↝ ID ↝ R 44
The Proof A short guide - Step 0: Introduce models EC, PO, OI and ID - Step 1: Ω ( Δ )-lower bound in the EC-model - Step 2: Simulation result EC ↝ PO ↝ OI ↝ ID - Step 3: ID ↝ Randomized algorithms 45
Randomized algorithms Idea: reduce randomized algorithms back to deterministic ones Again, use a lemma of Naor and Stockmeyer (1995) EC ↝ PO ↝ OI ↝ ID ↝ R 46
Summary This work Fractional maximal matching has complexity Θ ( Δ ) Open questions What is the complexity of maximal matching ? What is the complexity of 2-colored maximal matching ? 47