On the Computational Complexity of Periodic Scheduling
PhD defense, Thomas Rothvoß
Real-time Scheduling

Given: (synchronous) tasks τ_1, …, τ_n with τ_i = (c(τ_i), d(τ_i), p(τ_i))
◮ c(τ_i): running time
◮ d(τ_i): (relative) deadline
◮ p(τ_i): period

W.l.o.g.: Task τ_i releases a job of length c(τ_i) at z · p(τ_i) with absolute deadline z · p(τ_i) + d(τ_i) (z ∈ N_0)

Utilization: u(τ_i) = c(τ_i) / p(τ_i)

Settings:
◮ static priorities ↔ dynamic priorities
◮ single-processor ↔ multi-processor
◮ preemptive scheduling ↔ non-preemptive

Implicit deadlines: d(τ_i) = p(τ_i)
Constrained deadlines: d(τ_i) ≤ p(τ_i)
Example: Implicit deadlines & static priorities

c(τ_1) = 1, d(τ_1) = p(τ_1) = 2
c(τ_2) = 2, d(τ_2) = p(τ_2) = 5

[Figure: rate-monotonic schedule of the two tasks over the time axis 0, 1, …, 10]

Theorem (Liu & Layland ’73)
Optimal static priorities: 1/p(τ_i) for τ_i (the Rate-monotonic schedule), i.e. the smaller the period, the higher the priority.
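A rate-monotonic schedule like the one in the example can be checked by brute-force simulation over one hyperperiod. Below is a minimal sketch; the function name and the unit-time-slice model are illustrative (not from the slides), and all task parameters are assumed to be integers.

```python
from math import gcd

def rm_feasible(tasks):
    """Simulate a preemptive rate-monotonic schedule of (c, p) tasks with
    implicit deadlines d = p over one hyperperiod, in unit time slices."""
    hyper = 1
    for _, p in tasks:
        hyper = hyper * p // gcd(hyper, p)          # lcm of the periods
    remaining = [0] * len(tasks)                    # leftover work of current job
    order = sorted(range(len(tasks)), key=lambda i: tasks[i][1])  # RM priority
    for t in range(hyper):
        for i, (c, p) in enumerate(tasks):
            if t % p == 0:                          # job released at z * p(tau_i)
                if remaining[i] > 0:                # previous job missed d = p
                    return False
                remaining[i] = c
        for i in order:                             # run highest-priority pending job
            if remaining[i] > 0:
                remaining[i] -= 1
                break
    return all(r == 0 for r in remaining)

# The two tasks from the example: schedulable under RM priorities.
print(rm_feasible([(1, 2), (2, 5)]))
```

With utilization 1/2 + 2/5 = 9/10 ≤ 1 the example fits on one machine; raising c(τ_2) to 3 pushes utilization above 1 and the simulation reports a deadline miss.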
Feasibility test for implicit-deadline tasks

Theorem (Lehoczky et al. ’89)
If p(τ_1) ≤ … ≤ p(τ_n), then the response time r(τ_i) in a Rate-monotonic schedule is the smallest non-negative value satisfying

  c(τ_i) + Σ_{j<i} ⌈r(τ_i) / p(τ_j)⌉ · c(τ_j) ≤ r(τ_i)

One machine suffices ⇔ ∀i: r(τ_i) ≤ p(τ_i).
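The test above can be evaluated by the standard fixed-point iteration r ← c(τ_i) + Σ_{j<i} ⌈r/p(τ_j)⌉ · c(τ_j), starting from r = c(τ_i). A sketch (the function name is illustrative):

```python
from math import ceil

def response_time(c, p, i):
    """Response time of task i (0-indexed) under rate-monotonic priorities,
    assuming the tasks are sorted so that p[0] <= ... <= p[n-1]."""
    r = c[i]
    while True:
        nxt = c[i] + sum(ceil(r / p[j]) * c[j] for j in range(i))
        if nxt == r:          # fixed point reached: smallest feasible r
            return r
        if nxt > p[i]:        # exceeds the (implicit) deadline: task i misses
            return None
        r = nxt

c, p = [1, 2], [2, 5]         # the two tasks from the earlier example
print([response_time(c, p, i) for i in range(2)])  # [1, 4]
```

Both response times stay below the respective periods (1 ≤ 2 and 4 ≤ 5), so by the theorem one machine suffices.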
Simultaneous Diophantine Approximation (SDA)

Given:
◮ α_1, …, α_n ∈ Q
◮ bound N ∈ N
◮ error bound ε > 0

Decide:
  ∃ Q ∈ {1, …, N} : max_{i=1,…,n} |α_i − z_i/Q| ≤ ε/Q for some z_1, …, z_n ∈ Z
⇔ ∃ Q ∈ {1, …, N} : max_{i=1,…,n} |⌈Qα_i⌋ − Qα_i| ≤ ε

(where ⌈·⌋ denotes rounding to the nearest integer)

◮ NP-hard [Lagarias ’85]
◮ Gap version NP-hard [Rössner & Seifert ’96, Chen & Meng ’07]
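The rounded form of the decision problem can be checked by brute force over Q. Since N may be exponential in the input size, this enumeration is useless in general, consistent with the NP-hardness just stated. A sketch with illustrative names, using exact rationals:

```python
from fractions import Fraction

def sda(alphas, N, eps):
    """Return some Q in {1, ..., N} with max_i |round(Q*alpha_i) - Q*alpha_i| <= eps,
    or None if no such Q exists. round(.) is rounding to the nearest integer,
    i.e. the mixed-bracket operator on the slide."""
    for Q in range(1, N + 1):
        err = max(abs(round(Q * a) - Q * a) for a in alphas)
        if err <= eps:
            return Q
    return None

# alpha = (1/3, 1/7): the smallest Q making every Q*alpha_i near-integral is 21.
print(sda([Fraction(1, 3), Fraction(1, 7)], 25, Fraction(1, 100)))  # 21
```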
Simultaneous Diophantine Approximation (2)

Theorem (Rössner & Seifert ’96, Chen & Meng ’07)
Given α_1, …, α_n, N, ε > 0 it is NP-hard to distinguish
◮ ∃ Q ∈ {1, …, N} : max_{i=1,…,n} |⌈Qα_i⌋ − Qα_i| ≤ ε
◮ ∄ Q ∈ {1, …, n^{O(1)/log log n} · N} : max_{i=1,…,n} |⌈Qα_i⌋ − Qα_i| ≤ n^{O(1)/log log n} · ε

Theorem (Eisenbrand & R. - APPROX’09)
Given α_1, …, α_n, N, ε > 0 it is NP-hard to distinguish
◮ ∃ Q ∈ {N/2, …, N} : max_{i=1,…,n} |⌈Qα_i⌋ − Qα_i| ≤ ε
◮ ∄ Q ∈ {1, …, 2^{n^{O(1)}} · N} : max_{i=1,…,n} |⌈Qα_i⌋ − Qα_i| ≤ n^{O(1)/log log n} · ε
even if ε ≤ (1/2)^{n^{O(1)}}.
Directed Diophantine Approximation (DDA)

Theorem (Eisenbrand & R. - APPROX’09)
Given α_1, …, α_n, N, ε > 0 it is NP-hard to distinguish
◮ ∃ Q ∈ {N/2, …, N} : max_{i=1,…,n} |⌈Qα_i⌉ − Qα_i| ≤ ε
◮ ∄ Q ∈ {1, …, n^{O(1)/log log n} · N} : max_{i=1,…,n} |⌈Qα_i⌉ − Qα_i| ≤ 2^{n^{O(1)}} · ε
even if ε ≤ (1/2)^{n^{O(1)}}.

Theorem (Eisenbrand & R. - SODA’10)
Given α_1, …, α_n, w_1, …, w_n ≥ 0, N, ε > 0 it is NP-hard to distinguish
◮ ∃ Q ∈ [N/2, N] : Σ_{i=1}^{n} w_i (Qα_i − ⌊Qα_i⌋) ≤ ε
◮ ∄ Q ∈ [1, n^{O(1)/log log n} · N] : Σ_{i=1}^{n} w_i (Qα_i − ⌊Qα_i⌋) ≤ 2^{n^{O(1)}} · ε
even if ε ≤ (1/2)^{n^{O(1)}}.
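In the weighted directed variant the error Qα_i − ⌊Qα_i⌋ only ever measures rounding *down*, which is what makes the problem "directed". A brute-force sketch of the decision problem (names illustrative; again, exponential-size N makes enumeration hopeless in general):

```python
from fractions import Fraction
from math import floor

def weighted_dda(alphas, weights, N, eps):
    """Return some Q in {ceil(N/2), ..., N} with
    sum_i w_i * (Q*alpha_i - floor(Q*alpha_i)) <= eps, else None."""
    for Q in range(-(-N // 2), N + 1):    # Q in {ceil(N/2), ..., N}
        total = sum(w * (Q * a - floor(Q * a))
                    for a, w in zip(alphas, weights))
        if total <= eps:
            return Q
    return None

# alpha = (1/4, 1/6): only Q = 12 in [6, 12] makes both fractional parts vanish.
print(weighted_dda([Fraction(1, 4), Fraction(1, 6)], [1, 1], 12, Fraction(0)))  # 12
```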
Hardness of Response Time Computation

Theorem (Eisenbrand & R. - RTSS’08)
Computing response times for implicit-deadline tasks w.r.t. a Rate-monotonic schedule, i.e. solving

  min { r ≥ 0 | c(τ_n) + Σ_{i=1}^{n−1} ⌈r / p(τ_i)⌉ · c(τ_i) ≤ r }

(with p(τ_1) ≤ … ≤ p(τ_n)) is NP-hard (even to approximate within a factor of n^{O(1)/log log n}).

◮ Reduction from Directed Diophantine Approximation
Mixing Set

  min  c_s · s + c^T y
  s.t. s + a_i · y_i ≥ b_i   ∀ i = 1, …, n
       s ∈ R_{≥0}, y ∈ Z^n

◮ Complexity status? [Conforti, Di Summa & Wolsey ’08]

Theorem (Eisenbrand & R. - APPROX’09)
Solving Mixing Set is NP-hard.

Proof idea:
1. Model Directed Diophantine Approximation (almost) as a Mixing Set
2. Simulate the missing constraint with a Lagrangian relaxation
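For intuition about the problem's structure: once the integer vector y is fixed, the best continuous s is simply max(0, max_i(b_i − a_i·y_i)), so a toy instance can be solved by enumerating y over a small box. A sketch (names and the search box are illustrative; the slide's point is that the general problem is NP-hard, so no such enumeration scales):

```python
from itertools import product

def mixing_set(c_s, c, a, b, box=range(-5, 6)):
    """Brute-force a tiny Mixing Set instance:
    min c_s*s + c^T y  s.t.  s + a_i*y_i >= b_i, s >= 0, y integral,
    enumerating y over box^n (illustration only)."""
    best = None
    for y in product(box, repeat=len(a)):
        # optimal continuous s for this integer y
        s = max(0, max(bi - ai * yi for ai, bi, yi in zip(a, b, y)))
        val = c_s * s + sum(ci * yi for ci, yi in zip(c, y))
        if best is None or val < best[0]:
            best = (val, s, y)
    return best

# min s + y1 + y2  s.t.  s + 2*y1 >= 1,  s + 3*y2 >= 2,  s >= 0, y integral
print(mixing_set(c_s=1, c=[1, 1], a=[2, 3], b=[1, 2]))
```

On this instance the optimum value is 2 (attained, e.g., by s = 0, y = (1, 1)).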