
Meta-Complexity Theorems for Bottom-up Logic Programs

Harald Ganzinger, Max-Planck-Institut für Informatik
David McAllester, AT&T Bell-Labs Research

Introduction: logic programming of efficient algorithms; complexity analysis through meta-complexity theorems.


Reachability in Graphs

  s(u)  ⊃  r(u)                                  [O(|V|)]
  r(u) [O(|V|)] ∧ e(u, v) [+ O(|E|)]  ⊃  r(v)

The bracketed annotations bound the number of prefix firings contributed by each antecedent.

Theorem. Reachability can be decided in linear time.
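For concreteness, here is a minimal sketch of what bottom-up evaluation of these two rules amounts to, written in Python; the worklist with an edge index is one standard way to realize the O(|V| + |E|) bound, not the authors' implementation.

```python
from collections import defaultdict

def reachable(edges, sources):
    """Bottom-up closure of: s(u) => r(u); r(u), e(u,v) => r(v)."""
    succ = defaultdict(list)          # index edges e(u, v) by first argument
    for u, v in edges:
        succ[u].append(v)
    derived = set()                   # the facts r(u) derived so far
    agenda = list(sources)            # facts s(u) fire the first rule
    while agenda:
        u = agenda.pop()
        if u in derived:
            continue
        derived.add(u)                # fire r(u); now extend with the second rule
        agenda.extend(succ[u])        # one prefix firing per outgoing edge
    return derived

# Example: reachable({(1, 2), (2, 3), (4, 5)}, {1}) == {1, 2, 3}
```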

Interprocedural Reachability: Database

  program                          facts
   1  procedure main               proc(main, 2, 6)
   2  begin                        next(main, 2, 5)
   3    declare x: int
   4    read(x)                    call(main, p, 5, 6)
   5    call p(x)
   6  end
   7  procedure p(a: int)          proc(p, 8, 15)
   8  begin
   9    if a > 0 then              next(p, 8, 12)
  10      read(g)
  11      a := a - g               call(p, p, 12, 13)
  12      call p(a)                next(p, 13, 15)
  13      print(a)
  14    fi                         next(p, 8, 15)
  15  end

Interprocedural Reachability: Rules

Read "P ⇒ L" as "in procedure P label L can be reached".

  proc(P, B_P, E_P) [O(n)]  ⊃  P ⇒ B_P

  call(Q, P, L_c, L_r) [O(n)] ∧ proc(P, B_P, E_P) [*O(1)] ∧ P ⇒ E_P [*O(1)] ∧ Q ⇒ L_c [*O(1)]  ⊃  Q ⇒ L_r

  next(Q, L, L') [O(n)] ∧ Q ⇒ L [*O(1)]  ⊃  Q ⇒ L'

Theorem. IPR*(D) can be computed in time O(n), with n = ||D||.
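A naive fixpoint over the three rules, run on the example database above, is sketched below (Python, illustration only; the O(n) bound needs the indexed implementation from the proof, not this quadratic loop).

```python
# Example facts from the database slide.
proc = {("main", 2, 6), ("p", 8, 15)}
next_ = {("main", 2, 5), ("p", 8, 12), ("p", 13, 15), ("p", 8, 15)}
call = {("main", "p", 5, 6), ("p", "p", 12, 13)}

reach = {(p, b) for p, b, e in proc}              # proc(P,B,E) ⊃ P ⇒ B
changed = True
while changed:
    changed = False
    new = set()
    for q, l, l2 in next_:                        # next(Q,L,L') ∧ Q ⇒ L ⊃ Q ⇒ L'
        if (q, l) in reach:
            new.add((q, l2))
    for q, p, lc, lr in call:                     # call ∧ proc ∧ P ⇒ E_P ∧ Q ⇒ L_c ⊃ Q ⇒ L_r
        for p2, b, e in proc:
            if p2 == p and (p, e) in reach and (q, lc) in reach:
                new.add((q, lr))
    if not new <= reach:
        reach |= new
        changed = True

# reach now contains, e.g., ("main", 6): the end label of main is reachable.
```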

Proof of the Meta-Complexity Theorem I

• Assumption: all terms are kept in fully shared form
• Matching: O(1) (for atoms in rules against atoms in D)
• Unary rules A ⊃ B: matching A against each atom in R(D), plus constructing B, costs total time O(|R(D)|)
• Note: programs are not cons-free (new terms may be constructed)
• Problem: avoiding O(|R(D)|^k) for rules of length k
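One common way to realize the "fully shared form" assumption is hash consing; the following minimal Python sketch illustrates that representational assumption (it is not the authors' code).

```python
# Hash consing: structurally equal terms are represented by the same object,
# so equality tests during matching are single identity checks, i.e. O(1).
_table = {}

def mk(functor, *args):
    """Return the unique shared node for functor(args...); args are shared nodes."""
    key = (functor,) + args
    node = _table.get(key)
    if node is None:
        node = key                   # the tuple itself serves as the node
        _table[key] = node
    return node

t1 = mk("f", mk("a"), mk("b"))
t2 = mk("f", mk("a"), mk("b"))
assert t1 is t2                      # O(1) identity check replaces deep comparison
```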

Proof of the Meta-Complexity Theorem II

Data structure for rules ρ of the form p(X, Y) ∧ q(Y, Z) ⊃ r(X, Y, Z): for each value t of the shared variable Y, an index node ρ[t] carries a p-list (here p(a,t), p(b,t), p(c,t), p(d,t), p(e,t)) and a q-list (here q(t,u), q(t,v), q(t,w), q(t,s)).

Upon adding a fact p(e, t), fire all r(e, t, z) for z on the q-list of ρ[t].

The inference system can be transformed (maintaining π) so that it contains only unary rules and binary rules of the form ρ.
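A minimal Python sketch of this two-list index follows; the `fire` callback and the module-level dictionaries are illustrative choices, not from the slides.

```python
# Index for a binary rule  p(X, Y) ∧ q(Y, Z) ⊃ r(X, Y, Z),  keyed on the
# shared variable Y. Each new p- or q-fact does work proportional to the
# number of new rule firings it enables.
from collections import defaultdict

p_list = defaultdict(list)   # Y -> list of X with p(X, Y) seen so far
q_list = defaultdict(list)   # Y -> list of Z with q(Y, Z) seen so far

def add_p(x, y, fire):
    p_list[y].append(x)
    for z in q_list[y]:          # every stored q(y, z) completes a firing
        fire(("r", x, y, z))

def add_q(y, z, fire):
    q_list[y].append(z)
    for x in p_list[y]:          # every stored p(x, y) completes a firing
        fire(("r", x, y, z))

# add_p("e", "t", print) fires r(e, t, z) for every z on the q-list of t.
```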

Remarks

• memory consumption is often much smaller
• if R*(D) is infinite, consider R*(D) ∩ atoms(subterms(D)) ⇒ concept of local inference systems (Givan, McAllester 1993)
• in the presence of transitivity laws, complexity is in Ω(n³) (a transitivity rule p(x, y) ∧ p(y, z) ⊃ p(x, z) can have Ω(n³) prefix firings)

II. Redundancy, Deletion, and Priorities

Removal of Redundant Information

• redundant information causes inefficiency:
  D = { ..., dist(x) ≤ d, dist(x) ≤ d', d' < d, ... }  ⇒  delete dist(x) ≤ d
• notation: antecedents to be deleted are written in brackets [...]:
  ..., [A], ..., A', ..., [A''], ...  ⊃  B
• in the presence of deletion, computations are nondeterministic:
  P ⊃ Q    [Q] ⊃ S    [Q] ⊃ W
  ⇒ either S or W can be derived, but not both
• the nondeterminism is don't-care and/or restricted by priorities

Logic Programs with Priorities and Deletion

• rules can have antecedents to be deleted after firing
• priorities are assigned to rule schemes
• computation states S contain positive and negative (deleted) atoms
• A is visible in S if A ∈ S and ¬A ∉ S (deletions are permanent)
• Γ ⊃ B is applicable in S if
  – each atom in Γ is visible in S, and
  – the rule application changes S (by adding B or some ¬A)
• S is visible to a rule if no higher-priority rule is applicable in S
• computations are maximal sequences of applications of visible rules
• the final state of a computation starting with D is called an (R-)saturation of D
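A minimal Python sketch of this computation model follows, with rules represented as opaque functions that pick one applicable instance (a simplification for illustration; naive matching, no indexing, not the authors' implementation).

```python
def saturate(facts, rules):
    """rules: list of (priority, apply) pairs; larger number = higher priority.
    apply(visible) returns (additions, deletions) for one chosen instance,
    or None if no instance has all its antecedents visible."""
    state, deleted = set(facts), set()
    while True:
        visible = state - deleted
        for _, apply_rule in sorted(rules, key=lambda r: -r[0]):
            result = apply_rule(visible)
            if result is None:
                continue
            adds, dels = result
            if adds - state or dels - deleted:    # applicability requires changing S
                state |= adds
                deleted |= dels                   # deletions are permanent
                break                             # restart at the highest priority
        else:
            return visible                        # no rule applicable: a saturation
```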

Second Meta-Complexity Theorem

Let C = S_0, S_1, ..., S_T be a computation.

Prefix firing in C: a pair (rσ, i) such that for some 0 ≤ t < T:
  – r = A_1 ∧ ... ∧ A_i ∧ ... ∧ A_n ⊃ A_0 ∈ R
  – S_t is visible to r
  – A_jσ is visible in S_t, for 1 ≤ j ≤ i

Prefix count: π_R(D) = max { |p.f.(C)| : C a computation from D }

Theorem [Ganzinger/McAllester 2001]. Let R be an inference system such that R(D) is finite. Then some R-saturation of D can be computed in time O(||D|| + π_R(D)).

Proof as before, but also using constant-length priority queues.

Note: again, prefix firings count only once; priorities are for free.

Union-Find

We are interested in x ≐ y, defined as ∃z (x ⇒! z ∧ y ⇒! z).

Naive Knuth/Bendix completion:

  (Refl)    find(x)  ⊃  x ⇒! x
  (N)       x ⇒! y [O(n²)] ∧ y ⇒ z [*O(n)]  ⊃  x ⇒! z
  (Comm)    x ⇒ y [O(n²)] ∧ x ⇒ z [*O(n)]  ⊃  union(y, z)
  (Init)    union(x, y)  ⊃  find(x), find(y)
  (Orient)  x ⇒! z_1 ∧ y ⇒! z_2 ∧ union(x, y)  ⊃  z_1 ⇒ z_2

+ normalization (eager path compression): the find-edge is deleted as it is advanced, and trivial unions are discarded:

  (Refl)    find(x) [O(n)]  ⊃  x ⇒! x
  (N)       [[x ⇒! y]] [O(n²)] ∧ y ⇒ z [*O(1)]  ⊃  x ⇒! z
  (Comm)    x ⇒ y ∧ x ⇒ z [*O(1)]  ⊃  union(y, z)
  (Init)    union(x, y)  ⊃  find(x), find(y)
  (Triv)    [[union(x, y)]] ∧ x ⇒! z ∧ y ⇒! z  ⊃  ⊤
  (Orient)  [[union(x, y)]] ∧ x ⇒! z_1 ∧ y ⇒! z_2  ⊃  z_1 ⇒ z_2

+ logarithmic merge (union by weight); only the changed rules are shown:

  (Refl)    find(x)  ⊃  x ⇒! x, weight(x, 1)
  (N)       [[x ⇒! y]] [O(n log n)] ∧ y ⇒ z [*O(1)]  ⊃  x ⇒! z
  (Orient)  [[union(x, y)]] ∧ x ⇒! z_1 ∧ weight(z_1, w_1) ∧ y ⇒! z_2 ∧ [[weight(z_2, w_2)]] ∧ w_1 ≤ w_2
            ⊃  z_1 ⇒ z_2, weight(z_2, w_1 + w_2)
            + symmetric variant of (Orient)
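The rule system above mirrors the classical imperative union-find; for comparison, a sketch of that classical structure with union by weight and path compression (illustration only, not the rule-based formulation).

```python
class UnionFind:
    def __init__(self):
        self.parent = {}              # the edges  x ⇒ parent[x]
        self.weight = {}              # weight(root, w)

    def find(self, x):
        if x not in self.parent:
            self.parent[x] = x        # (Refl): x ⇒! x, weight(x, 1)
            self.weight[x] = 1
        root = x
        while self.parent[root] != root:
            root = self.parent[root]  # (N): follow y ⇒ z edges
        while self.parent[x] != root: # path compression
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)   # (Init): find(x), find(y)
        if rx == ry:
            return                            # (Triv)
        if self.weight[rx] > self.weight[ry]:
            rx, ry = ry, rx                   # point the lighter root at the heavier
        self.parent[rx] = ry                  # (Orient): z1 ⇒ z2
        self.weight[ry] += self.weight[rx]    # weight(z2, w1 + w2)

    def equal(self, x, y):                    # x ≐ y: same root
        return self.find(x) == self.find(y)
```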

Congruence Closure for Ground Horn Clauses

Extension to congruence closure: 7 more rules; guaranteed optimal complexity O(m + n log n), where m = |union assertions| and n = |(sub)terms|.

Extension to ground Horn clauses with equality: 13 more rules.

Theorem [Ganzinger/McAllester 01]. Satisfiability of a set D of ground Horn clauses with equality can be decided in time O(||D|| + n log n + min(m log n, n²)), where m is the number of antecedents and input clauses and n is the number of terms. This is optimal (= O(||D||)) whenever m is in Ω(n²).

Logic view: we can (partly) deal with logic programs with equality.

Applications: several program analysis algorithms (Steensgaard, Henglein).
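For orientation, a naive textbook congruence-closure fixpoint over ground terms is sketched below; it is quadratic, not the O(m + n log n) 7-rule system referred to above, and the term representation (constants as strings, applications as tuples) is an illustrative choice.

```python
# Close a set of ground equations under congruence: f(s1..sk) = f(t1..tk)
# whenever the arguments are already equal. `terms` must contain every subterm.
def congruence_closure(terms, equations):
    parent = {t: t for t in terms}

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]      # path halving
            t = parent[t]
        return t

    def union(s, t):
        s, t = find(s), find(t)
        if s != t:
            parent[s] = t

    for s, t in equations:                     # the asserted equations
        union(s, t)
    changed = True
    while changed:                             # repeat until no new merges
        changed = False
        signatures = {}
        for t in terms:
            if isinstance(t, tuple):           # application term (f, arg1, ..., argk)
                sig = (t[0],) + tuple(find(a) for a in t[1:])
                if sig in signatures and find(t) != find(signatures[sig]):
                    union(t, signatures[sig])  # congruence forces the merge
                    changed = True
                else:
                    signatures.setdefault(sig, t)
    return find

# Example: from a = b we get f(a) = f(b).
terms = ["a", "b", ("f", "a"), ("f", "b")]
find = congruence_closure(terms, [("a", "b")])
assert find(("f", "a")) == find(("f", "b"))
```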

Formal Notion of Redundancy

Let ≻ be a well-founded ordering on ground atoms.

Definition. A is redundant in S (denoted A ∈ Red(S)) whenever A_1, ..., A_n ⊨_R A, with the A_i in S such that A_i ≺ A.

Properties: stable under enrichments and under deletion of redundant atoms.

Definition. S is saturated up to redundancy wrt R if R(S \ Red(S)) ⊆ S ∪ Red(S).

Theorem. If deletion is based on redundancy, then the result of every computation is saturated wrt R up to redundancy.

Corollary. Priorities are logically irrelevant ⇒ choose them so as to minimize prefix firings.

Deletions based on Redundancy

Criterion: if r = [A_1], ..., [A_k], B_1, ..., B_m ⊃ B and S ∪ {A_1σ, ..., A_kσ, B_1σ, ..., B_mσ} is visible to r, then A_iσ ∈ Red(S ∪ {B_1σ, ..., B_mσ, Bσ}).

Union-find example: not so easy to check; needs proof orderings à la Bachmair and Dershowitz.

Note: redundancy should also be efficiently decidable.

III. Instance-based Priorities

Shortest Paths

  (Init)            ⊃  dist(src) ≤ 0
  (Upd)   [[dist(x) ≤ d]] ∧ dist(x) ≤ d' ∧ d' < d  ⊃  ⊤
  (Add)   dist(x) ≤ d ∧ x →^c y  ⊃  dist(y) ≤ c + d

Correctness: obvious; deletion is based on redundancy.

Priorities (Dijkstra): always choose an instance of (Add) where d is minimal ⇒ allow instance-based rule priorities:
  (Init) > (Upd) > (Add)[n/d] > (Add)[m/d], for m > n

Prefix firing count: O(|E|), but Dijkstra's algorithm runs in time O(|E| + |V| log |V|) ⇒ one cannot expect a linear-time meta-complexity theorem for instance-based priorities.
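The prioritized rules are Dijkstra's algorithm; a standard heap-based sketch in Python (O((|V| + |E|) log |V|), illustration only), with comments mapping back to the rules.

```python
import heapq

def dijkstra(edges, src):
    """edges: dict mapping x to a list of (cost, y) pairs."""
    dist = {src: 0}                       # (Init): dist(src) <= 0
    heap = [(0, src)]                     # realizes "fire (Add) for smallest d first"
    while heap:
        d, x = heapq.heappop(heap)
        if d > dist.get(x, float("inf")): # stale bound: already deleted by (Upd)
            continue
        for c, y in edges.get(x, []):     # (Add): dist(x) <= d, x ->c y
            if d + c < dist.get(y, float("inf")):
                dist[y] = d + c           # better bound replaces the old one
                heapq.heappush(heap, (d + c, y))
    return dist

# Example: dijkstra({"a": [(1, "b")], "b": [(2, "c")]}, "a") == {"a": 0, "b": 1, "c": 3}
```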

Minimum Spanning Tree

Basis: the union-find module.

  (Add)  [[x ↔^c y]]  ⊃  mst(x, c, y), union(x, y)
  (Del)  [[x ↔^c y]] ∧ x ⇒! z ∧ y ⇒! z  ⊃  ⊤

Priorities (here needed for correctness):
  union-find > (Del) > (Add)[n/c] > (Add)[m/c], for m > n

Prefix firing count: O(|E| + |V| log |V|)
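With the cost-ordered instance priorities, the rules implement Kruskal's algorithm; a compact Python sketch for comparison (illustration only, with an unweighted union-find for brevity).

```python
def minimum_spanning_tree(vertices, edges):
    """edges: iterable of (cost, x, y) triples."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    mst = []
    for c, x, y in sorted(edges, key=lambda e: e[0]):  # (Add)[n/c] before (Add)[m/c], m > n
        rx, ry = find(x), find(y)
        if rx == ry:
            continue                                   # (Del): edge would close a cycle
        parent[rx] = ry                                # union(x, y)
        mst.append((x, c, y))                          # mst(x, c, y)
    return mst
```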

3rd Meta-Complexity Theorem

Programs: as before, but priorities of rule instances depend on the first atom in the antecedent and can be computed from that atom in constant time.

Theorem [in preparation]. Let R be an inference system such that R*(D) is finite. Then some R-saturation of D can be computed in time O(||D|| + π_R(D) log p), where p is the number of different priorities assigned to atoms in R*(D).

Corollary. The 2nd meta-complexity theorem is a special case.

Proof: technically involved; uses priority queues with log-time operations; memory usage is worse.
