Deduction and Induction: A Match Made in Heaven or a Deal with the Devil?

Stephan Schulz

The Machine Learning Inference Engine
Agenda

◮ Search and choice points in saturating theorem proving
◮ Basic questions about learning
◮ Learning from performance data
  ◮ Classification and heuristic selection
  ◮ Parameters for clause selection
◮ Learning from proofs and search graphs
  ◮ Proof extraction
  ◮ Learning clause evaluations (?)
◮ Conclusion
Theorem Proving: Big Picture

Real World Problem → Formalized Problem:

∀X: human(X) → mortal(X)
∀X: philosopher(X) → human(X)
philosopher(socrates)
?⊨ mortal(socrates)

ATP (Proof Search) → Proof, Countermodel, or Timeout
Contradiction and Saturation

◮ Proof by contradiction
  ◮ Assume negation of conjecture
  ◮ Show that axioms and negated conjecture imply falsity
◮ Saturation
  ◮ Convert problem to Clause Normal Form [Diagram: formula set → clausifier → equi-satisfiable clause set]
  ◮ Systematically enumerate logical consequences of axioms and negated conjecture
  ◮ Goal: Explicit contradiction (empty clause)
◮ Redundancy elimination
  ◮ Use contracting inferences to simplify or eliminate some clauses

Search control problem: How and in which order do we enumerate consequences?
Proof Search and Choice Points

◮ First-order logic is semi-decidable
  ◮ Provers search for a proof in an infinite space
  ◮ . . . of possible derivations
  ◮ . . . of possible consequences
◮ Major choice points of the superposition calculus:
  ◮ Term ordering (which terms are bigger?)
  ◮ (Negative) literal selection
  ◮ Selection of clauses for inferences (with the given-clause algorithm)
Term Ordering and Literal Selection

◮ Negative superposition with selection

      C ∨ s ≃ t      D ∨ u ≄ v
      ───────────────────────────
      (C ∨ D ∨ u[p ← t] ≄ v)σ

  ◮ if σ = mgu(u|p, s)
  ◮ and (s ≃ t)σ is ≻-maximal in (C ∨ s ≃ t)σ
  ◮ and s is ≻-maximal in (s ≃ t)σ
  ◮ and u ≄ v is selected in D ∨ u ≄ v
  ◮ and u is ≻-maximal in (u ≄ v)σ

◮ Choice points:
  ◮ ≻ is a ground-total rewrite ordering
    ◮ Consistent throughout the proof search
    ◮ I.e. in practice determined up-front
  ◮ Any negative literal can be selected
    ◮ Current practice: Fixed scheme picked up-front
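Since σ = mgu(u|p, s) is central to the rule, here is a minimal sketch of first-order syntactic unification; the term encoding (uppercase strings as variables, tuples as function applications) is an assumption for illustration, not E's data structures.

```python
# Minimal first-order syntactic unification (illustrative sketch).
# Terms: variables are strings starting with an uppercase letter ('X'),
# constants are lowercase strings ('a'), applications are tuples ('f', 'X').

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def substitute(t, sigma):
    """Apply substitution sigma (a dict from variables to terms) to t."""
    if is_var(t):
        return substitute(sigma[t], sigma) if t in sigma else t
    if isinstance(t, tuple):
        return (t[0],) + tuple(substitute(a, sigma) for a in t[1:])
    return t  # constant

def occurs(x, t, sigma):
    """Occurs check: does variable x appear in t under sigma?"""
    t = substitute(t, sigma)
    if is_var(t):
        return x == t
    return isinstance(t, tuple) and any(occurs(x, a, sigma) for a in t[1:])

def mgu(s, t, sigma=None):
    """Most general unifier of s and t, or None if they don't unify."""
    sigma = dict(sigma or {})
    s, t = substitute(s, sigma), substitute(t, sigma)
    if s == t:
        return sigma
    if is_var(s):
        if occurs(s, t, sigma):
            return None          # occurs check failure
        sigma[s] = t
        return sigma
    if is_var(t):
        return mgu(t, s, sigma)
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        for a, b in zip(s[1:], t[1:]):
            sigma = mgu(a, b, sigma)
            if sigma is None:
                return None      # clash in a subterm
        return sigma
    return None                  # symbol clash

# f(X) and f(a) unify with {X -> a}; g(Y) and f(a) do not.
assert mgu(('f', 'X'), ('f', 'a')) == {'X': 'a'}
assert mgu(('g', 'Y'), ('f', 'a')) is None
```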
The Given-Clause Algorithm

[Diagram: the given clause g is picked from U (unprocessed clauses), checked for g = ☐?, simplified, checked for simplifiability, and used with P (processed clauses) to generate new clauses, which are cheaply simplified and added back to U]

◮ Aim: Move everything from U to P
◮ Invariant: All generating inferences with premises from P have been performed
◮ Invariant: P is interreduced
◮ Clauses added to U are simplified with respect to P
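A schematic rendering of the loop in Python; `evaluate`, `generate`, and `simplify` are placeholder callbacks for illustration, not E's actual interfaces.

```python
import heapq

def given_clause_loop(axioms, evaluate, generate, simplify):
    """Schematic given-clause saturation loop (illustrative only).

    evaluate(c): heuristic weight used to pick the next given clause
    generate(g, P): consequences of inferences between g and P ∪ {g}
    simplify(c, P): simplify c w.r.t. P; None if c is redundant
    """
    P = set()                                  # processed clauses
    U = [(evaluate(c), i, c) for i, c in enumerate(axioms)]
    heapq.heapify(U)                           # unprocessed clauses
    age = len(axioms)
    while U:
        _, _, g = heapq.heappop(U)             # choice point: clause selection
        g = simplify(g, P)                     # keep the given clause simplified
        if g is None:
            continue                           # g was redundant
        if not g:
            return "proof found"               # empty clause: contradiction
        P.add(g)
        for c in generate(g, P):               # restore the invariant on P
            c = simplify(c, P)                 # cheap simplification
            if c is not None:
                heapq.heappush(U, (evaluate(c), age, c))
                age += 1
    return "saturated: no proof exists"
```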
Choice Point: Clause Selection

[The given-clause diagram again; the highlighted choice point is the selection of the next given clause from U]
Induction for Deduction

◮ Question 1: What to learn from?
  ◮ Performance data (prover is a black box)
  ◮ Proofs (only final result of search is visible)
  ◮ Proof search graphs (most of search is visible)
◮ Question 2: What to learn?
  ◮ Here: Learn strategy selection
  ◮ Here: Learn parameterization for clause selection heuristics
  ◮ Here: Learn new clause evaluation functions
  ◮ . . .
Automatic Strategy Selection
Strategy Selection

Definition: A strategy is a collection of all search control parameters (see the sketch below)
◮ Term ordering
◮ Literal selection scheme
◮ Clause selection heuristic
◮ . . . (minor parameters)

◮ Observation: Different problems are simple for different strategies
◮ Question: Can we determine a good strategy (or set of strategies) up-front?
◮ Original: Manually coded automatic modes
  ◮ Based on developer intuition/insight/experience
  ◮ Limited success, high maintenance
◮ State of the art: Automatic generation of automatic modes
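To make the definition concrete, a strategy can be modeled as an immutable record of control parameters; the field and scheme names below are hypothetical placeholders, not E's actual option names.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Strategy:
    """A fixed collection of search control parameters (illustrative)."""
    term_ordering: str                              # e.g. a KBO or LPO variant
    literal_selection: str                          # negative literal selection scheme
    clause_selection: Tuple[Tuple[str, int], ...]   # (evaluation fn, pick frequency)
    # ... further minor parameters omitted

# A hypothetical strategy mixing two clause selection schemes 5:1
example = Strategy(
    term_ordering="KBO",
    literal_selection="SelectMaxLComplex",
    clause_selection=(("SymbolWeight", 5), ("FIFO", 1)),
)
```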
“Learning” Heuristic Selection

[Diagram: the TPTP problem library partitioned into classes by feature-based classification]

◮ Feature-based classification
◮ Assign strategies to classes based on collected performance data from previous experiments
  ◮ Simplest: Always pick the best strategy in the class
  ◮ If no data, pick the globally best strategy
◮ Example features (see the sketch below)
  ◮ Number of clauses
  ◮ Arity of symbols
  ◮ Unit/Horn/Non-Horn
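A minimal sketch of this scheme under simplifying assumptions: the features, thresholds, class table, and strategy names are invented for illustration; E's real classification uses more detailed features and performance data.

```python
def features(clauses):
    """Coarse feature vector for a clause set.

    A clause is a list of literals; a literal is a (sign, atom) pair
    with sign True for positive literals. Illustrative encoding only.
    """
    n = len(clauses)
    if max(len(c) for c in clauses) == 1:
        cls = "unit"                      # every clause is a unit clause
    elif all(sum(sign for sign, _ in c) <= 1 for c in clauses):
        cls = "horn"                      # at most one positive literal each
    else:
        cls = "non-horn"
    size = "small" if n < 100 else "medium" if n < 1000 else "large"
    return (cls, size)

# Class -> best strategy, from collected performance data (fabricated)
BEST_STRATEGY = {
    ("unit", "small"): "strategy_07",
    ("horn", "large"): "strategy_21",
}
GLOBAL_BEST = "strategy_01"               # fallback if the class has no data

def pick_strategy(clauses):
    return BEST_STRATEGY.get(features(clauses), GLOBAL_BEST)
```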
Auto Mode Performance

[Plot: number of solved TPTP 5.6.0 CNF&FOF problems (y-axis, 6000–11000) over an x-axis of 0–300, comparing E 1.8 Auto and E 1.8 Best]
A Caveat

[Recap of the feature-based classification slide: strategies assigned to classes based on performance data; example features: number of clauses, arity of symbols, Unit/Horn/Non-Horn]

◮ Features based on developer…
  ◮ …intuition
  ◮ …insight
  ◮ …experience
Current Work: Learning Classification

[Handwritten goals: a) a better auto mode (that is the practical goal), and b) to better understand which features influence search (that is the theoretical goal)]

◮ Characterize problems by performance vectors
  ◮ Which strategy solved the problem how fast?
◮ Unsupervised clustering of problems based on performance
  ◮ Each cluster contains problems on which the same strategies perform well
◮ Feature extraction: Try to find a characterization of the clusters
  ◮ E.g. based on the feature set
  ◮ E.g. using nearest-neighbour approaches

[Handwritten annotations: literature pointers would be good here, e.g. the E 1.8 paper and something on clustering; "Quite good for a start, thanks!"; "My Bachelor student Ayatallah just started work on this topic - results in 6 months"]
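A minimal sketch of the clustering step, assuming scikit-learn and a fabricated performance matrix; relating clusters back to syntactic problem features would be a separate step.

```python
import numpy as np
from sklearn.cluster import KMeans

# Performance vectors: rows = problems, columns = strategies,
# entries = solution time in seconds, capped at a 300 s timeout.
times = np.array([
    [  1.2, 300.0,  14.0],    # problem 1
    [  0.9, 300.0,  11.5],    # problem 2: similar profile to problem 1
    [300.0,   2.1, 300.0],    # problem 3
    [300.0,   1.8, 300.0],    # problem 4: similar profile to problem 3
])

# Cluster problems on which the same strategies perform well.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(times)
print(labels)   # e.g. [0 0 1 1]: problems 1/2 and 3/4 behave alike

# Next step (not shown): characterize each cluster via problem
# features, e.g. with a nearest-neighbour classifier over feature space.
```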
Learning parameterization for clause selection heuristics
Reminder: Choice Point Clause Selection

[The given-clause diagram again; the choice point is the selection of the next given clause from U]
Basic Approaches to Clause Selection

◮ Symbol counting
  ◮ Pick the smallest clause in U
  ◮ |{f(X) ≠ a, P(a) ≠ $true, g(Y) = f(a)}| = 10
◮ FIFO
  ◮ Always pick the oldest clause in U
◮ Flexible weighting
  ◮ Symbol counting, but give different weights to different symbols
  ◮ E.g. lower weight to symbols from the goal!
  ◮ E.g. higher weight for symbols in inference positions
◮ Combinations
  ◮ Interleave different schemes (see the sketch below)
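A minimal sketch combining weighted symbol counting with FIFO interleaving; the weight values, the 5:1 pick ratio, and the clause encoding are illustrative assumptions, not E's defaults.

```python
import heapq
from itertools import count

def symbol_weight(sym, goal_symbols):
    """Flexible weighting: goal symbols count 1, everything else 2."""
    return 1 if sym in goal_symbols else 2

def clause_weight(clause, goal_symbols):
    """Weighted symbol counting over a clause given as a list of
    symbol sequences, one per literal (illustrative encoding)."""
    return sum(symbol_weight(s, goal_symbols)
               for literal in clause for s in literal)

class ClauseSelector:
    """Interleave two schemes: pick by weight most of the time,
    but every `ratio`-th pick take the oldest clause (FIFO)."""

    def __init__(self, goal_symbols, ratio=5):
        self.by_weight = []        # heap ordered by clause weight
        self.by_age = []           # heap ordered by insertion order
        self.tick = count()
        self.picks = 0
        self.ratio = ratio
        self.goal_symbols = goal_symbols
        self.removed = set()       # ages of clauses already returned

    def push(self, clause):
        age = next(self.tick)
        w = clause_weight(clause, self.goal_symbols)
        heapq.heappush(self.by_weight, (w, age, clause))
        heapq.heappush(self.by_age, (age, clause))

    def pop(self):
        self.picks += 1
        fifo_turn = self.picks % self.ratio == 0
        heap = self.by_age if fifo_turn else self.by_weight
        while heap:
            entry = heapq.heappop(heap)
            age = entry[0] if fifo_turn else entry[1]
            if age not in self.removed:
                self.removed.add(age)   # lazy deletion in the other heap
                return entry[-1]
        return None                     # U is empty
```

In E, such combinations are expressed as weighted mixtures of priority queues, with each evaluation function picked at a given relative frequency.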