Knowledge Engineering, Semester 2, 2004-05


Lecture 18 – Knowledge Evolution II: Inductive Logic Programming
Michael Rovatsos (mrovatso@inf.ed.ac.uk)
Informatics UoE, 15th March 2005

Where are we?

Last time:
◮ Knowledge Evolution
◮ Truth Maintenance Systems (JTMS, ATMS)
◮ Knowledge in Learning
◮ Explanation-based Learning

Today:
◮ Inductive Logic Programming

Inductive Logic Programming (ILP)
◮ A rigorous approach to the knowledge-based inductive learning problem
◮ Methods for inducing general, first-order theories from examples
◮ Using FOL to represent learning hypotheses is useful where attribute-based methods (e.g. decision trees) fail
◮ In particular, ILP allows for capturing relationships between objects rather than only their attributes
◮ The hypotheses generated are relatively easy for humans to understand

Today's lecture
◮ We will first discuss an extended example . . .
◮ . . . then present a method for top-down ILP
◮ . . . look at inverse resolution methods
◮ . . . and finally discuss the ability of ILP to make discoveries

An Example
◮ Recall the entailment constraint of the general knowledge-based induction problem:
    Background ∧ Hypothesis ∧ Descriptions ⊨ Classifications
◮ Example: learning family relationships from examples
◮ Descriptions are given by the following family tree (parents listed with their children):
    George and Mum: Elizabeth, Margaret
    Spencer and Kydd: Diana
    Elizabeth and Philip: Charles, Anne, Andrew, Edward
    Diana and Charles: William, Harry
    Anne and Mark: Peter, Zara
    Andrew and Sarah: Beatrice, Eugenie
◮ Corresponding logical facts:
    Father(Philip, Charles)    Father(Philip, Anne)    . . .
    Mother(Mum, Margaret)      Mother(Mum, Elizabeth)  . . .
    Married(Diana, Charles)    Married(Elizabeth, Philip)  . . .
    Male(Philip)    Male(Charles)    Female(Beatrice)    Female(Margaret)    . . .
◮ Target concept to be learned: Grandparent; the complete set of classifications would be 20 × 20 = 400 facts of the form
    Grandparent(Mum, Charles)     Grandparent(Elizabeth, Beatrice)
    ¬Grandparent(Mum, Harry)      ¬Grandparent(Spencer, Peter)    . . .

◮ Suppose Background is empty
◮ One possible hypothesis:
    Grandparent(x, y) ⇔ [∃z Mother(x, z) ∧ Mother(z, y)]
                       ∨ [∃z Mother(x, z) ∧ Father(z, y)]
                       ∨ [∃z Father(x, z) ∧ Mother(z, y)]
                       ∨ [∃z Father(x, z) ∧ Father(z, y)]
◮ What would an attribute-based learning algorithm do here?
    ◮ Turn pairs into objects: Grandparent(⟨Mum, Charles⟩)
    ◮ Descriptions become hard to represent, e.g. FirstElementIsMotherOfElizabeth(⟨Mum, Charles⟩)
    ◮ The definition of Grandparent would become a large disjunction with no generalisation capabilities
◮ The principal advantage of ILP is its applicability to relational predicates, which covers a much wider range of problems

◮ Additional background knowledge can be used to obtain more concise hypotheses
◮ Suppose we know Parent(x, y) ⇔ [Mother(x, y) ∨ Father(x, y)]
◮ Then we could represent our previous hypothesis as
    Grandparent(x, y) ⇔ [∃z Parent(x, z) ∧ Parent(z, y)]
◮ An even more interesting property of ILP algorithms is the creation of new predicates (e.g. Parent)
◮ Constructive induction is one of the hardest problems in machine learning, but some ILP methods can do it!
◮ We discuss two methods: a generalisation of decision-tree methods, and a technique based on inverting resolution proofs
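The concise hypothesis is small enough to execute directly. The following is a minimal sketch (my own illustration, using only a fragment of the family facts, not code from the lecture) that derives Grandparent pairs from Mother/Father facts via the Parent definition:

```python
# Minimal sketch (illustration only) of the concise hypothesis
#   Grandparent(x, y) <=> exists z. Parent(x, z) and Parent(z, y)
# with Parent defined from Mother/Father, over a fragment of the family facts.

father = {("George", "Elizabeth"), ("Philip", "Charles"), ("Philip", "Anne"),
          ("Charles", "William"), ("Charles", "Harry")}
mother = {("Mum", "Elizabeth"), ("Elizabeth", "Charles"), ("Elizabeth", "Anne"),
          ("Diana", "William"), ("Diana", "Harry")}

# Background knowledge: Parent(x, y) <=> Mother(x, y) v Father(x, y)
parent = father | mother

# Hypothesis: Grandparent(x, y) holds iff some z links x to y via Parent
grandparent = {(x, y) for (x, z) in parent for (w, y) in parent if z == w}

print(("George", "Charles") in grandparent)  # True: a positive classification
print(("Mum", "Harry") in grandparent)       # False: a negative classification
```

With the facts for all 20 people filled in, checking every ordered pair of people against this set would reproduce the 400 classifications.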

FOIL: Top-Down Inductive Learning
◮ Grow a hypothesis starting from a very general rule, but using a set of first-order clauses rather than a decision tree (the clauses used are Horn clauses with negation as failure)
◮ More specialised clauses are generated by adding conditions to the rule in the following way:
    ◮ Literals can be added using predicates (including the goal predicate) with only variables as their arguments
    ◮ Each literal must include at least one variable already appearing in the rule
    ◮ Equality and inequality constraints and arithmetic comparisons can also be added
◮ This gives a large branching factor, but typing information may be used to reduce it
◮ The heuristic for the choice of literal is similar to information gain, and hypotheses that are longer than the total length of the examples are removed

Example
Suppose we are trying to learn the Grandfather relation:
1. Split the examples into positive and negative ones (12/388):
    +: ⟨George, Anne⟩, ⟨Philip, Peter⟩, . . .
    −: ⟨Mum, Harry⟩, ⟨Spencer, Peter⟩, . . .
2. Construct a set of clauses, each with Grandfather(x, y) as head:
    ◮ Start with true ⇒ Grandfather(x, y)
    ◮ This classifies the negative examples as true, so specialise it
    ◮ Generate possible hypotheses by adding a literal to the LHS:
        Father(x, y) ⇒ Grandfather(x, y)
        Parent(x, y) ⇒ Grandfather(x, y)
        Father(x, z) ⇒ Grandfather(x, y)
    ◮ Prefer the one that classifies most data correctly (here: the third one)
3. Repeat these steps until all data is correctly classified

Inductive Learning with Inverse Resolution
◮ Basic idea: invert the normal deductive proof process
◮ Recall the resolution rule: from α ∨ β and ¬β ∨ γ, infer α ∨ γ
◮ Resolution is complete, so one must be able to prove
    Background ∧ Hypothesis ∧ Descriptions ⊨ Classifications
◮ If we can "run the proof backward", we should be able to find a Hypothesis such that the proof succeeds
◮ An inverse single resolution step either takes the resolvent and produces two clauses, or takes the resolvent and one clause and produces one new clause

Example
◮ Take the positive example Grandparent(George, Anne), start with the empty clause (i.e. a contradiction) and construct the following proof backwards:
    ¬Parent(x, z) ∨ ¬Parent(z, y) ∨ Grandparent(x, y)
      resolved with Parent(George, Elizabeth) under {x/George, z/Elizabeth} gives
    ¬Parent(Elizabeth, y) ∨ Grandparent(George, y)
      resolved with Parent(Elizabeth, Anne) under {y/Anne} gives
    Grandparent(George, Anne)
◮ We write ¬Parent(x, z) ∨ ¬Parent(z, y) ∨ Grandparent(x, y) as
    Parent(x, z) ∧ Parent(z, y) ⇒ Grandparent(x, y)
◮ We then have a resolution proof that descriptions, hypothesis and background knowledge entail the classification Grandparent(George, Anne)
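FOIL's literal-selection step for the Grandfather example can be sketched in toy form. This is my own illustration, not Quinlan's FOIL: it scores each candidate literal simply as positives covered minus negatives covered (real FOIL uses an information-gain-style measure), over a hand-picked fragment of the family facts:

```python
# Toy sketch (illustration only, not real FOIL) of choosing the best literal
# to add to the clause  <literal> => Grandfather(x, y).

father = {("George", "Elizabeth"), ("Philip", "Charles"), ("Philip", "Anne")}
mother = {("Mum", "Elizabeth"), ("Elizabeth", "Anne"), ("Anne", "Peter")}
parent = father | mother

positives = {("George", "Anne"), ("Philip", "Peter")}   # Grandfather pairs
negatives = {("Mum", "Harry"), ("Spencer", "Peter")}

# Candidate literals as tests on a head binding (x, y); the free variable z
# in Father(x, z) is handled by existential search over the known fathers.
candidates = {
    "Father(x,y)": lambda x, y: (x, y) in father,
    "Parent(x,y)": lambda x, y: (x, y) in parent,
    "Father(x,z)": lambda x, y: any(f == x for f, _ in father),
}

def score(test):
    # positives still covered minus negatives still covered
    return (sum(test(x, y) for x, y in positives)
            - sum(test(x, y) for x, y in negatives))

best = max(candidates, key=lambda name: score(candidates[name]))
print(best)  # Father(x,z): keeps both positives and excludes both negatives
```

The winning clause Father(x, z) ⇒ Grandfather(x, y) still misclassifies some examples, so the loop would continue, e.g. by next adding Parent(z, y) to the body.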

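A hypothesis clause recovered by inverse resolution can be checked by replaying the proof forwards. The following minimal sketch (my own illustration, assuming ground facts and representing each literal as a (positive?, predicate, args) tuple with a naive substitution) resolves the hypothesis clause against the two Parent facts and recovers the classification:

```python
# Replay the slide's resolution proof forwards (illustration only).

def substitute(clause, theta):
    """Apply a variable binding like {"x": "George"} to every literal."""
    return [(sign, pred, tuple(theta.get(a, a) for a in args))
            for sign, pred, args in clause]

def resolve(clause, fact):
    """Resolve away the negative literal matching a positive ground fact."""
    _, fpred, fargs = fact
    rest = [lit for lit in clause if lit != (False, fpred, fargs)]
    assert len(rest) == len(clause) - 1, "fact must resolve exactly one literal"
    return rest

# Hypothesis clause: not Parent(x,z) v not Parent(z,y) v Grandparent(x,y)
hyp = [(False, "Parent", ("x", "z")), (False, "Parent", ("z", "y")),
       (True, "Grandparent", ("x", "y"))]

c1 = substitute(hyp, {"x": "George", "z": "Elizabeth"})
c2 = resolve(c1, (True, "Parent", ("George", "Elizabeth")))
c3 = substitute(c2, {"y": "Anne"})
c4 = resolve(c3, (True, "Parent", ("Elizabeth", "Anne")))
print(c4)  # [(True, 'Grandparent', ('George', 'Anne'))]
```

Inverse resolution runs these steps in the opposite direction: starting from the classification, it guesses the substitutions and parent clauses that would have produced each resolvent.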