
Knowledge Engineering
Semester 2, 2004-05

Outline:
◮ Knowledge Evolution
◮ Truth Maintenance Systems
◮ Knowledge in Learning
◮ Summary


Where are we? Knowledge Engineering
Michael Rovatsos (mrovatso@inf.ed.ac.uk)
Informatics UoE, 11th March 2005

In the last few lectures . . .
◮ Knowledge Synthesis
◮ Automated Software Synthesis
◮ Agents & Multiagent Systems
◮ Semantic Web & Knowledge Engineering

In the final two lectures . . .
◮ Knowledge Evolution
◮ Today (Lecture 17, Knowledge Evolution I: TMS & EBL):
  ◮ Belief Revision: Truth Maintenance Systems
  ◮ Knowledge in Learning: Explanation-Based Learning

Knowledge Evolution

◮ So far, we have discussed knowledge acquisition, representation & reasoning, and synthesis as if we were always building systems from scratch
◮ In real-world applications, we expect our KBS to operate over an extended period of time in an environment that changes
◮ How do we deal with a changing world, given our current knowledge?
◮ Knowledge evolution, in this sense, denotes the evolution of existing knowledge in the light of new information
◮ Human involvement in the design and implementation of KBS is also an issue, but we will focus on computational aspects
◮ Today: belief revision & learning with prior knowledge

Truth Maintenance Systems (TMS)

◮ In the section on non-monotonic reasoning, we mentioned that some inferences have only default status until more specific information is known
◮ More general problem: belief revision, i.e. if we add ¬P to a KB that contains P, how do we make sure all inferences drawn from P are retracted?
◮ If P ⇒ Q, we have to retract Q as well . . .
◮ . . . but what if also R ⇒ Q?
◮ Truth maintenance systems (TMS) deal with this problem
◮ Naive approach:
  ◮ Number all facts P1 to Pn in the order in which they were added to the KB
  ◮ If Pi is removed, go back to the state before the addition of Pi and add Pi+1 to Pn (and everything inferred from them) again
◮ Simple, but impractical!
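The naive numbered-facts scheme can be sketched in a few lines. This is an illustrative toy, not the lecture's implementation: the rule encoding, `infer`, `close`, and `naive_retract` are all assumed names, and the point is only that retracting one fact forces a full rebuild of everything added after it.

```python
# Toy rule set: P => Q, R => Q (premises encoded as frozensets of fact names)
RULES = [(frozenset({"P"}), "Q"), (frozenset({"R"}), "Q")]

def infer(kb):
    """Hypothetical one-step forward inference; returns facts derivable from kb."""
    derived = set()
    for premises, conclusion in RULES:
        if premises <= kb:          # all antecedents present in the KB
            derived.add(conclusion)
    return derived

def close(kb):
    """Run inference to a fixed point."""
    kb = set(kb)
    while True:
        new = infer(kb) - kb
        if not new:
            return kb
        kb |= new

def naive_retract(ordered_facts, i):
    """Rebuild the KB without the i-th base fact: re-derive everything else."""
    kept = ordered_facts[:i] + ordered_facts[i + 1:]
    return close(kept)

kb = naive_retract(["P", "R"], 0)   # retract P
print("Q" in kb)                    # Q survives, because R => Q still fires
```

The cost is the problem: every retraction replays the whole inference history, which is exactly what justification bookkeeping (next slides) avoids.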

Justification-Based TMS (JTMS)

◮ Based on the idea of annotating each fact with its “justification” (the set of logical sentences from which it was inferred)
◮ Example: a forward-chaining KBS that adds sentences it can infer from existing ones automatically
◮ Using a JTMS, it will add Q to the KB because of P and P ⇒ Q, and annotate it with {P, P ⇒ Q}
◮ A sentence can have several justifications
◮ If P is to be retracted from the KB, all sentences that require P in every justification have to be removed, too
◮ Obvious advantage: when retracting P, only those sentences derived from P have to be considered (not all those inferred since P had been added)
◮ A JTMS marks sentences as in or out (rather than deleting them completely)
◮ All inference chains are retained, which is useful if some facts might become true again
◮ Of course, in practice sentences will eventually be deleted if never used again
◮ Additional advantage (apart from efficient retraction): speed-up of the analysis of multiple hypothetical situations
◮ In the above example, consider the following justification sets for Q:
  ◮ {{P, P ⇒ Q}, {P, R ∨ P ⇒ Q}}: Q will have to be removed
  ◮ {{P, P ⇒ Q}, {R, R ∨ P ⇒ Q}}: Q can be retained

Example

◮ Consider an exam schedule with exam e taking place in time-slot t, denoted by Time(e) = t
◮ Concrete schedule: a conjunction Time(KM) = 6 ∧ Time(KE) = 2 ∧ . . . ∧ Time(PMR) = 12
◮ Takes(s, e) denotes that a student s has to take exam e
◮ Rule for exam clashes: ∃s Takes(s, e) ∧ Takes(s, f) ∧ Time(e) = Time(f) ⇒ Clash(e, f)
◮ Consider Clash(KE, KM) with the following justification:
  {Takes(Moe, KM), Takes(Moe, KE), Time(KM) = 2, Time(KE) = 2,
   Takes(Moe, KE) ∧ Takes(Moe, KM) ∧ Time(KE) = Time(KM) ⇒ Clash(KM, KE)}
◮ Easy to check alternative schedules, e.g. by retracting Time(KE) = 2 and asserting Time(KE) = 5 (other clashes become immediately visible)

Assumption-Based TMS (ATMS)

◮ In a JTMS, only one state of the world is represented at a time
◮ Idea of the ATMS: label each sentence with a set of assumption sets that would make it true; a sentence holds if all assumptions in one of its assumption sets hold
◮ A way of providing explanations, which may also include assumptions (including contradictory ones)
◮ Idea: tag the sentence “false” with all sets of contradictory assumptions
◮ The ATMS does not strive to reach a state of mutually consistent assumptions; all possibilities are kept in parallel (no backtracking necessary)
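The JTMS retraction test described earlier (a sentence goes out only if every one of its justifications uses the retracted fact) can be sketched as follows. The function name and the string encoding of justifications are illustrative assumptions; a real JTMS would also propagate out-ness recursively through dependent sentences, which this one-step sketch omits.

```python
def status_after_retraction(justifications, retracted):
    """A sentence stays 'in' if some justification avoids the retracted fact."""
    if any(retracted not in j for j in justifications):
        return "in"
    return "out"   # every justification relied on the retracted fact

# The slide's two justification sets for Q, after retracting P:
case1 = [{"P", "P=>Q"}, {"P", "RvP=>Q"}]   # P occurs in every justification
case2 = [{"P", "P=>Q"}, {"R", "RvP=>Q"}]   # second justification avoids P

print(status_after_retraction(case1, "P"))  # out: Q must be removed
print(status_after_retraction(case2, "P"))  # in:  Q can be retained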

Example

◮ Suppose we have assumptions a1 to a5 and sentences A and B with the following assumption sets:
  ◮ A: {{a1, a2}, {a2, a5}}
  ◮ B: {{a1}, {a2, a3}, {a4}}
◮ “false: {{a4, a5}}” indicates that a4 and a5 contradict each other
◮ Assume we are adding the new sentence A ∧ B ⇒ C; what is the correct set of assumptions?

1. Create the cross-product (all pairwise combinations) of the assumption sets of A and B:
   {{a1, a2}, {a1, a2, a3}, {a1, a2, a4}, {a1, a2, a5}, {a2, a3, a5}, {a2, a4, a5}}
2. Remove those that contain superfluous assumptions (here: the supersets of {a1, a2}, and {a2, a4, a5}, which contains the contradictory set {a4, a5}):
   {{a1, a2}, {a2, a3, a5}}
3. If a label exists for C already, take the union of the two labels and delete redundant assumptions (no contradiction testing necessary)
4. If the label for C changed, propagate changes to those sentences whose labels depend on C
5. If all labels of C contain contradictions, add these to the label of “false” (and delete those members, or supersets thereof, from all other nodes)

Knowledge in Learning – EBL

◮ In our account of inductive learning (decision trees, version spaces) we did not make use of prior knowledge
◮ Basic advantage of using prior knowledge: narrowing down the hypothesis space
◮ Entailment constraint of pure inductive learning:
   Hypothesis ∧ Descriptions ⊨ Classification
◮ Entailment constraint with background knowledge in explanation-based learning (EBL):
   Hypothesis ∧ Descriptions ⊨ Classification
   Background ⊨ Hypothesis
◮ The agent could have derived the hypothesis from its background knowledge (the instance does not add anything factually new)
◮ However, EBL is a useful method to derive special-purpose knowledge from first-principle theories

Explanation-Based Learning

◮ Intuition: explaining why something is a good idea is much easier than coming up with the idea in the first place
◮ Two-step process:
  1. Construct an explanation of the observation using prior knowledge
  2. Establish a definition of the class of cases for which the explanation can be used
◮ Crucial step: identify the necessary conditions for the steps used in the explanation to apply to another case
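The ATMS label computation for A ∧ B ⇒ C can be sketched as below. This is a toy under assumed names (`combine_labels`, the set-of-frozensets encoding), and it folds the slide's pruning into two filters: drop environments containing a nogood, then drop environments subsumed by a smaller surviving one.

```python
from itertools import product

def combine_labels(label_a, label_b, nogoods):
    """Label for A and B together: union every pair of assumption sets,
    then drop contradictory environments (containing a nogood) and
    non-minimal environments (supersets of another candidate)."""
    candidates = {frozenset(a | b) for a, b in product(label_a, label_b)}
    # prune environments that include a known-contradictory set
    candidates = {e for e in candidates
                  if not any(ng <= e for ng in nogoods)}
    # prune environments strictly containing another candidate
    return {e for e in candidates
            if not any(other < e for other in candidates)}

A = [{"a1", "a2"}, {"a2", "a5"}]
B = [{"a1"}, {"a2", "a3"}, {"a4"}]
false_label = [frozenset({"a4", "a5"})]   # a4 and a5 contradict each other

label_c = combine_labels(A, B, false_label)
print(sorted(sorted(e) for e in label_c))
# -> [['a1', 'a2'], ['a2', 'a3', 'a5']]
```

This reproduces the slide's result: the cross-product yields six environments, {a2, a4, a5} is contradictory, and three others are supersets of {a1, a2}.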

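The two-step EBL process above can be illustrated with a toy version of step 2, the generalisation step: take the ground facts used in one explanation and replace each constant with a variable, yielding a rule that covers the whole class of analogous cases. All names here (`generalise`, the tuple encoding, the `?x` variables, the `SameTime` predicate) are illustrative assumptions, not the lecture's notation.

```python
def generalise(ground_facts, constants):
    """Replace every occurrence of each listed constant with a fresh variable."""
    mapping = {c: f"?x{i}" for i, c in enumerate(constants)}

    def lift(fact):
        pred, args = fact
        return (pred, tuple(mapping.get(a, a) for a in args))

    return [lift(f) for f in ground_facts]

# Step 1 produced an explanation of Clash(KE, KM) for the student Moe:
explanation = [("Takes", ("Moe", "KE")),
               ("Takes", ("Moe", "KM")),
               ("SameTime", ("KE", "KM"))]

# Step 2: lift the explanation into a general clash rule
print(generalise(explanation, ["Moe", "KE", "KM"]))
# -> [('Takes', ('?x0', '?x1')), ('Takes', ('?x0', '?x2')),
#     ('SameTime', ('?x1', '?x2'))]
```

A full EBL system would additionally prune the lifted explanation to the weakest conditions under which the proof still goes through; this sketch shows only the naive constant-to-variable abstraction.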