NaturalLI: Natural Logic Inference for Common Sense Reasoning


  1. NaturalLI: Natural Logic Inference for Common Sense Reasoning Angeli & Manning (2014) (MacCartney 2007)

  2. ● Introduction: motivation examples
● Natural Logic:
   ○ Lexical relations
   ○ Monotonicity and polarity
   ○ Proof by alignment
● Inference as search
● Results
● Discussion

  3. Natural Language Inference (NLI): recognizing textual entailment. Does premise P justify an inference to hypothesis H?
P: Every firm polled saw costs grow more than expected, even after adjusting for inflation.
H: Every big company in the poll reported cost increases.
YES. What if we change the quantifiers to "Some"?

  4. Does premise P justify an inference to hypothesis H?
P: The cat ate a mouse.
H: No carnivores eat animals.
NO. Natural Language Inference is necessary for the ultimate goal of full Natural Language understanding (it also enables semantic search, question answering, …).

  5. Proposed solutions:
● First-order logic: an intermediate representation with theorem proving. Logically precise, but intractable and unnatural for language.
● NLP on text: works on the surface form of the text, but we need logical subtlety.
● Natural Logic: a middle ground between the two.

  6. What is Natural Logic? If I mutate a sentence in this specified way, do I preserve its truth? A logic whose vehicle of inference is natural language (Lakoff, 1970). Instantaneous semantic parsing! It characterizes valid patterns of inference in terms of surface forms, enabling precise reasoning while avoiding the difficulties of full semantic interpretation.
● Influenced by traditional logic: Aristotle's syllogisms (syllogistic reasoning).
● Monotonicity calculus (Sánchez Valencia, 1986-91).
● MacCartney's Natural Logic: extends the monotonicity calculus to account for negation and exclusion.

  7. Basic entailment lexical relations:
● equivalence (≡): couch ≡ sofa
● forward entailment (⊏): crow ⊏ bird
● reverse entailment (⊐): american ⊐ utahn
● negation / exhaustive exclusion (^): human ^ nonhuman
● alternation / non-exhaustive exclusion (|): cat | dog
● cover / exhaustive non-exclusion (‿): animal ‿ nonhuman
● independence (#): cat # friendly

  8. Relations are defined for all semantic types:
● tiny ⊏ small, dance ⊏ move
● this morning ⊏ today, in Beijing ⊏ in China
● everyone ⊏ someone, all ⊏ most ⊏ some

  9. Small example: apple ⊏ fruit, so eat apple ⊏ eat fruit (see the sketch below).
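A minimal sketch of this step, as illustrative Python rather than NaturalLI's actual code (the lexicon entry and the claim about "eat" being upward monotone in its object are the slide's example, everything else is scaffolding):

```python
# Sketch: MacCartney's seven lexical relations, plus the slide's one-step example.
from enum import Enum

class Rel(Enum):
    EQUIVALENCE = "≡"   # couch ≡ sofa
    FORWARD = "⊏"       # crow ⊏ bird
    REVERSE = "⊐"       # american ⊐ utahn
    NEGATION = "^"      # human ^ nonhuman
    ALTERNATION = "|"   # cat | dog
    COVER = "‿"         # animal ‿ nonhuman
    INDEPENDENCE = "#"  # cat # friendly

# Toy lexicon: relation between a word pair.
LEXICON = {("apple", "fruit"): Rel.FORWARD}

# The object position of "eat" is upward monotone, so the lexical
# relation projects through unchanged: eat apple ⊏ eat fruit.
rel = LEXICON[("apple", "fruit")]
print(f"eat apple {rel.value} eat fruit")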

  10. Entailment and semantic composition. How do the entailments of a compound expression depend on the entailments of its parts?
● Typically, semantic composition preserves entailment relations: eat apple ⊏ eat fruit; some cats ⊏ some animals; big bird | big fish.
● But many semantic functions behave differently: tango ⊏ dance, yet refuse to tango ⊐ refuse to dance; european | african, yet not european ‿ not african. (A projectivity sketch follows.)
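The flipping behavior can be made concrete with a small table; this fragment is an assumption modeled on MacCartney-style projectivity signatures, restricted to the relations used in the two bullets above:

```python
# Hypothetical projectivity fragment: how a function maps the relation
# of its argument to the relation of the whole expression.
FWD, REV, ALT, COV = "⊏", "⊐", "|", "‿"

PROJECTIVITY = {
    "some X":    {FWD: FWD, REV: REV},            # upward: preserved
    "refuse to": {FWD: REV, REV: FWD},            # downward: flipped
    "not":       {FWD: REV, REV: FWD, ALT: COV},  # also swaps | and ‿
}

# tango ⊏ dance  =>  refuse to tango ⊐ refuse to dance
print("refuse to tango", PROJECTIVITY["refuse to"][FWD], "refuse to dance")
# european | african  =>  not european ‿ not african
print("not european", PROJECTIVITY["not"][ALT], "not african")
```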

  11. Polarity. Polarity is the direction a lexical item can move in the ordering (hypernymy as a partial order).

  12.-18. Polarity: quantifiers determine the polarity of words (a sketch follows).
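The quantifier/polarity point can be sketched in a few lines; the monotonicity assignments below are standard generalized-quantifier facts assumed for illustration, not copied from the slides:

```python
# Each quantifier assigns a polarity to its restrictor (slot 0) and its
# body (slot 1): the direction a word in that slot may safely move.
POLARITY = {
    #         restrictor  body
    "every": ("down", "up"),   # every CAT(↓) MOVES(↑)
    "some":  ("up",   "up"),
    "no":    ("down", "down"),
}

def allowed_move(quantifier: str, slot: int, move: str) -> bool:
    """Can a word in this slot move to a hypernym ('up') or a
    hyponym ('down') while preserving truth?"""
    return POLARITY[quantifier][slot] == move

# "every cat moves": cat may move down (cat -> tabby),
# moves may move up (moves -> does something).
print(allowed_move("every", 0, "down"))  # True
print(allowed_move("every", 1, "up"))    # True
```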

  19. Projecting relations induced by lexical mutations:
● Projection function: given two sentences differing only by a single lexical mutation, projects the lexical relation up through the (e.g., downward-monotone) context.
● Join table: composes two projected relations (a lookup-table sketch follows).
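The join table can be pictured as a lookup from a pair of relations to their composition. A hedged Python fragment, limited to joins used in the nearby examples (the full MacCartney table covers all 49 pairs, and some joins are non-deterministic):

```python
# Join-table fragment: if x R y and y S z hold, JOIN[(R, S)] relates x and z.
FWD, NEG, ALT = "⊏", "^", "|"

JOIN = {
    (FWD, FWD): FWD,  # cat ⊏ feline, feline ⊏ carnivore  =>  cat ⊏ carnivore
    (FWD, NEG): ALT,  # fish ⊏ nonhuman, nonhuman ^ human  =>  fish | human
    (ALT, NEG): FWD,  # cat | dog, dog ^ non-dog           =>  cat ⊏ non-dog
}

print(JOIN[(FWD, NEG)])  # |
```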

  20. Projection examples:
● cat | dog: no cat is a dog, and no dog is a cat.
● animal ‿ nonhuman: failed to be animal | failed to be nonhuman.
● cat ⊏ animal: no cats eat mice ⊐ no animals eat mice.
● Joins: human ^ nonhuman and fish ⊏ nonhuman give fish | human; feline ⊐ cat and cat | dog give only feline # dog (the join is uninformative).

  21. Proof by alignment:
1. Find a sequence of edits connecting P and H: insertions, deletions, substitutions.
2. Determine the lexical entailment relation for each edit.
● Substitutions: depends on the meaning of the substituends: cat | dog.
● Deletions: ⊏ by default: dark chocolate ⊏ chocolate.
● But some deletions are special: not ill ^ ill; refuse to go | go.
● Insertions are symmetric to deletions: ⊐ by default.
3. Project each lexical relation up to find the entailment relation across each edit.
4. Join the entailment relations across the sequence of edits (worked through on the next slide, with a sketch after it).

  22. Example: P: Stimpy is a cat. H: Stimpy is not a poodle.
i=1: cat → dog, r = | (alternation), joined s = |
i=2: dog → poodle, r = ⊐, joined s = |
i=3: insert "not", r = ^, joined s = ⊏
The final joined relation is ⊏: "Stimpy is a cat" entails "Stimpy is not a poodle".
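A sketch of how the joins in this derivation compose, using the per-edit relations reconstructed above (the JOIN entries are restricted to exactly what this example needs):

```python
# Join the per-edit relations from the table above, left to right.
FWD, REV, NEG, ALT, EQ = "⊏", "⊐", "^", "|", "≡"

JOIN = {
    (EQ, ALT): ALT,   # start state ≡ joined with cat→dog (|)
    (ALT, REV): ALT,  # | joined with dog→poodle (⊐) stays |
    (ALT, NEG): FWD,  # | joined with insert-"not" (^) yields ⊏
}

state = EQ  # P related to itself before any edits
for r in (ALT, REV, NEG):  # cat→dog, dog→poodle, insert "not"
    state = JOIN[(state, r)]

print(state)  # ⊏ : "Stimpy is a cat" entails "Stimpy is not a poodle"
```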

  23. A more complex example

  24. Common Sense Reasoning with Natural Logic. Task: given an utterance and a large knowledge base of supporting facts, decide whether the utterance is true or false.

  25. Common Sense Reasoning for NLP

  26. Common Sense Reasoning for Vision

  27. Start with a (large) Knowledge Base >> Infer new facts

  28. Infer new facts, on demand from a query

  29. Using text as the meaning representation

  30. Without aligning to any particular premise

  31. Natural Logic inference is search

  32.-37. Example: inference as graph search.
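The search itself can be sketched as uniform-cost search over mutation edges. This is a toy illustration, not the NaturalLI system: the knowledge base, edge set, and costs are invented, and the real search additionally tracks whether truth is preserved or flipped along the path (omitted here):

```python
# Toy uniform-cost search from a query fact toward any fact in the KB,
# accumulating edge costs along the way.
import heapq

KB = {"the cat ate a mouse"}                    # toy knowledge base
MUTATIONS = {                                   # toy edges with costs
    "no carnivores eat animals": [("no carnivores eat mice", 1.0)],
    "no carnivores eat mice": [("no cats eat mice", 1.0)],
    "no cats eat mice": [("the cat ate a mouse", 2.0)],  # crosses negation
}

def search(query: str, max_cost: float = 10.0):
    """Return (fact, cost) for the cheapest KB fact reachable from query."""
    frontier = [(0.0, query)]
    seen = set()
    while frontier:
        cost, fact = heapq.heappop(frontier)
        if fact in KB:
            return fact, cost
        if fact in seen or cost > max_cost:
            continue
        seen.add(fact)
        for neighbor, edge_cost in MUTATIONS.get(fact, []):
            heapq.heappush(frontier, (cost + edge_cost, neighbor))
    return None

print(search("no carnivores eat animals"))
```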

  38. Edges of the graph

  39. Edge templates

  40. “Soft” Natural Logic: likely (but not certain) inferences.
● Each edge has a cost ≥ 0.
● Detail: edge instances of the same template vary in cost.
● WordNet edges: cost based on nearest-neighbor distance.
● In most other cases the distance is 1.
● Call this edge distance f.
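Schematically, the resulting path cost can be written as follows; this is a reconstruction consistent with the slide's description, not necessarily the paper's exact notation:

```latex
% Each edge instantiates a template r_i with a learned nonnegative
% weight \theta_{r_i}, scaled by its distance f_i (WordNet
% nearest-neighbor distance where available, otherwise 1).
c(\text{path};\, \theta) \;=\; \sum_{i} \theta_{r_i} \, f_i ,
\qquad \theta \ge 0
```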

  41. Experiments:
● Knowledge base: 270 million unique lemmatized premises (Ollie extractions from Wikipedia: short canonical utterances).
● Evaluation set: a semi-curated collection of common-sense (true) facts.
● Negatives: collected via Mechanical Turk.
● Size: 1378 train, 1080 test.

  42. Results

  43. References. Some of the material for these slides was also extracted from the following sources:
● Bill MacCartney (2008). Modeling Semantic Containment and Exclusion in Natural Language Inference. https://slideplayer.com/slide/5095504/
● Gabor Angeli (2014). NaturalLI. https://cs.stanford.edu/~angeli/talks/2014-emnlp-naturalli.pdf

  44. Equations:
● Prior on the surface form and validity of a new fact: based on the normalized frequency of each word in the Google N-gram corpus, and on neural-network word embeddings (Huang et al.).
● Log-likelihood of the data D, subject to the path cost c.
● Objective: negative log-likelihood with L2 regularization (schematically):
$\min_{\theta \ge 0} \; -\sum_{(q,\,y) \in D} \log p(y \mid q;\, \theta) \; + \; \lambda \lVert \theta \rVert_2^2$
