NaturalLI: Natural Logic Inference for Common Sense Reasoning Angeli & Manning (2014) (MacCartney 2007)
● Introduction: Motivation examples ● Natural Logic: ○ Lexical Relations ○ Monotonicity and Polarity ○ Proof by alignment ● Inference as Search ● Results ● Discussion
Natural Language Inference (NLI): recognizing textual entailment. Does premise P justify an inference to hypothesis H? P: Every firm polled saw costs grow more than expected, even after adjusting for inflation. H: Every big company in the poll reported cost increases. → YES. What if we change the quantifiers to Some?
Does premise P justify an inference to hypothesis H? P: The cat ate a mouse. H: No carnivores eat animals. → NO. Natural Language Inference is necessary for the ultimate goal of full natural language understanding (it also enables semantic search, question answering, ...).
Approaches to a solution: ● NLP on the surface form of the text: natural, but lacks logical subtlety. ● First-order logic and theorem proving: logically precise, but intractable and an unnatural fit for language. ● Natural Logic: an intermediate representation, giving us logical subtlety on natural language itself.
What is Natural Logic? If I mutate a sentence in this specified way, do I preserve its truth? A logic whose vehicle of inference is natural language (Lakoff, 1970). Instantaneous semantic parsing! It characterizes valid patterns of inference in terms of surface forms, enabling precise reasoning while avoiding the difficulties of full semantic interpretation. ● Rooted in traditional logic: Aristotle's syllogisms, syllogistic reasoning. ● Monotonicity calculus (Sánchez Valencia, 1986–91). ● MacCartney's natural logic: extends the monotonicity calculus to account for negation and exclusion.
Basic lexical entailment relations: ● equivalence: couch ≡ sofa ● forward entailment: crow ⊏ bird ● reverse entailment: american ⊐ utahn ● negation (exhaustive exclusion): human ^ nonhuman ● alternation (non-exhaustive exclusion): cat | dog ● cover (exhaustive non-exclusion): animal ‿ nonhuman ● independence: cat # friendly
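The seven basic relations can be sketched as a small Python enum; the symbols and examples follow the slide, while the short names are my own labels:

```python
from enum import Enum

# MacCartney's seven basic lexical entailment relations (a sketch).
class Rel(Enum):
    EQUIV = "="   # equivalence: couch = sofa
    FWD   = "<"   # forward entailment: crow < bird
    REV   = ">"   # reverse entailment: american > utahn
    NEG   = "^"   # negation (exhaustive exclusion): human ^ nonhuman
    ALT   = "|"   # alternation (non-exhaustive exclusion): cat | dog
    COVER = "v"   # cover (exhaustive non-exclusion): animal v nonhuman
    INDEP = "#"   # independence: cat # friendly

# The slide's examples, as (term, term) -> relation
EXAMPLES = {
    ("couch", "sofa"):      Rel.EQUIV,
    ("crow", "bird"):       Rel.FWD,
    ("american", "utahn"):  Rel.REV,
    ("human", "nonhuman"):  Rel.NEG,
    ("cat", "dog"):         Rel.ALT,
    ("animal", "nonhuman"): Rel.COVER,
    ("cat", "friendly"):    Rel.INDEP,
}
```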
Relations are defined for all semantic types: tiny ⊏ small; dance ⊏ move; this morning ⊏ today; in Beijing ⊏ in China; everyone ⊏ someone; all ⊏ most ⊏ some.
Small example: apple ⊏ fruit, so eat an apple ⊏ eat fruit.
Entailment and semantic composition: how do the entailments of a compound expression depend on the entailments of its parts? ● Typically, semantic composition preserves entailment relations: some cats ⊏ some animals; eat apple ⊏ eat fruit; bird | fish, so big bird | big fish. ● But many semantic functions behave differently: tango ⊏ dance, yet refuse to tango ⊐ refuse to dance; european | african, yet not european ‿ not african.
Polarity: polarity is the direction a lexical item can safely move in the entailment ordering (hypernymy as a partial order) while preserving truth.
Polarity: quantifiers determine the polarity of the words in their scope.
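The idea that quantifiers determine polarity can be sketched as a tiny lookup: a quantifier fixes whether each of its argument positions is upward- or downward-monotone, and that in turn fixes which substitutions preserve truth. The table and helper names here are illustrative:

```python
# Monotonicity of each argument position: (restrictor, body).
# "every" is downward in its restrictor, upward in its body;
# "some" is upward in both; "no" is downward in both.
POLARITY = {
    "every": ("down", "up"),
    "some":  ("up", "up"),
    "no":    ("down", "down"),
}

def safe_substitution(quantifier, arg_index, direction):
    """Is a substitution truth-preserving in this position?
    Generalizing (hyponym -> hypernym) is safe in upward-monotone
    positions; specializing (hypernym -> hyponym) is safe in
    downward-monotone positions."""
    pol = POLARITY[quantifier][arg_index]
    return (pol == "up" and direction == "generalize") or \
           (pol == "down" and direction == "specialize")

# "Every animal barks" entails "Every dog barks" (specialize the restrictor)
print(safe_substitution("every", 0, "specialize"))  # True
# "Some cats dance" entails "Some cats move" (generalize the body)
print(safe_substitution("some", 1, "generalize"))   # True
```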
Projecting relations induced by lexical mutations. Projection function: maps the lexical relation of a mutation to the relation between two sentences differing only by that mutation (e.g., a downward-monotone context flips ⊏ to ⊐). Join table: composes two projected relations across successive mutations.
Projection examples: ● cat ⊏ animal, but no cats eat mice ⊐ no animals eat mice (⊏ flips to ⊐ under "no"). ● animal ‿ nonhuman, but failed to be animal | failed to be nonhuman (‿ maps to | under negation). ● human ^ nonhuman projects through negation unchanged: failed to be human ^ failed to be nonhuman.
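The projection function and join table can be sketched as small lookup tables; only a fragment of the full 7×7 join table is shown, and the ASCII symbols stand in for MacCartney's notation:

```python
# How each lexical relation is remapped inside a negated
# (downward-monotone) context: "<" and ">" swap, "|" and "v" swap.
PROJECT_NEG = {"=": "=", "<": ">", ">": "<", "^": "^", "|": "v", "v": "|", "#": "#"}

# A fragment of the join table for composing relations along a path.
JOIN = {
    ("<", "<"): "<",   # cat < feline, feline < animal  =>  cat < animal
    ("<", "^"): "|",   # cat < animal, animal ^ nonanimal  =>  cat | nonanimal
    ("|", "^"): "<",   # cat | dog, dog ^ nondog  =>  cat < nondog
}

# cat < animal, but under "no ..." the relation flips:
print(PROJECT_NEG["<"])    # ">"
print(JOIN[("|", "^")])    # "<"
```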
Proof by alignment: 1. Find a sequence of edits connecting P and H: insertions, deletions, substitutions. 2. Determine the lexical entailment relation for each edit. ● Substitutions: depends on the meaning of the substituends: cat | dog. ● Deletions: ⊏ by default: dark chocolate ⊏ chocolate. ● But some deletions are special: not ill ^ ill; refuse to go | go. ● Insertions are symmetric to deletions: ⊐ by default. 3. Project each lexical relation up to the sentence level. 4. Join the projected relations across the sequence of edits.
Example: P: Stimpy is a cat. H: Stimpy is not a poodle. Mutating cat → poodle contributes alternation (cat | poodle); inserting "not" contributes negation (^); joining | with ^ yields ⊏, so P entails H.
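Walking the Stimpy example through the join can be sketched as a fold over the edit sequence; the join-table fragment and the "=" start state are illustrative:

```python
# Join-table fragment: alternation followed by negation gives entailment.
JOIN = {("|", "^"): "<"}

state = "="                     # start: P is equivalent to itself
for rel in ["|", "^"]:          # cat -> poodle gives "|", inserting "not" gives "^"
    state = rel if state == "=" else JOIN.get((state, rel), "#")
print(state)  # "<": "Stimpy is a cat" entails "Stimpy is not a poodle"
```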
A more complex example
Common Sense Reasoning with Natural Logic. Task: given an utterance and a large knowledge base of supporting facts, decide whether the utterance is true or false.
Common Sense Reasoning for NLP
Common Sense Reasoning for Vision
Start with a (large) Knowledge Base >> Infer new facts
Infer new facts, on demand from a query
Using text as the meaning representation
Without aligning to any particular premise
Natural Logic inference is search
Example search as graph search
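Inference as graph search can be sketched as uniform-cost search from the query fact toward the knowledge base, with mutation edges carrying costs. The tiny KB, facts, edges, and costs below are made up for illustration:

```python
import heapq

# A toy knowledge base and mutation graph (all contents invented).
KB = {"animals eat things"}
EDGES = {  # fact -> [(neighbor fact, edge cost)]
    "cats eat mice":    [("felines eat mice", 0.1), ("cats eat animals", 0.5)],
    "felines eat mice": [("animals eat mice", 0.2)],
    "animals eat mice": [("animals eat things", 0.3)],
}

def search(query):
    """Uniform-cost search: cheapest path from the query to any KB fact."""
    frontier = [(0.0, query)]
    seen = set()
    while frontier:
        cost, fact = heapq.heappop(frontier)
        if fact in KB:
            return fact, cost
        if fact in seen:
            continue
        seen.add(fact)
        for nxt, c in EDGES.get(fact, []):
            heapq.heappush(frontier, (cost + c, nxt))
    return None, float("inf")

print(search("cats eat mice"))  # finds the supporting premise, total cost ~0.6
```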
Edges of the graph
Edge templates
“Soft” Natural Logic: likely (but not certain) inferences. ● Each edge has a cost ≥ 0. ● Detail: cost varies among edge instances of a template: ○ WordNet: tree distance. ○ Nearest neighbors: vector-space distance. ○ Most other cases: distance 1. ● Call this edge distance f.
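A path's total cost can be sketched as a weighted sum of per-edge distances f, one weight per edge template; mapping that cost to a confidence with a logistic is my assumption for illustration, as are the template names and weights:

```python
import math

def path_cost(edges, weights):
    """Sum of (template weight * edge distance f) along a search path."""
    return sum(weights[template] * f for template, f in edges)

def confidence(cost):
    """Assumed logistic mapping: cost 0 -> 0.5, higher cost -> lower."""
    return 1.0 / (1.0 + math.exp(cost))

# Hypothetical per-template weights and a two-edge path.
w = {"wordnet_hypernym": 0.2, "nearest_neighbor": 1.0}
c = path_cost([("wordnet_hypernym", 1.0), ("nearest_neighbor", 0.8)], w)
print(round(c, 3), round(confidence(c), 3))  # 1.0 0.269
```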
Experiments ● Knowledge base: 270 million unique lemmatized premises (OLLIE extractions from Wikipedia: short canonical utterances). ● Evaluation set: a semi-curated collection of common-sense (true) facts. ● Negatives: collected via Mechanical Turk. ● Size: 1,378 train / 1,080 test.
Results
References. Some of the material for these slides was also extracted from the following links: Modeling Semantic Containment and Exclusion in Natural Language Inference. Bill MacCartney, 2008: https://slideplayer.com/slide/5095504/ NaturalLI. G. Angeli, 2014: https://cs.stanford.edu/~angeli/talks/2014-emnlp-naturalli.pdf
Equations. Edge costs are learned from features including the normalized frequency of a word in the Google N-grams corpus and neural-network word embeddings (Huang et al.). Training maximizes the log likelihood of the data D, i.e., minimizes the negative log likelihood with L2 regularization.
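A plausible reconstruction of the training objective described above, in LaTeX (a sketch; θ as the learned cost parameters and λ as the regularization strength are assumed notation, not the slide's):

```latex
% Negative log likelihood of the data D with L2 regularization
J(\theta) = -\sum_{(x,\,y) \in D} \log p(y \mid x;\, \theta)
            \;+\; \lambda \lVert \theta \rVert_2^2
```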