Compositional Semantics CMSC 723 / LING 723 / INST 725 Marine Carpuat marine@cs.umd.edu
• Words, bag of words • Sequences • Trees • Meaning
Representing Meaning • An important goal of NLP/AI: convert natural language into a representation that supports semantic inferences • Why? Many applications require semantic understanding • Question answering, translation, fact-checking, giving instructions to a robot,… • Challenge: how to bridge the gap between linguistic input and non-linguistic knowledge of the world
Representing Meaning Challenges for mapping linguistic input to meaning • different words/structure, same meaning – She needed to make a quick decision in that situation. – The scenario required her to make a split-second judgment. – I saw the man. – The man was seen by me.
Representing Meaning Challenges for mapping linguistic input to meaning • same words, different meaning - I walked by the bank - … to deposit my check. - … to take a look at the river. – Everyone on the island speaks two languages. – Two languages are spoken by everyone on the island.
Representing Meaning • Goal: create representations of linguistic inputs that capture the meanings of those inputs. • In most cases, they’re simultaneously descriptions – of the meanings of utterances – and of some potential state of affairs in some world.
Desired Properties of Meaning Representations • Goal: express propositions, while abstracting away from ambiguity/vagueness of natural language • Desired Properties – Verifiability – No ambiguity – Expressiveness – Inference
Natural Language Inference Examples
• All blips are foos. Blop is a blip. ⇒ Blop is a foo.
• Mozart was born in Salzburg. Mozart was born in Vienna. ⇒ No, that can’t be: these are different cities.
We’ll cover different families of approaches • Logical Semantics • Shallow Representations and Lexical Semantics • Textual Inference
Contrasting 2 Strategies for Semantic Analysis • Logical semantics – Complete analysis – Create a First Order Logic representation that accounts for all the entities, roles and relations present in a sentence • Information Extraction – Superficial analysis – Pulls out only the entities, relations and roles that are of interest to the consuming application.
Information Extraction: Entity Recognition
“American Airlines [ORGANIZATION], a unit of AMR, immediately matched the move, spokesman Tim Wagner [PERSON] said.”
Information Extraction: Predicting Relations
What relation holds between the PERSON and the ORGANIZATION? Founder? Investor? Member? Employee? President?
“American Airlines, a unit of AMR, immediately matched the move, spokesman Tim Wagner said.”
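A superficial, pattern-based sketch of this style of relation extraction (the `PART_OF` pattern and the helper function are hypothetical, chosen to match the slide’s example sentence):

```python
import re

# Hypothetical pattern: "X , a unit of Y" signals an ORG-ORG part-of relation.
PART_OF = re.compile(r"(?P<sub>[A-Z][\w ]+?) , a unit of (?P<parent>[A-Z]\w+)")

def extract_part_of(sentence):
    """Return (subsidiary, parent) pairs matched by the pattern."""
    return [(m.group("sub"), m.group("parent")) for m in PART_OF.finditer(sentence)]

sent = ("American Airlines , a unit of AMR, immediately matched the move, "
        "spokesman Tim Wagner said.")
print(extract_part_of(sent))  # [('American Airlines', 'AMR')]
```

Real IE systems use trained classifiers rather than hand-written patterns, but the output shape (typed entity pairs) is the same.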
Information Extraction Relations
• PHYSICAL: Near, Located
• PART-WHOLE: Subsidiary, Geographical
• PERSON-SOCIAL: Family, Lasting Personal, Business
• GENERAL AFFILIATION: Citizen-Resident-Ethnicity-Religion, Org-Location-Origin
• ORG AFFILIATION: Founder, Investor, Student-Alum, Ownership, Employment, Membership, Sports-Affiliation
• ARTIFACT: User-Owner-Inventor-Manufacturer
17 relations from the 2008 “Relation Extraction Task” of Automated Content Extraction (ACE)
Information Extraction Relations
• UMLS: Unified Medical Language System (134 entity types, 54 relations)
– Injury disrupts Physiological Function
– Bodily Location location-of Biologic Function
– Anatomical Structure part-of Organism
– Pharmacologic Substance causes Pathologic Function
– Pharmacologic Substance treats Pathologic Function
Building Blocks of Logical Representations of Meaning Propositional Semantics • Proposition symbols: P , Q, … • Boolean operators – negation, conjunction, disjunction – Implication, equivalence • Inference rules – Can be defined using Boolean connectives P => Q
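As a quick check of the inference-rule idea, modus ponens can be verified exhaustively over truth values; the `implies` helper below is illustrative, not part of the slides:

```python
from itertools import product

def implies(p, q):
    # Material implication: P => Q is false only when P is true and Q is false.
    return (not p) or q

# Modus ponens is valid: in every assignment where P and P => Q hold, Q holds.
valid = all(q for p, q in product([True, False], repeat=2)
            if p and implies(p, q))
print(valid)  # True
```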
Building Blocks of Logical Representations of Meaning Predicate Logic : extends our representation with • Constants = elements that name entities in the model • Predicates = sets of objects or, equivalently, functions from objects to truth values • Functions = sets of pairs of objects, or eq. functions from one object to another
Building Blocks of Logical Representations of Meaning Predicate Logic : extends our representation further with • Variables = let us refer to objects which are not locally specified • Quantifiers = used to bind variables – Existential – Universal
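One way to make quantifiers concrete is model checking over a toy domain; the entities and the Likes relation below are invented for illustration:

```python
# A toy model: a set of entities and a Likes predicate as a set of pairs.
entities = {"Franco", "Frasca", "Mary"}
likes = {("Franco", "Frasca"), ("Mary", "Frasca")}

# Existential quantification: ∃x Likes(x, Frasca)
exists = any((x, "Frasca") in likes for x in entities)
# Universal quantification: ∀x Likes(x, Frasca)
forall = all((x, "Frasca") in likes for x in entities)
print(exists, forall)  # True False
```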
A CFG specification of the syntax of First Order Logic Representations From SLP2 Section 17.3
Representing a sentence in FOL • Franco likes Frasca. How can we represent the “Liking” predicate-argument template?
Predicate-Argument Structure in Natural Language • Events, actions and relationships can be captured with representations that consist of predicates and arguments to those predicates. • Predicates – Primarily Verbs, VPs, Sentences – Sometimes Nouns and NPs • Arguments – Primarily Nouns, Nominals, NPs, PPs – But also everything else, depends on the context
Example: representing predicate-argument structure… • Mary gave a list to John. • Giving(Mary, John, List) • More precisely – Gave conveys a three-argument predicate – The first argument is the subject – The second is the recipient, which is conveyed by the NP inside the PP – The third argument is the thing given, conveyed by the direct object
Example: representing predicate-argument structure • Predicate-argument structures as templates – We can think of the verb/VP providing a template like the following: ∃e, x, y, z Giving(e) ∧ Giver(e, x) ∧ Given(e, y) ∧ Givee(e, z) – The semantics of the NPs and the PPs in the sentence plug into the slots provided in the template
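The event template above can be mirrored with a small, hypothetical Python helper that fills the slots for “Mary gave a list to John”:

```python
# Hypothetical helper: build the event-based Giving representation as a string.
def giving(e, giver, given, givee):
    return (f"Giving({e}) ∧ Giver({e},{giver}) "
            f"∧ Given({e},{given}) ∧ Givee({e},{givee})")

# Mary is the giver, the list is the thing given, John is the recipient.
print(giving("e1", "Mary", "List", "John"))
```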
A CFG specification of the syntax of First Order Logic Representations From SLP2 Section 17.3
Representing a sentence in FOL • Franco likes Frasca. “Liking” predicate-argument template
One More Building Block of Logical Representations of Meaning
• Lambda forms λx.P(x)
– Take a FOL formula with variables in it that are to be bound.
– Allow those variables to be bound by treating the lambda form as a function with formal arguments.
– Applying it: λx.P(x)(Franco) reduces to P(Franco)
Lambda Reductions
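In Python, function application plays the role of β-reduction; `P` and `Near` below are illustrative predicates, not part of the slides:

```python
# λx.P(x) applied to Franco reduces to P(Franco):
P = lambda x: f"P({x})"
print(P("Franco"))  # P(Franco)

# Curried form λx.λy.Near(x, y), reduced one argument at a time:
near = lambda x: lambda y: f"Near({x},{y})"
step1 = near("Bacaro")   # λy.Near(Bacaro, y)
print(step1("Centro"))   # Near(Bacaro,Centro)
```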
Logical Semantics Representations of Natural Language • Building blocks – Propositional Logic – Predicate Logic – Lambda Forms • Given a sentence, how can we construct its logical representation? – One approach: compositional semantics
Compositional Analysis: use syntax to guide semantic analysis
Principle of Compositionality • The meaning of a whole is derived from the meanings of the parts • What parts? – The constituents of the syntactic parse of the input • What could it mean for a part to have a meaning?
Compositional Analysis: use syntax to guide semantic analysis
Augmented Rules • We’ll accomplish this by attaching semantic formation rules to our syntactic CFG rules • Abstractly: A → α₁ … αₙ  { f(α₁.sem, …, αₙ.sem) } – This should be read as: “the semantics we attach to A can be computed from some function applied to the semantics of A’s parts.”
Example • Easy parts, with attachments in braces:
– NP -> PropNoun  {PropNoun.sem}
– PropNoun -> Frasca  {Frasca}
– PropNoun -> Franco  {Franco}
Example
• S -> NP VP  {VP.sem(NP.sem)}
• VP -> Verb NP  {Verb.sem(NP.sem)}
• Verb -> likes  ???
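The rule attachments above can be sketched directly with Python functions; the λx.λy.Likes(y, x) attachment for the verb is one plausible way to fill in the ???, not the only one:

```python
# A minimal compositional-semantics sketch for "Franco likes Frasca".
# Rule attachments become Python functions; the verb attachment
# λx.λy.Likes(y, x) is an illustrative choice.
lexicon = {
    "Franco": "Franco",
    "Frasca": "Frasca",
    "likes": lambda x: lambda y: f"Likes({y},{x})",
}

def vp_sem(verb_sem, np_sem):
    # VP -> Verb NP   { Verb.sem(NP.sem) }
    return verb_sem(np_sem)

def s_sem(np_sem, vp_value):
    # S -> NP VP      { VP.sem(NP.sem) }
    return vp_value(np_sem)

vp = vp_sem(lexicon["likes"], lexicon["Frasca"])   # λy.Likes(y, Frasca)
print(s_sem(lexicon["Franco"], vp))                # Likes(Franco,Frasca)
```

Each composition step mirrors one CFG rule, so the derivation of the semantics follows the syntactic parse exactly, as the principle of compositionality prescribes.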
Which approach can we use to… discover information about specific entities?
Which approach can we use to… summarize text?
Which approach can we use to… query databases?
Which approach can we use to… instruct a robot?
Recap… Intro to Semantics – Meaning representations • motivated by semantic processing • for specific applications – 2 approaches to semantic processing • complete FOL representation • vs. shallow information extraction