  1. SemLink+: FrameNet, VerbNet, and Event Ontologies
     Martha Palmer, Claire Bonial, Diana McCarthy, University of Colorado
     Frame Semantics in NLP: A Workshop in Honor of Chuck Fillmore (1929–2014)
     ACL Workshop, June 27, 2014

  2. Outline
     - Deep NLU?
     - Where we are now
     - Where we need to go
     - More details about where we need to go
     - The contributions and limitations of lexical resources to this process

  3. Where we are now – shallow semantics
     - Syntactic structure – parse trees, Treebanks
     - Semantic types – nominal entities [Person, Location, Organization], NE tagging
     - Semantic roles – Agents, ... [PropBank, FrameNet, VerbNet]
     - Sense distinctions – "call me a taxi" vs. "call me an idiot" (WordNet, OntoNotes groups, FrameNet, VerbNet, vectors, etc.)
     - Coreference – [President Obama: he]

  4. Where we are now – details
     - DARPA GALE, OntoNotes 5.0
       - BBN, Brandeis, Colorado, Penn
       - Multilayer structure: NE, TB, PB, WS, Coref
       - Three languages: English, Arabic, Chinese
       - Several genres (≥ 200K words each): NW, BN, BC, WT
     - Close to 2M words per language (less PB for Arabic)
       - Parallel data: English/Chinese, English/Arabic
     - DARPA BOLT – discussion forum, SMS
       - PropBank extensions: light verbs, function tags on core args, nominalizations, adjectives, constructions, often relying on FrameNet

  5. PropBank Verb Frames Coverage
     - The set of verbs is open
     - But the distribution is highly skewed
     - For English, the 1000 most frequent lemmas cover 95% of the verbs in running text
     - Graphs show counts over English Web data containing 150M verbs
     [Graph: cumulative verb coverage (94%–100%) against the number of most frequent lemmas (1000–8000)]
     - FrameNet and VerbNet should have the same coverage, and we (or at least VerbNet) desperately need help to do this semi-automatically!
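The skew behind these figures is straightforward to measure on any lemmatized corpus: count verb-lemma frequencies and ask what fraction of verb tokens the top N lemmas account for. A minimal sketch in Python (the toy token list is an illustrative stand-in, not the Web data behind the slide's graph):

    from collections import Counter

    def cumulative_coverage(verb_lemmas, top_n=1000):
        """Fraction of verb tokens covered by the top_n most frequent lemmas."""
        counts = Counter(verb_lemmas)
        total = sum(counts.values())
        covered = sum(count for _, count in counts.most_common(top_n))
        return covered / total

    # On English Web data with ~150M verb tokens, the slide reports ~95% coverage
    # for the 1000 most frequent lemmas; the list below is only a toy stand-in.
    verb_lemmas = ["say", "be", "have", "carry", "ship", "say", "be", "call"]
    print(cumulative_coverage(verb_lemmas, top_n=3))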

  6. WordNet: "call" – 28 senses, 9 groups
     [Diagram: the 28 WordNet senses of "call" (WN1–WN28) clustered into 9 coarser groups: Loud cry, Bird or animal cry, Request, Label, Call a loan/bond, Challenge, Visit, Phone/radio, Bid]
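For reference, the fine-grained inventory behind this diagram can be listed with NLTK's WordNet reader; a sketch, assuming NLTK and its WordNet data are installed (the exact sense count depends on the WordNet version, and the nine coarse groups are a manual OntoNotes-style grouping, not part of WordNet itself):

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    # List the fine-grained WordNet verb senses of "call"; the slide cites 28.
    senses = wn.synsets('call', pos=wn.VERB)
    print(len(senses))
    for i, synset in enumerate(senses, start=1):
        print(f"WN{i}: {synset.name()} -- {synset.definition()}")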

  7. SemLink – PropBank, VerbNet, FrameNet, WordNet, OntoNotes (Palmer, Dang & Fellbaum, NLE 2007)
     [Diagram: the SemLink mappings for "carry" – PropBank Frameset1* linked to the VerbNet classes cost-54.2 (ON2), fit-54.3 (ON3), and carry-11.4 (FrameNet CARRY, ON1), to the OntoNotes group ON4 ("win election"), and to the corresponding WordNet senses]
     * ON5–ON11: carry oneself, carried away/out/off, carry to term

  8. Sense Hierarchy
     - PropBank Framesets – coarse-grained distinctions, ITA > 90%
       - 20 Senseval-2 verbs with > 1 Frameset: MaxEnt WSD system 90%, against a 73.5% baseline
     - Sense Groups (Senseval-2/OntoNotes) – intermediate level (includes VerbNet/some FrameNet), ITA 89%
       - SVM, 88+% (Dligach & Palmer, ACL 2011)
     - WordNet – fine-grained distinctions, ITA 73%, system accuracy 64%

  9. SEMLINK n Extended VerbNet: 6,340 senses n 92% PB tokens (8114 verb senses/12,646 all) n Type-type mapping PB/VN, VN/FN, VN/WN n Semi-automatic mapping of WSJ PropBank instances to VerbNet classes and thematic roles, hand-corrected. (now FrameNet also) n VerbNet class tagging as automatic WSD Brown, Dligach, Palmer, IWCS 2011; Croce, et. al., ACL2012 n Run SRL, map Arg2 to VerbNet roles, Brown Yi, Loper, Palmer, NAACL07 performance improves 9

  10. Where we need to go – Richer Event Descriptions (RED)
     - "Saucedo said that guerrillas in one car opened fire on police standing guard, while a second car carrying 88 pounds (40 kgs) of dynamite parked in front of the building, and a third car rushed the attackers away."
     - Saucedo said – reporting event, evidential

  11. What we can do
     - "that guerrillas in one car opened fire on police standing guard"
     - opened fire = aspectual context
       - fire(guerrillas, police)
     - standing guard = support verb construction / aspectual?, reduced relative
       - guard(police, X)

  12. What we can do, cont.
     - "while a second car carrying 88 pounds (40 kgs) of dynamite parked in front of the building"
     - carrying – reduced relative; which is the correct head noun, "pounds" or "dynamite"?
       - carry(car2, dynamite)
     - park(car2, front_of(building))

  13. What we can do, cont.
     - "and a third car rushed the attackers away"
     - rush(car3, attackers, away)
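Pulling the three slides together, the recovered predicate-argument structures can be held in a very simple data structure; a minimal sketch (the Event class, field names, and role labels are illustrative, not an annotation format):

    from dataclasses import dataclass, field

    @dataclass
    class Event:
        predicate: str
        args: dict = field(default_factory=dict)

    # Predicate-argument structures recovered from the example sentence.
    events = [
        Event("fire",  {"Arg0": "guerrillas", "Arg1": "police"}),
        Event("guard", {"Arg0": "police", "Arg1": "X"}),   # X: implicit argument
        Event("carry", {"Arg0": "car2", "Arg1": "dynamite"}),
        Event("park",  {"Arg0": "car2", "Location": "front_of(building)"}),
        Event("rush",  {"Arg0": "car3", "Arg1": "attackers", "Direction": "away"}),
    ]
    for e in events:
        print(e.predicate, e.args)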

  14. Temporal & Causal ordering?
     - "Saucedo said that guerrillas in one car opened fire on police standing guard, while a second car carrying 88 pounds (40 kgs) of dynamite parked in front of the building, and a third car rushed the attackers away"
     - guarding BEFORE/OVERLAP firing
     - Narrative container – TimeX
       - [firing, parking, rushing] all overlap, all in the same temporal bucket?
       - [see Styler et al., ACL 2014 Events Workshop & RED Guidelines]

  15. Don’t mark the relations between EVENTs. Instead, put EVENTs in temporal buckets and relate the buckets.

  16. Temporal & Causal ordering
     - "Saucedo said that guerrillas in one car opened fire on police standing guard, while a second car carrying 88 pounds (40 kgs) of dynamite parked in front of the building, and a third car rushed the attackers away"
       - guarding BEFORE/OVERLAP firing
       - X CONTAINS [firing, parking, rushing]
       - firing BEFORE parking
       - parking BEFORE rushing
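The bucket-and-relate strategy from slide 15 can be sketched as data: events go into narrative containers, and a small set of relations holds between events or buckets instead of between every event pair. The encoding below is only an illustration, not the RED annotation format:

    # Illustrative encoding of the slide's temporal structure (not the RED format).
    buckets = {
        "X": ["firing", "parking", "rushing"],   # one narrative container
    }

    relations = [
        ("guarding", "BEFORE/OVERLAP", "firing"),
        ("X",        "CONTAINS",       "firing"),
        ("X",        "CONTAINS",       "parking"),
        ("X",        "CONTAINS",       "rushing"),
        ("firing",   "BEFORE",         "parking"),
        ("parking",  "BEFORE",         "rushing"),
    ]

    for source, rel, target in relations:
        print(f"{source} {rel} {target}")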

  17. Implicit arguments
     - "that guerrillas in one car opened fire on police standing guard"
     - opened fire = aspectual context
       - fire(guerrillas, police)
     - standing guard = support verb construction or aspectual?, reduced relative
       - guard(police, X)

  18. More compelling example (thanks to Vivek Srikumar)
     - "The bomb exploded in a crowded marketplace. Five civilians were killed, including two children. Al Qaeda claimed responsibility."
     - Killed by whom?
     - Responsibility for what?
     - Need recovery of implicit arguments

  19. VerbNet – based on Levin (1993); Kipper et al., LRE 2008
     - Class entries:
       - Capture generalizations about verb behavior
       - Organized hierarchically
       - Members have common semantic elements, semantic roles, syntactic frames, and predicates
     - Verb entries:
       - Refer to a set of classes (different senses)
       - Each class member is linked to WN synset(s), ON groupings, PB frame files, and FrameNet frames

  20. VerbNet: send-11.1 (Members: 11, Frames: 5), includes "ship"
     - Roles:
       - Agent [+animate | +organization]
       - Theme [+concrete]
       - Source [+location]
       - Destination [+animate | [+location & -region]]
     - Syntactic frame: NP V NP PP.destination
       - Example: "Nora sent the book to London."
       - Syntax: Agent V Theme {to} Destination
       - Semantics: motion(during(E), Theme), location(end(E), Theme, Destination), cause(Agent, E)
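NLTK's VerbNet reader can pull this class up directly; a sketch, assuming the bundled VerbNet data is installed (NLTK ships an older VerbNet release, so member lists, roles, and frame details may differ slightly from the version shown on the slide):

    from nltk.corpus import verbnet as vn   # requires: nltk.download('verbnet')

    # Classes for "send", then a pretty-printed view of send-11.1: its members,
    # thematic roles with selectional restrictions, and frames with semantics.
    print(vn.classids('send'))     # expected to include 'send-11.1'
    cls = vn.vnclass('send-11.1')  # the class as an XML element
    print(vn.pprint(cls))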

  21. Recovering Implicit Arguments* (Palmer et al., 1986; Gerber & Chai, 2010, 2012; Ruppenhofer, Sporleder, Morante, Baker & Palmer, 2010, SemEval-2010 Task 10)
     * AKA definite null complements
     - [Arg0 The two companies] [REL1 produce] [Arg1 market pulp, containerboard and white paper]. The goods could be manufactured closer to customers, saving [REL2 shipping] costs.
     - Used VerbNet for subcategorization frames

  22. Implicit arguments
     - SYNTAX: Agent V Theme {to} Destination
       - [AGENT] shipped [THEME] to [DESTINATION]
     - SEMANTICS:
       - CAUSE(AGENT, E)
       - MOTION(DURING(E), THEME)
       - LOCATION(END(E), THEME, DESTINATION)

  23. Implicit arguments instantiated using coreference
     - [AGENT] shipped [THEME] to [DESTINATION]
     - [Companies] shipped [goods] to [customers].
     - SEMANTICS:
       - CAUSE(Companies, E)
       - MOTION(DURING(E), goods)
       - LOCATION(END(E), goods, customers)
     - Can annotate, semi-automatically!
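A minimal sketch of this instantiation step: the VerbNet-style semantics for "ship" leaves the AGENT, THEME, and DESTINATION slots open, and coreference supplies fillers from the surrounding discourse (the function and variable names here are illustrative):

    # VerbNet-style semantics for "ship" with open role slots.
    semantics = [
        ("CAUSE",    ["AGENT", "E"]),
        ("MOTION",   ["DURING(E)", "THEME"]),
        ("LOCATION", ["END(E)", "THEME", "DESTINATION"]),
    ]

    # Fillers recovered by coreference for the "saving shipping costs" example.
    fillers = {"AGENT": "companies", "THEME": "goods", "DESTINATION": "customers"}

    def instantiate(semantics, fillers):
        """Replace open role slots with their coreferent mentions."""
        return [(pred, [fillers.get(arg, arg) for arg in args])
                for pred, args in semantics]

    for pred, args in instantiate(semantics, fillers):
        print(f"{pred}({', '.join(args)})")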

  24. Another type of implicit relation (example from Daniel Marcu, GALE wrap-up meeting)
     - Between Munich and LA you need less than 11 hours by plane.
     - You can fly to Los Angeles from Munchen in no more than eleven hours.
     - From Munich to Los Angeles, it does not take more than eleven hours by plane.

  25. Constructions allow us to
     - Recognize a path prepositional phrase, and that it necessarily goes with a "MOTION" event – caused-motion constructions:
       - John sneezed the tissue off the table.
       - Mary blinked the snow off of her eyelashes.
     - If we detect a MOTION event, we can associate the plane with it as a vehicle
     - Just the plane itself can suggest a motion event…

  26. Construction Grammar
     - In Construction Grammar (Fillmore, 1988; Goldberg, 1995; Kay & Fillmore, 1999; Michaelis, 2004; Goldberg, 2005):
       - constructions are carriers of meaning
       - constructions are assigned meaning in the same way that words are – via convention rather than composition
     - Invaluable resources – FrameNet Constructicon, Cxn Viewer

  27. Introducing a Constructional Layer to VerbNet (Jena Hwang, LREC 2014)
     - Introduce a constructional "layer" to VerbNet, which attaches orthogonally to relevant VerbNet classes
     - Constructional layer:
       - Caused Motion: "He blinked the snow off his eyelashes."; "They hissed him out of the university."
       - Resultative: "The pond froze solid."; "He blinked his eyes dry."
     - Current VN classes: hiccup-40.1.1, weather-57, manner_speaking-37.3

  28. VerbNet can also provide inferences – sometimes…
     - "Every path from back door to yard was covered by a grape-arbor, and every yard had fruit trees."
     - Where are the grape arbors located?
