Deriving Multi-Headed Planar Dependency Parses from Link Grammar Parses
Juneki Hong and Jason Eisner
  1. Deriving Multi-Headed Planar Dependency Parses from Link Grammar Parses. Juneki Hong and Jason Eisner. Outline: Introduction; Motivation, Overview; Link Grammars; ILP Model; Experiments and Results; Conclusions.

  2. Introduction ◮ This talk is about converting from one annotation style to another. ◮ The conversion can be hard when information is fragmented, missing, or ambiguous. ◮ We use a general technique, Integer Linear Programming (ILP), to carry out the conversion.

  3. In Our Case: What We Started With. [Link Grammar parse of "the matter may never even be tried in court ." with undirected edges labeled W, WV, X, I, E, D, S, E, P, MV, J.]

  4. What We Wanted. [The same sentence with directionalized edges: a multi-headed parse.]

  5. Why We Wanted That ◮ We want to develop parsing algorithms for parses that look like this, ◮ but we couldn't figure out where to get the data to test them.


  7. Single-Headedness ◮ Dependency parse treebanks today are either single-headed or not planar. ◮ Stanford Dependencies are multi-headed but not planar. [Example dependency parse of "the matter may never even be tried in court ." with labels ROOT, P, VC, ADV, NMOD, SBJ, ADV, VC, ADV, PMOD over tags DT, NN, MD, RB, RB, VB, VB, IN, NN, ".".] Link Grammar is almost a multi-headed planar corpus! We just need to directionalize the links.

  8. Why Multi-Headedness? Multi-headedness can capture additional linguistic phenomena: ◮ Control ◮ Relativization ◮ Conjunction

  9. Control, Relativization, Conjunction: Control. "Jill likes to skip": Jill is the subject of two verbs. "Jill persuaded Jack to skip": Jack is the object of one verb and the subject of another.

  10. Control, Relativization, Conjunction: Relativization. "The boy that Jill skipped with fell down": the boy is the object of with as well as the subject of fell.

  11. Control, Relativization, Conjunction: Conjunction. "Jack and Jill went up the hill": Jack and Jill serve as the two arguments of and, but are also subjects of went.
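One way to see what multi-headedness buys is to write the control example down as data. A minimal sketch, using our own (head, child) index-pair encoding rather than anything from the paper:

```python
# Illustrative encoding (ours, not the paper's): a multi-headed dependency
# parse as a list of (head, child) word-index pairs. In "Jill likes to skip",
# Jill (0) is the subject of both likes (1) and skip (3), so word 0 has two
# heads, which a single-headed dependency tree cannot express.
words = ["Jill", "likes", "to", "skip"]
edges = [(1, 0), (1, 2), (2, 3), (3, 0)]  # likes->Jill, likes->to, to->skip, skip->Jill

# Collect each word's heads.
heads = {c: [h for h, c2 in edges if c2 == c] for c in range(len(words))}
print(words[0], "has heads:", [words[h] for h in heads[0]])
```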

  15. Motivation ◮ A multi-headed dependency corpus would be useful for testing new parsing algorithms. ◮ Such a corpus could be automatically annotated using Integer Linear Programming. ◮ We explored whether Link Grammar could be adapted for this purpose. ◮ The results are mixed, but they provide a good case study.

  16. Corpus-Building Strategy ◮ We start with some sentences and parse them with the LG parser. ◮ We take the undirected parses and try to directionalize them. ◮ We use an ILP to assign consistent directions to each link type.
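The three steps above can be sketched as a pipeline. Everything here is a hypothetical stand-in: `parse_with_lg` and `solve_orientation_ilp` are placeholder callables for the real LG parser and the ZIMPL/SCIP solve, and the (left, right, label) link encoding is our own:

```python
def build_corpus(sentences, parse_with_lg, solve_orientation_ilp):
    """Hypothetical sketch of the corpus-building strategy (not real APIs)."""
    # Step 1: parse each sentence into undirected links (left, right, label),
    # with left < right as word indices.
    undirected = [parse_with_lg(s) for s in sentences]
    # The ILP (step 3) picks one direction per link label, consistently
    # across the whole corpus, e.g. {"S": "left-head"}.
    directions = solve_orientation_ilp(undirected)
    # Step 2: directionalize every link according to its label's direction.
    return [
        [(i, j, lab) if directions[lab] == "left-head" else (j, i, lab)
         for i, j, lab in parse]
        for parse in undirected
    ]

# Tiny demonstration with stub callables:
out = build_corpus(["the matter"],
                   lambda s: [(0, 1, "D")],
                   lambda parses: {"D": "right-head"})
print(out)  # [[(1, 0, 'D')]]: the right word heads the D link
```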

  17. Link Grammars. A grammar-based formalism for projective dependency parsing with undirected links. The original formalism and the English Link Grammar were created by Davy Temperley, Daniel Sleator, and John Lafferty (1991).

  18. Link Grammars: How They Work. [Figure clipped from the original Link Grammar paper, "Parsing English with a Link Grammar" by Sleator and Temperley.]

  19. Link Grammars: How They Work. [Figure continued.]

  20. Link Grammars: How They Work. [Figure continued.]

  21. Link Grammars: Same Example Parse From Before. [Link Grammar parse of "the matter may never even be tried in court .", a sentence from the Penn Treebank, with links W, WV, X, I, E, D, S, E, P, MV, J.]

  22. Link Grammars. Compare the resulting dependency parse with the CoNLL 2007 shared task annotation. [Top half: the directionalized link parse (W, WV, X, I, E, D, S, E, P, MV, J); bottom half: the CoNLL parse (NMOD, SBJ, ADV, VC, ADV, PMOD, ADV, VC, P, ROOT) of "the matter may never even be tried in court ."]


  24. What Is Integer Linear Programming? ◮ An optimization problem in which some or all of the variables are integers. ◮ The objective function and constraints are linear. ◮ In general it is NP-hard, but good solvers exist that work well most of the time. ◮ Our ILP is encoded as a ZIMPL program and solved using the SCIP Optimization Suite (http://scip.zib.de/).
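For concreteness, here is a toy ILP, brute-forced over its tiny feasible set. This is purely illustrative (the paper's actual model is the link-orientation ILP solved with SCIP); the objective and constraints below are made up for the example:

```python
from itertools import product

# Toy integer linear program:
#   maximize 3x + 2y   subject to   x + y <= 4,
# with integer bounds 0 <= x <= 3 and 0 <= y <= 4.
feasible = [(x, y) for x, y in product(range(4), range(5)) if x + y <= 4]
best = max(feasible, key=lambda p: 3 * p[0] + 2 * p[1])
print(best, "objective:", 3 * best[0] + 2 * best[1])  # (3, 1) objective: 11
```

A real solver finds such optima by branch-and-bound rather than enumeration, which is what makes ILP practical despite NP-hardness.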

  28. Integer Linear Programming Model. Encoded constraints: ◮ Acyclicity (no cycles!) ◮ Connectedness (every word is reachable from a root) ◮ Consistency of directionalized links (similar links oriented the same way)
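The three constraints can be stated as checks on a candidate directionalized parse. This is our own sketch of what the ILP enforces, using an assumed (head, child, label) edge encoding over word indices 0..n-1 with word 0 as the root; it is not the paper's code:

```python
from collections import defaultdict

def is_acyclic(edges, n):
    """Acyclicity: no directed cycle among the n words (DFS with colors)."""
    adj = defaultdict(list)
    for h, c, _ in edges:
        adj[h].append(c)
    state = [0] * n  # 0 = unvisited, 1 = on current DFS path, 2 = finished
    def dfs(u):
        state[u] = 1
        for v in adj[u]:
            if state[v] == 1:  # back edge: a directed cycle
                return False
            if state[v] == 0 and not dfs(v):
                return False
        state[u] = 2
        return True
    for u in range(n):
        if state[u] == 0 and not dfs(u):
            return False
    return True

def is_connected(edges, n, root=0):
    """Connectedness: every word reachable from the root via directed edges."""
    adj = defaultdict(list)
    for h, c, _ in edges:
        adj[h].append(c)
    seen, stack = {root}, [root]
    while stack:
        for v in adj[stack.pop()]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n

def is_consistent(edges):
    """Consistency: each link label always puts its head on the same side."""
    side = {}
    for h, c, lab in edges:
        s = h < c  # True: head is the left word
        if side.setdefault(lab, s) != s:
            return False
    return True
```

The ILP enforces these properties globally while choosing one orientation per link type; the functions above merely verify a finished assignment.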

  29. Integer Linear Programming Model. For each sentence, for each edge {i, j} with i < j carrying some link label L, we introduce variables x_ij, x_ji ∈ Z≥0 giving the orientation of the link, with the constraint x_ij + x_ji = 1.
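Reconstructed from the slide's garbled math, the variable declaration and its exclusivity constraint read (our LaTeX rendering; symbols as on the slide):

```latex
% For each undirected link \{i, j\} with i < j in a sentence:
x_{ij},\; x_{ji} \in \mathbb{Z}_{\ge 0}, \qquad x_{ij} + x_{ji} = 1 .
% Nonnegative integers summing to 1 force both variables into \{0, 1\}:
% x_{ij} = 1 orients the link as i \to j, and x_{ji} = 1 as j \to i.
```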
