
Parsing to Stanford Dependencies: Trade-offs between speed and accuracy - PowerPoint PPT Presentation



  1. Parsing to Stanford Dependencies: Trade-offs between speed and accuracy. Daniel Cer, Marie-Catherine de Marneffe, Daniel Jurafsky, Christopher D. Manning

  2. Stanford Dependencies: Overview. About the representation: widely used, semantically oriented, slow to extract. Extraction bottleneck: the Stanford lexicalized phrase structure parser. Are there faster and better approaches? Dependency parsing algorithms; alternate phrase structure parsers.

  3. Stanford Dependencies: Organization. Brief review of Stanford Dependencies: properties, extraction pipeline. Experiments comparing parsing approaches: dependency (MaltParser, MSTParser) and phrase structure (Berkeley, Bikel, Charniak). Search space pruning with Charniak-Johnson.

  4-6. Stanford Dependencies: What We'll Show. Performing dependency parsing using a phrase structure parser followed by rule-based extraction is more accurate and, in some cases, faster than using statistical dependency parsing algorithms. This holds for English using the Stanford Dependency formalism; however, we suspect the results may be more general.

  7. Stanford Dependencies: Semantically Oriented. Capture relationships between content words. [Figure: syntactic dependencies vs. Stanford Dependencies.]

  8-9. Stanford Dependencies: Basic Dependencies. Start out by extracting syntactic heads; this results in a projective dependency tree.

  10-11. Stanford Dependencies: Collapsed Dependencies. Example phrase: "Bills on ports and immigration" (relations listed below).
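The deck shows this example as a dependency graph; the relations below are a textual stand-in following the standard Stanford Dependencies treatment of the same phrase (reconstructed for reference, not copied from the slides):

    Basic:      prep(Bills, on), pobj(on, ports), cc(ports, and), conj(ports, immigration)
    Collapsed:  prep_on(Bills, ports), conj_and(ports, immigration),
                plus prep_on(Bills, immigration) once conjunct dependencies are propagated

Collapsing folds prepositions and conjunctions into the relation names, so the remaining dependencies hold directly between content words.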

  12-16. Stanford Dependencies: Obtaining the Dependencies. Standard pipeline: sentence → phrase structure parser → constituent parse tree → projective basic dependencies → collapsed dependencies (sketched in code after slide 19).

  17-19. Stanford Dependencies: Obtaining the Dependencies. Direct pipeline: sentence → dependency parser → projective basic dependencies → collapsed dependencies.
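The standard pipeline maps directly onto the Stanford Parser's Java API. Below is a minimal sketch modeled on the ParserDemo class that ships with the parser, assuming a reasonably recent release with the English PCFG model on the classpath; exact class names (e.g. Sentence vs. SentenceUtils) and the model path vary by version, so treat it as illustrative rather than code from the talk.

    import java.util.Collection;
    import java.util.List;

    import edu.stanford.nlp.ling.CoreLabel;
    import edu.stanford.nlp.ling.Sentence;             // called SentenceUtils in newer releases
    import edu.stanford.nlp.parser.lexparser.LexicalizedParser;
    import edu.stanford.nlp.trees.GrammaticalStructure;
    import edu.stanford.nlp.trees.GrammaticalStructureFactory;
    import edu.stanford.nlp.trees.Tree;
    import edu.stanford.nlp.trees.TreebankLanguagePack;
    import edu.stanford.nlp.trees.TypedDependency;

    public class StandardPipelineSketch {
        public static void main(String[] args) {
            // Sentence -> phrase structure parser (assumed model path from the CoreNLP models jar).
            LexicalizedParser parser = LexicalizedParser.loadModel(
                "edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz");

            List<CoreLabel> words = Sentence.toCoreLabelList(
                "Bills", "on", "ports", "and", "immigration", "were", "submitted", ".");

            // Phrase structure parser -> constituent parse tree.
            Tree constituentTree = parser.apply(words);

            // Constituent parse tree -> rule-based extraction of Stanford Dependencies.
            TreebankLanguagePack tlp = parser.treebankLanguagePack();
            GrammaticalStructureFactory gsf = tlp.grammaticalStructureFactory();
            GrammaticalStructure gs = gsf.newGrammaticalStructure(constituentTree);

            Collection<TypedDependency> basic = gs.typedDependencies();                // projective basic dependencies
            Collection<TypedDependency> collapsed = gs.typedDependenciesCCprocessed(); // collapsed, with conjunct propagation

            System.out.println("Basic:     " + basic);
            System.out.println("Collapsed: " + collapsed);
        }
    }

The direct pipeline replaces the first two steps with a trained dependency parser (e.g. MaltParser or MSTParser) that emits the projective basic dependencies directly, after which the same collapsing rules can be applied.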

  20. Experimental Results by Pipeline Type

  21. Results by Pipeline Type: Method. Train: Penn Treebank sections 2 through 21; test: Penn Treebank section 22. Dependency parsers: MaltParser (algorithms: Nivre Eager, Nivre, Covington; classifiers: LibLinear, LibSVM), MSTParser (Eisner algorithm; factored MIRA), and RelEx (CMU Link Grammar parser in Stanford compatibility mode).

  22. Results by Pipeline Type: Method. Phrase structure parsers: Charniak, Charniak-Johnson reranking, Bikel, Berkeley, and Stanford.

  23. Results by Pipeline Type: Phrase Structure Parser Accuracy. [Bar chart: labeled attachment F1, roughly 82-89, across the phrase structure parsers.] Best: the Charniak-Johnson reranking parser.

  24. Results by Pipeline Type: Phrase Structure Parser Speed. [Bar chart: sentences per second, 0-3, for the Berkeley, Stanford, Charniak-Johnson, and Bikel parsers.] Parse times are similar except for Bikel.

  25. Results by Pipeline Type: Dependency Parser Accuracy. [Bar chart: labeled attachment F1, roughly 74-81, for Nivre Eager (LibSVM), MSTParser Eisner, and Nivre Eager (LibLinear).] Best: Nivre Eager with LibSVM.

  26. Results by Pipeline Type: Dependency Parser Speed. [Bar chart: sentences per second, 0-120, for Nivre Eager (LibLinear), Nivre Eager (LibSVM), and MSTParser Eisner.] Fastest: Nivre Eager with LibLinear.

  27. Comparison of Speed and Accuracy Trade-Offs

  28. Speed and Accuracy Trade-Offs: Worst vs. Best Accuracy. Worst phrase structure parser vs. best dependency parser.

  29-30. Speed and Accuracy Trade-Offs: Worst vs. Best Accuracy. [Bar chart: labeled attachment F1, roughly 76-86, for Bikel and Stanford (phrase structure) vs. Nivre Eager LibSVM and MSTParser Eisner (dependency); about a 3-point gap.] The worst phrase structure parser is better than the best dependency parser.

  31. Speed and Accuracy Trade-Offs: Best vs. Best Accuracy. Best phrase structure parser vs. best dependency parser.

  32. Speed and Accuracy Trade-Offs: Best vs. Best Accuracy. [Bar chart: labeled attachment F1, roughly 76-90, for Charniak-Johnson and Berkeley (phrase structure) vs. Nivre Eager LibSVM and MSTParser Eisner (dependency).] An eight-point difference between the best phrase structure parser and the best dependency parser.

  33. Speed and Accuracy Trade-Offs: Worst vs. Best Speed. Worst dependency parser vs. best phrase structure parser.

  34-35. Speed and Accuracy Trade-Offs: Worst vs. Best Speed. [Bar chart: sentences per second, 0-9, for Nivre Eager LibSVM and MSTParser Eisner (dependency) vs. Berkeley and Stanford (phrase structure); about a 2 sentences/second gap.] The worst dependency parser is still faster than the best phrase structure parser.

  36. Speed and Accuracy Trade-Offs: Best vs. Best Speed. Best dependency parser vs. best phrase structure parser.

  37-38. Speed and Accuracy Trade-Offs: Best vs. Best Speed. [Bar chart: sentences per second, 0-120, for Nivre Eager LibLinear and Nivre Eager LibSVM (dependency) vs. Berkeley and Stanford (phrase structure).] A 103 sentences/second difference between the best dependency parser and the best phrase structure parser.

  39. Speed and Accuracy Trade-Offs: Out-of-the-Box Summary. For accuracy, use phrase structure parsers; best choice: Charniak-Johnson reranking. For speed, use dependency parsers; best choice: Nivre Eager* with LibLinear. (*Actually, any parser in the MaltParser package will do.)

  40. Making Use of Charniak-Johnson Search Space Pruning

  41-47. Charniak-Johnson Search Space Pruning: Example Search. [Animation frames illustrating best-first search.]

  48. Charniak-Johnson Search Space Pruning: Example Search. First complete parse!

  49. Charniak-Johnson Search Space Pruning: Example Search. After the first complete parse: count the edges expanded so far, then expand (edge count × pruning constant) more edges, where the pruning constant = T parameter / 10 (see the sketch after slide 53).

  50-53. Charniak-Johnson Search Space Pruning: Example Search. [Animation frames: expanding edge count × constant more edges.]
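To make the slide 49 rule concrete: with T = 50 the pruning constant is 5, so if the first complete parse appears after 200 edge expansions, the parser allows 1,000 further expansions before stopping. The toy Java sketch below illustrates only that budgeting logic over a generic best-first agenda; it is not the Charniak-Johnson parser's code, and the Edge class and random scores are invented for illustration.

    import java.util.PriorityQueue;
    import java.util.Random;

    /**
     * Toy illustration of the slide-49 pruning rule: run best-first search until
     * the first complete parse, note how many edges have been expanded (n), then
     * allow only n * (T / 10) additional expansions before stopping.
     */
    public class PruningBudgetSketch {

        /** Minimal stand-in for a chart edge: just a score and a completeness flag. */
        static final class Edge implements Comparable<Edge> {
            final double score;
            final boolean completeParse;
            Edge(double score, boolean completeParse) {
                this.score = score;
                this.completeParse = completeParse;
            }
            public int compareTo(Edge other) {          // higher score is expanded first
                return Double.compare(other.score, this.score);
            }
        }

        static int runWithPruning(PriorityQueue<Edge> agenda, double t) {
            double pruningConstant = t / 10.0;          // slide 49: constant = T / 10
            long budget = Long.MAX_VALUE;               // unlimited until the first complete parse
            int expanded = 0;
            while (!agenda.isEmpty() && expanded < budget) {
                Edge edge = agenda.poll();
                expanded++;
                if (edge.completeParse && budget == Long.MAX_VALUE) {
                    // First complete parse found: permit expanded * (T / 10) more expansions.
                    budget = expanded + (long) (expanded * pruningConstant);
                }
                // A real parser would combine `edge` with chart entries here and
                // push the newly built edges back onto the agenda.
            }
            return expanded;
        }

        public static void main(String[] args) {
            Random rng = new Random(0);
            PriorityQueue<Edge> agenda = new PriorityQueue<>();
            for (int i = 0; i < 10000; i++) {
                // Mark a small fraction of edges as "complete parses" at random.
                agenda.add(new Edge(rng.nextDouble(), rng.nextDouble() < 0.001));
            }
            System.out.println("Edges expanded with T = 210: " + runWithPruning(agenda, 210));
            // Note: the agenda is consumed; rebuild it before trying another T value.
        }
    }

Larger T values allow more of the search space to be explored after the first complete parse, trading speed for a lower risk of pruning away the best parse; slide 54 shows the accuracy effect.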

  54. Charniak-Johnson Search Space Pruning: Pruning Effects on Accuracy. [Bar chart: labeled attachment F1, roughly 70-90, for the default setting and progressively stronger pruning (T210, T50, T10, ...).] Minimal loss of accuracy for moderate pruning.
