Structural Neural Encoders for AMR-to-text Generation (NAACL 2019)


  1. Structural Neural Encoders for AMR-to-text Generation NAACL 2019 Marco Damonte, Shay Cohen School of Informatics, University of Edinburgh, UK 1 / 23

  2. Abstract Meaning Representation (AMR): [AMR graph: eat-01 with :ARG0 he, :ARG1 pizza, :instrument finger; finger :part-of he] representing "He ate the pizza with his fingers." 2 / 23

  3. AMR-to-text generation (English): [AMR graph for "He ate the pizza with his fingers."] → He ate the pizza with his fingers. 3 / 23

  4. AMR-to-text generation (English): [AMR graph for "He ate the pizza with his fingers."] → He ate the pizza with his fingers. 4 / 23

  5. Previous work: [AMR graph for "He ate the pizza with his fingers."] • Konstas et al. (2017): sequential encoder; • Song et al. (2018), Beck et al. (2018): graph encoders. 5 / 23

  6. This work: [AMR graph for "He ate the pizza with his fingers."] • Are the improvements from graph encoders due to reentrancies? • To answer, compare: (1) Sequence: BiLSTM; (2) Tree: TreeLSTM (Tai et al., 2015); (3) Graph: Graph Convolutional Network (GCN; Kipf and Welling, 2017). 6 / 23

  7. Sequential input (Konstas et al., 2017): the AMR graph is linearized into the token sequence "eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he". 7 / 23
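As a concrete illustration of this linearization, below is a minimal Python sketch (not the authors' code; the nested-tuple AMR format and the function name `linearize` are assumptions made for this example) that walks the AMR depth-first and emits the token sequence shown on the slide. Note that the reentrant node "he" is emitted twice, so the co-reference is lost in the sequential view.

```python
# Hypothetical sketch of a depth-first linearization of the slide's AMR into
# the sequence "eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he".

def linearize(node):
    """node = (concept, [(relation, child_node), ...]) -> list of tokens."""
    concept, children = node
    tokens = [concept]
    for relation, child in children:
        tokens.append(relation)
        tokens.extend(linearize(child))
    return tokens

amr = ("eat-01", [
    (":arg0", ("he", [])),
    (":arg1", ("pizza", [])),
    (":instrument", ("finger", [(":part-of", ("he", []))])),
])

print(" ".join(linearize(amr)))
# eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he
```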

  8. Sequential input (Konstas et al., 2017): [diagram: a BiLSTM runs over the linearized tokens eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he] 8 / 23
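For reference, a minimal PyTorch sketch of a BiLSTM encoder over the linearized tokens; the vocabulary construction, embedding size, and hidden size are illustrative assumptions, not the paper's implementation. Each token receives a contextual vector that an attention-based decoder would consume.

```python
# Minimal sketch (not the authors' code) of a BiLSTM over the linearized AMR.
import torch
import torch.nn as nn

tokens = "eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he".split()
vocab = {tok: i for i, tok in enumerate(dict.fromkeys(tokens))}
ids = torch.tensor([[vocab[t] for t in tokens]])          # (batch=1, seq_len)

embed = nn.Embedding(len(vocab), 64)
bilstm = nn.LSTM(input_size=64, hidden_size=128,
                 batch_first=True, bidirectional=True)

states, _ = bilstm(embed(ids))                            # (1, seq_len, 256)
print(states.shape)  # one contextual vector per token, fed to the decoder
```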

  9. Sequential input (Konstas et al., 2017): the AMR graph is linearized into the token sequence "eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he". 9 / 23

  10. Tree-structured input: the AMR graph is converted into a tree by duplicating the reentrant node "he" (tokens: eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he). 10 / 23

  11. Tree-structured input: [diagram: a TreeLSTM over the AMR tree, combined with a BiLSTM over the linearized tokens] 11 / 23
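Below is a minimal sketch of the Child-Sum TreeLSTM cell of Tai et al. (2015), which the tree encoder applies bottom-up over the reentrancy-free AMR tree. The dimensions and the leaf-node handling are illustrative assumptions; this is not the authors' implementation.

```python
# Minimal sketch of a Child-Sum TreeLSTM cell (Tai et al., 2015).
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    def __init__(self, in_dim, mem_dim):
        super().__init__()
        self.iou_x = nn.Linear(in_dim, 3 * mem_dim)            # input/output/update gates from x
        self.iou_h = nn.Linear(mem_dim, 3 * mem_dim, bias=False)
        self.f_x = nn.Linear(in_dim, mem_dim)                   # one forget gate per child
        self.f_h = nn.Linear(mem_dim, mem_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (in_dim,); child_h, child_c: (num_children, mem_dim), empty for leaves
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.iou_x(x) + self.iou_h(h_sum), 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))      # (num_children, mem_dim)
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c

# Example: a leaf node such as "he" has no children, so the child tensors are empty.
cell = ChildSumTreeLSTMCell(in_dim=64, mem_dim=128)
h_leaf, c_leaf = cell(torch.randn(64), torch.zeros(0, 128), torch.zeros(0, 128))
```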

  12. Tree-structured input: [diagram: a GCN over the AMR tree, combined with a BiLSTM over the linearized tokens] 12 / 23

  13. Tree-structured input: the AMR graph is converted into a tree by duplicating the reentrant node "he" (tokens: eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he). 13 / 23

  14. Graph-structured input: the AMR graph is used directly, keeping "he" as a single reentrant node (tokens: eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he). 14 / 23

  15. Graph-structured input: [diagram: a GCN over the AMR graph, combined with a BiLSTM over the linearized tokens] 15 / 23
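To make the difference between the tree and graph inputs concrete, here is an illustrative sketch; the node names with "#" suffixes and the edge-list format are assumptions for this example, not the paper's data structures. The tree view duplicates the reentrant "he" node, while the graph view keeps a single "he" node reached both via :arg0 and via :part-of.

```python
# Illustrative contrast between the tree and graph views of the same AMR.

# Tree view: the reentrancy is resolved by copying the node "he".
tree_edges = [
    ("eat-01", ":arg0", "he#1"),
    ("eat-01", ":arg1", "pizza"),
    ("eat-01", ":instrument", "finger"),
    ("finger", ":part-of", "he#2"),     # duplicated copy of "he"
]

# Graph view: one node per concept, so "he" has two incoming edges.
graph_edges = [
    ("eat-01", ":arg0", "he"),
    ("eat-01", ":arg1", "pizza"),
    ("eat-01", ":instrument", "finger"),
    ("finger", ":part-of", "he"),       # reentrant edge back to the same node
]
```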

  16. Data • AMR R2: 39,260 sentences • AMR R1: 19,572 sentences (a subset of R2) 16 / 23

  17. Comparison between models (dev set, R1), BLEU: Seq 21.40; TreeLSTM 22.26; GCN-Tree 23.62; GCN-Graph 23.95. 17 / 23

  18. Comparison with previous work (test set, R1), BLEU: Konstas (seq) 22.00; Song (graph) 23.30; GCN-Tree 23.93; GCN-Graph 24.40. Konstas: sequential baseline, Konstas et al. (2017); Song: graph encoder (GRN), Song et al. (2018). 18 / 23

  19. Comparison with previous work (test set, R2), BLEU: Beck (graph) 23.30; GCN-Tree 23.62; GCN-Graph 24.54. Beck: graph encoder (GGNN), Beck et al. (2018). 19 / 23

  20. Reentrancies: He ate the pizza with his fingers. (eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he)
      Scores by number of reentrancies in the input AMR (sentence counts in parentheses); Tree and Graph are shown as differences from Seq:
      Model   0 (619)   1-5 (679)   6-20 (70)
      Seq     42.94     31.64       23.33
      Tree    +0.63     +1.41       +0.76
      Graph   +1.67     +1.54       +3.08
      20 / 23

  21. Long-range dependencies: He ate the pizza with a fork. (eat-01 :arg0 he :arg1 pizza :instrument fork)
      Scores by maximum dependency length (sentence counts in parentheses); Tree and Graph are shown as differences from Seq:
      Model   0-10 (307)   11-50 (297)   51-200 (18)
      Seq     50.49        36.28         24.14
      Tree    -0.48        +1.66         +2.37
      Graph   +1.22        +2.05         +3.04
      21 / 23

  22. Generation example: [AMR graph with concepts tell-01, you, person, need-01, go-06, have-rel-role-91, communicate-01, all, lawyer, significant-other, ex]
      REF:   tell your ex that all communication needs to go through the lawyer
      Seq:   tell that all the communication go through lawyer
      Tree:  tell your ex , tell your ex , the need for all the communication
      Graph: tell your ex the need to go through a lawyer
      22 / 23

  23. Conclusions • Graph encoders that combine a GCN with a BiLSTM give the best results for AMR-to-text generation; • Reentrancies and long-range dependencies contribute to the improvements of graph encoders; • Demo and source code: http://cohort.inf.ed.ac.uk/amrgen.html 23 / 23

  24. (Backup slides) 1 / 8

  25. Do reentrancies help with generating pronouns? He ate the pizza with his fingers. (eat-01 :arg0 he :arg1 pizza :instrument finger :part-of he) Contrastive pair analysis (Sennrich, 2017): • Compute the probability the model assigns to the reference output sentence and to a version of it containing a mistake; • Report the accuracy with which the model assigns the higher probability to the reference sentence. 2 / 8
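A minimal sketch of the contrastive-pair evaluation described above. The scoring function `sentence_logprob(amr, sentence)` is a hypothetical stand-in for the trained AMR-to-text model's log-probability of a sentence given the AMR, and the example pair is taken from the next slide.

```python
# Sketch of contrastive-pair evaluation (Sennrich, 2017); sentence_logprob is
# a hypothetical model-scoring function, not part of the paper's released code.

def contrastive_accuracy(pairs, sentence_logprob):
    """pairs: list of (amr, reference_sentence, corrupted_sentence)."""
    correct = 0
    for amr, reference, corrupted in pairs:
        if sentence_logprob(amr, reference) > sentence_logprob(amr, corrupted):
            correct += 1
    return correct / len(pairs)

# Example pair: the possessive pronoun "his" corrupted to "her".
pairs = [("<amr>", "He ate the pizza with his fingers .",
                   "He ate the pizza with her fingers .")]
```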

  26. Do reentrancies help with generating pronouns? Contrastive pairs replace the pronoun in the reference, e.g.:
      He ate the pizza with his fingers → He ate the pizza with he fingers
      He ate the pizza with his fingers → He ate the pizza with him fingers
      He ate the pizza with his fingers → He ate the pizza with their fingers
      He ate the pizza with his fingers → He ate the pizza with her fingers
      Accuracy by error category (number of contrastive pairs in parentheses):
      Model   Antecedent (251)   Type (912)   Num. (1840)   Gender (95)
      Seq     96.02              97.70        94.89         94.74
      Tree    96.02              96.38        93.70         92.63
      Graph   96.02              96.49        95.11         95.79
      3 / 8

  27. Encoder comparison (dev set, R1):
      Input   Model         BLEU    Meteor
      Seq     Seq           21.40   22.00
      Tree    SeqTreeLSTM   21.84   22.34
      Tree    TreeLSTMSeq   22.26   22.87
      Tree    TreeLSTM      22.07   22.57
      Tree    SeqGCN        21.84   22.21
      Tree    GCNSeq        23.62   23.77
      Tree    GCN           15.83   17.76
      Graph   SeqGCN        22.06   22.18
      Graph   GCNSeq        23.95   24.00
      Graph   GCN           15.94   17.76
      4 / 8

  28. More examples. The GCN layer used by the graph encoder:
      h_i^{(k+1)} = \sigma\big( \sum_{j \in \mathcal{N}(i)} W^{(k)}_{\mathrm{dir}(j,i)} h_j^{(k)} + b^{(k)} \big)    (1)
      5 / 8
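A minimal PyTorch sketch of the layer in Eq. (1): each node sums messages from its neighbours, with a separate weight matrix chosen by edge direction, adds a bias, and applies a nonlinearity. The direction labels, the handling of self-loops, and the choice of sigmoid for σ are illustrative assumptions; this is not the authors' implementation.

```python
# Sketch of a direction-aware GCN layer in the spirit of Eq. (1).
import torch
import torch.nn as nn

class DirectionalGCNLayer(nn.Module):
    """h_i' = sigma( sum_{j in N(i)} W_dir(j,i) h_j + b )."""
    def __init__(self, dim, directions=("self", "in", "out")):
        super().__init__()
        self.weights = nn.ModuleDict(
            {d: nn.Linear(dim, dim, bias=False) for d in directions})
        self.bias = nn.Parameter(torch.zeros(dim))

    def forward(self, h, edges):
        # h: (num_nodes, dim); edges: (i, j, direction) meaning j is a neighbour of i
        messages = [[self.weights["self"](h[i])] for i in range(h.size(0))]
        for i, j, direction in edges:
            messages[i].append(self.weights[direction](h[j]))
        out = torch.stack([torch.stack(m).sum(dim=0) for m in messages])
        return torch.sigmoid(out + self.bias)

# Toy AMR graph: 0=eat-01, 1=he, 2=pizza, 3=finger; the (3, 1) edges encode
# the reentrancy finger -:part-of-> he.
layer = DirectionalGCNLayer(dim=64)
h0 = torch.randn(4, 64)
edges = [(0, 1, "out"), (1, 0, "in"), (0, 2, "out"), (2, 0, "in"),
         (0, 3, "out"), (3, 0, "in"), (3, 1, "out"), (1, 3, "in")]
h1 = layer(h0, edges)   # (4, 64) updated node representations
```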

  29. More examples
      REF:   i dont tell him but he finds out .
      Seq:   i didn't tell him but he was out .
      Tree:  i don't tell him but found out .
      Graph: i don't tell him but he found out .
      6 / 8

  30. More examples
      REF:   if you tell people they can help you ,
      Seq:   if you tell him , you can help you !
      Tree:  if you tell person_name you , you can help you .
      Graph: if you tell them , you can help you .
      7 / 8

  31. More examples
      REF:   i 'd recommend you go and see your doctor too.
      Seq:   i recommend you go to see your doctor who is going to see your doctor.
      Tree:  you recommend going to see your doctor too.
      Graph: i recommend you going to see your doctor too.
      8 / 8
