
Argument-Level Interactions for Persuasion Comments Evaluation using Co-attention. Lu Ji, Zhongyu Wei, Xiangkun Hu, Yang Liu, Qi Zhang, Xuanjing Huang. COLING 2018, 2018.7.25


  1. Argument-Level Interactions for Persuasion Comments Evaluation using Co-attention. Lu Ji, Zhongyu Wei, Xiangkun Hu, Yang Liu, Qi Zhang, Xuanjing Huang. Seminar on Frontier Techniques in Natural Language Processing and COLING 2018 Paper Preview Session. 2018.7.25

  2. Outline
     1 Introduction and Motivation
     2 Our Approach
     3 Experiments
     4 Analysis on Co-attention Network
     5 Conclusion

  3. Introduction and Motivation
  Computational argumentation:
  - argument unit detection (Al-Khatib et al., 2016)
  - argument structure prediction (Peldszus and Stede, 2015)
  - argumentation scheme classification (Feng et al., 2014)
  - ...
  Online debating forums: http://convinceme.net, http://debatepedia.idebate.org, https://reddit.com/r/changemyview

  4. Introduction and Motivation: Change My View (CMV) on Reddit.com

  5. Introduction and Motivation: Related Work
  - Winning Arguments: Interaction Dynamics and Persuasion Strategies in Good-faith Online Discussions (Tan et al., 2016)
  - Is This Post Persuasive? Ranking Argumentative Comments in Online Forum (Wei and Liu, 2016)
  Common ground:
  - Both construct datasets from the ChangeMyView subreddit.
  - Labels are given by the online debating forum.
  - Both design interaction-based features and argument-only features at the word level to judge whether a post is persuasive.

  6. Introduction and Motivation: Related Work
  - What Makes a Convincing Argument? Empirical Analysis and Detecting Attributes of Convincingness in Web Argumentation (Habernal and Gurevych, 2016)
  - Why Can't You Convince Me? Modeling Weaknesses in Unpersuasive Arguments (Persing and Ng, 2017)
  Common ground:
  - Both construct datasets from online debating forums.
  - Both annotate a corpus of debate comments.
  - Both design argumentation-based features at the word level to analyze the persuasiveness of arguments.

  7. Introduction and Motivation: Related Work
  - Argument-related features are considered only at the word level.
  - Interaction between the two participants is largely ignored.
  - Our work represents debate comments in units of arguments and studies their interaction at the argument level.
  - The stronger the interaction, the stronger the persuasiveness of the text.

  8. Introduction and Motivation
  Original Post: Philosophy doesn't seem to have any practical applications. [What value does philosophy have in the modern age, right now, aside from contemplating things?] I have read the argument that it is impossible to argue that philosophy is useless without using philosophy. [What do you gain from studying philosophy that could not be gained from thoughtful introspection?]
  Positive Reply: What do you gain from studying philosophy that could not be gained from thoughtful introspection? [Two answers. #1 rigor and #2 it saves us from reinventing the wheel.] [Why do you think we should start from scratch in all value decisions rather than seeking to understand the work that has been done in the past?]
  Negative Reply: What do you gain from studying philosophy that could not be gained from thoughtful introspection? [Ask yourself the same question about math.] Your argument seems to be that studying philosophy is a waste of time because it has no practical use.
  Figure: An example of dialogical argumentation.

  9. Outline
     1 Introduction and Motivation
     2 Our Approach
     3 Experiments
     4 Analysis on Co-attention Network
     5 Conclusion

  10. System Framework Overall architecture of the proposed model.

  11. System Framework
  [Figure: Overall architecture of the proposed model. Word-level GRUs build an argument vector for each argument of the original post and the reply; the post arguments {o_i} (i=1..n) and reply arguments {r_j} (j=1..m) feed the co-attention network; attention pooling yields the post summary u; an aggregation network with attention pooling, combined with extra features, feeds a dense layer that outputs the score S(OP, R).]

  13. System Framework
  [Figure: The detailed structure of the co-attention network. An alignment matrix between post arguments and reply arguments is normalized with softmax in two directions, giving reply argument to post argument attention and post argument to reply argument attention; a third component, post to reply argument attention, weights the reply arguments {r_j} against the pooled post vector u.]

  14. System Framework
  - Reply argument to post argument attention computes the relevance of each reply argument to every post argument and obtains a set of new post representations.
  - Post argument to reply argument attention computes the relevance of each post argument to every reply argument and helps learn a set of new reply representations.
  - Post to reply argument attention computes the relevance of each reply argument to the entire post, which helps learn a single new reply representation.
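The three attention directions above can be sketched with a plain dot-product alignment matrix. This is a minimal NumPy sketch, not the paper's exact formulation: the alignment scoring function, the pooling that produces `u`, and all learned parameters are simplified assumptions here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(O, R, u):
    # O: (n, d) post argument vectors; R: (m, d) reply argument vectors;
    # u: (d,) pooled representation of the whole post (assumed given).
    M = O @ R.T                           # (n, m) alignment matrix
    # Reply argument to post argument attention: each reply argument
    # attends over all post arguments -> m new post representations.
    new_post = softmax(M.T, axis=-1) @ O  # (m, d)
    # Post argument to reply argument attention: each post argument
    # attends over all reply arguments -> n new reply representations.
    new_reply = softmax(M, axis=-1) @ R   # (n, d)
    # Post to reply argument attention: one weight per reply argument
    # against the pooled post vector -> a single reply representation.
    w = softmax(R @ u)                    # (m,)
    reply_summary = w @ R                 # (d,)
    return new_post, new_reply, reply_summary
```

Each softmax row sums to one, so every output vector is a convex combination of the argument vectors it attends over.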



  22. Outline
     1 Introduction and Motivation
     2 Our Approach
     3 Experiments
     4 Analysis on Co-attention Network
     5 Conclusion

  23. Experiments
  Dataset:
  - /r/ChangeMyView subreddit
  - 3,456 training instances and 807 testing instances (Tan et al., 2016)

                    |     Training Set          |       Test Set
                    | Ave_w Var_w  Ave_p Var_p  | Ave_w Var_w  Ave_p Var_p
    Original post   |  10    49.5   14   163.7  |  11    53.2   15   133.7
    Positive reply  |  10    46.3   14   125.0  |  10    44.1   13   123.8
    Negative reply  |  10    39.2   11    82.0  |  10    44.7   10    69.5

  Table: The statistics of the datasets used in our experiments.
  - Ave_w: average number of words per argument; Var_w: variance of the number of words per argument.
  - Ave_p: average number of arguments per post; Var_p: variance of the number of arguments per post.
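The four statistics in the table can be computed as below. The toy corpus is illustrative only, not the CMV data; population (biased) variance is assumed, since the paper does not state which estimator it uses.

```python
# Toy corpus: each post is a list of arguments, each argument a list of words.
posts = [
    [["philosophy", "has", "no", "use"], ["change", "my", "view"]],
    [["math", "is", "useful"]],
]

words_per_arg = [len(arg) for post in posts for arg in post]
args_per_post = [len(post) for post in posts]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # Population variance (denominator len(xs)); an assumption here.
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

ave_w, var_w = mean(words_per_arg), variance(words_per_arg)  # per argument
ave_p, var_p = mean(args_per_post), variance(args_per_post)  # per post
```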

  24. Experiments
  Models for comparison:
  - Tan et al. (2016) designed interplay features, argument-related features and text style features to predict whether a reply is persuasive.
  - WB employs a BiGRU to encode posts at the word level.
  - CB employs CNN+BiGRU to encode posts at the argument level.
  - WOF uses word-overlap features to evaluate argumentation quality.
  - CBCA adds the co-attention network to the model CB.
  - CBWOF adds the word-overlap features to the model CB.
  - CBWOF_I adds the post argument to reply argument attention of the co-attention network.
  - CBWOF_II adds the reply argument to post argument attention of the co-attention network.
  - CBWOF_III adds the post to reply argument attention of the co-attention network.
  - CBCAWOF is our proposed full model.
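As a rough illustration of what word-overlap features between a post and a reply might look like: the helper below is hypothetical, and the exact feature set used in the compared systems is not specified in these slides.

```python
def word_overlap_features(post_words, reply_words):
    """Hypothetical word-overlap features: shared-word count, the
    fraction of reply vocabulary shared with the post, and Jaccard
    similarity of the two vocabularies."""
    p, r = set(post_words), set(reply_words)
    inter, union = p & r, p | r
    return {
        "overlap_count": len(inter),
        "overlap_frac_reply": len(inter) / max(len(r), 1),
        "jaccard": len(inter) / max(len(union), 1),
    }
```

Features like these capture lexical interplay between the two sides cheaply, which is one reason the WOF baseline is competitive with word-level neural encoders in the results table.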

  25. Experiments

    Model                                                  Pair accuracy
    Tan et al. (2016)                                          65.70
    Word-level BiGRU (WB)                                      61.22
    CNN+BiGRU (CB)                                             63.34
    Word Overlap Features (WOF)                                63.59
    CNN+BiGRU+Co-Att (CBCA)                                    66.96
    CNN+BiGRU+Word Overlap Features (CBWOF)                    68.08
    CNN+BiGRU+Att_III+Word Overlap Features (CBWOF_III)        69.95
    CNN+BiGRU+Att_I+Word Overlap Features (CBWOF_I)            70.07
    CNN+BiGRU+Att_II+Word Overlap Features (CBWOF_II)          70.20
    CNN+BiGRU+Co-Att+Word Overlap Features (CBCAWOF)           70.45*

  Table: The performance of different approaches on our datasets. Bold: best performance; underline: performance of the state-of-the-art method; *: significantly better than Tan et al. (2016) (p < 0.01).
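Pair accuracy here evaluates each test instance as a (positive reply, negative reply) pair for the same original post: a pair counts as correct when the model scores the persuasive reply higher. A minimal sketch of that metric, assuming the model produces a real-valued score per reply:

```python
def pair_accuracy(score_pairs):
    """score_pairs: list of (positive_score, negative_score) tuples,
    one per (OP, positive reply, negative reply) test instance.
    Returns the percentage of pairs ranked correctly."""
    correct = sum(1 for pos, neg in score_pairs if pos > neg)
    return 100.0 * correct / len(score_pairs)
```

Note that random scoring yields about 50 on this metric, so the baselines in the table start well above chance.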
