Towards Personalized Review Summarization via User-Aware Sequence Network


  1. Towards Personalized Review Summarization via User-Aware Sequence Network ADVISOR: JIA-LING, KOH SOURCE: AAAI-19 SPEAKER: WEI, LAI DATE: 2020/5/25

  2. OUTLINE: 01 INTRODUCTION, 02 METHOD, 03 EXPERIMENTS, 04 CONCLUSION

  4. INTRODUCTION: User-aware Sequence Network (USN). Review summarization aims to generate a condensed summary for a single review or for multiple reviews. Different users may: ● care about different contents ● have their own writing styles

  5. METHOD: User-aware Sequence Network (USN)

  6. Problem Formulation: each instance is a (user, review, summary) triple (u, x, y), where the review x = {x_1, x_2, …, x_n} and the summary y = {y_1, y_2, …, y_m}. Source vocabulary V_s and target vocabulary V_t each contain 30,000 words.
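
A toy illustration of one (u, x, y) training triple in Python; the field names and example text are invented for illustration and are not taken from the dataset.

```python
# One training instance: user u, review x = {x_1, ..., x_n}, summary y = {y_1, ..., y_m}.
example = {
    "user": "u_42",                                    # hypothetical user id
    "review": "the staff were friendly and the bed was very comfortable",
    "summary": "friendly staff , comfortable bed",
}
review_tokens = example["review"].split()              # mapped to the source vocabulary V_s
summary_tokens = example["summary"].split()            # mapped to the target vocabulary V_t
print(len(review_tokens), len(summary_tokens))         # n and m for this example
```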

  7. Get To The Point: Summarization with Pointer-Generator Networks (sequence-to-sequence attentional model)

  9. User-aware Encoder: single-layer encoder, review word embedding (dim=128), encoder hidden state (dim=256)

  10. User-aware Encoder, User Selection Strategy: gate_i = σ(W_g [h_i ; u] + b_g); h'_i = h_i ⊙ gate_i, where h_i is the encoder hidden state (dim=256) and u is the user embedding (dim=128).
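
A minimal NumPy sketch of the user selection strategy above, assuming the gate concatenates each encoder state with the user embedding; the weight names (W_g, b_g) and random inputs are illustrative only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

hidden_dim, user_dim, seq_len = 256, 128, 6
rng = np.random.default_rng(0)
H = rng.normal(size=(seq_len, hidden_dim))          # encoder hidden states h_i
u = rng.normal(size=user_dim)                       # user embedding

# User selection strategy: gate_i = sigmoid(W_g [h_i ; u] + b_g), h'_i = h_i * gate_i
W_g = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim + user_dim))
b_g = np.zeros(hidden_dim)
inputs = np.concatenate([H, np.tile(u, (seq_len, 1))], axis=1)   # [h_i ; u] per position
gates = sigmoid(inputs @ W_g.T + b_g)
H_gated = H * gates                                 # h'_i, shape (seq_len, hidden_dim)
print(H_gated.shape)
```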

  11. Context vector

  12. Context Vector: e_{t,i} = v_a⊤ tanh(W_h h'_i + W_s s_t + b_a); α_{t,i} = exp(e_{t,i}) / Σ_j exp(e_{t,j}); c'_t = Σ_i α_{t,i} h'_i, where s_t is the decoder hidden state (dim=256).
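
A NumPy sketch of the additive attention that yields the context vector; the weight names and the random gated encoder states are assumptions for illustration.

```python
import numpy as np

hidden_dim, seq_len = 256, 6
rng = np.random.default_rng(1)
H_gated = rng.normal(size=(seq_len, hidden_dim))   # gated encoder states h'_i
s_t = rng.normal(size=hidden_dim)                  # decoder hidden state (dim=256)

W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_s = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_a = np.zeros(hidden_dim)
v_a = rng.normal(scale=0.1, size=hidden_dim)

# e_{t,i} = v_a^T tanh(W_h h'_i + W_s s_t + b_a)
scores = np.tanh(H_gated @ W_h.T + s_t @ W_s.T + b_a) @ v_a
alpha = np.exp(scores) / np.exp(scores).sum()      # attention weights alpha_{t,i}
c_t = alpha @ H_gated                              # context vector c'_t
print(alpha.round(3), c_t.shape)
```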

  13. Vocabulary state

  14. Vocabulary State: g_{t,k} = v_m⊤ tanh(W_m U_k + W_d s_t + b_m); β_{t,k} = exp(g_{t,k}) / Σ_k exp(g_{t,k}); m_t = Σ_k β_{t,k} U_k, where U is the user-specific vocabulary memory (embeddings of the K most user-specific words) and s_t is the decoder hidden state.
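
The same attention pattern applied to the user-specific vocabulary memory, sketched in NumPy; K, the weight names, and the random memory entries are assumed for illustration.

```python
import numpy as np

hidden_dim, emb_dim, K = 256, 128, 10              # K most user-specific words (K assumed)
rng = np.random.default_rng(2)
U = rng.normal(size=(K, emb_dim))                  # user-specific vocabulary memory U_k
s_t = rng.normal(size=hidden_dim)                  # decoder hidden state

W_m = rng.normal(scale=0.1, size=(hidden_dim, emb_dim))
W_d = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_m = np.zeros(hidden_dim)
v_m = rng.normal(scale=0.1, size=hidden_dim)

# g_{t,k} = v_m^T tanh(W_m U_k + W_d s_t + b_m)
scores = np.tanh(U @ W_m.T + s_t @ W_d.T + b_m) @ v_m
beta = np.exp(scores) / np.exp(scores).sum()       # memory attention weights beta_{t,k}
m_t = beta @ U                                     # memory vector m_t (dim=128)
print(beta.round(3), m_t.shape)
```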

  15. Vocabulary distribution

  16. Vocabulary Distribution: readout state r_t = W_r [c'_t ; s_t ; u ; m_t] + b_r (User Prediction Strategy: user embedding u; User Memory Prediction Strategy: memory vector m_t); P'_vocab = softmax(W_o r_t + b_o) over the target vocabulary V_t.
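
A NumPy sketch of the readout state and output softmax, assuming the four inputs (context vector, decoder state, user embedding, memory vector) are simply concatenated; names such as W_r and W_o are illustrative.

```python
import numpy as np

hidden_dim, emb_dim, user_dim, vocab_size = 256, 128, 128, 30000
rng = np.random.default_rng(3)
c_t = rng.normal(size=hidden_dim)     # context vector c'_t
s_t = rng.normal(size=hidden_dim)     # decoder hidden state
u   = rng.normal(size=user_dim)       # user embedding (user prediction strategy)
m_t = rng.normal(size=emb_dim)        # memory vector (user memory prediction strategy)

readout_in = np.concatenate([c_t, s_t, u, m_t])            # [c'_t ; s_t ; u ; m_t]
W_r = rng.normal(scale=0.02, size=(hidden_dim, readout_in.size))
b_r = np.zeros(hidden_dim)
r_t = W_r @ readout_in + b_r                               # readout state r_t

W_o = rng.normal(scale=0.02, size=(vocab_size, hidden_dim))
b_o = np.zeros(vocab_size)
logits = W_o @ r_t + b_o
P_vocab = np.exp(logits - logits.max())
P_vocab /= P_vocab.sum()                                   # P'_vocab over the target vocabulary
print(P_vocab.shape, round(float(P_vocab.sum()), 6))
```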

  17. Final distribution

  18. Get To The Point: Summarization with Pointer-Generator Networks (pointer-generator model)

  19. Vocabulary Distribution, User Memory Generation Strategy: p_mem = σ(W_p [c'_t ; s_t ; m_t] + b_p); P(w) = (1 − p_mem) · P'_vocab(w) + p_mem · Σ_{k: U_k = w} β_{t,k}.
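
A NumPy sketch of the two-way mixture above: a scalar gate p_mem mixes the vocabulary distribution with the memory attention weights scattered onto their vocabulary ids; all names, shapes, and the omitted bias are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

vocab_size, K = 30000, 10
rng = np.random.default_rng(4)
P_vocab = rng.random(vocab_size)
P_vocab /= P_vocab.sum()                              # vocabulary distribution P'_vocab
beta = rng.random(K)
beta /= beta.sum()                                    # memory attention weights beta_{t,k}
memory_ids = rng.integers(0, vocab_size, size=K)      # target-vocabulary ids of the K memory words

c_t, s_t, m_t = rng.normal(size=256), rng.normal(size=256), rng.normal(size=128)
w_p = rng.normal(scale=0.05, size=256 + 256 + 128)
p_mem = sigmoid(w_p @ np.concatenate([c_t, s_t, m_t]))    # gate (bias omitted for brevity)

# P(w) = (1 - p_mem) * P'_vocab(w) + p_mem * sum_{k: id_k = w} beta_k
P_final = (1.0 - p_mem) * P_vocab
np.add.at(P_final, memory_ids, p_mem * beta)          # scatter-add the memory probabilities
print(round(float(p_mem), 3), round(float(P_final.sum()), 6))   # final distribution sums to 1
```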

  20. Loss: L = −(1/m) Σ_{t=1}^{m} log P(y_t)
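
A small NumPy sketch of the per-summary negative log-likelihood, assuming one final distribution per decoding step; purely illustrative.

```python
import numpy as np

# Negative log-likelihood of the reference summary tokens under the model's
# final distribution, averaged over the summary length m.
def nll_loss(step_distributions, target_ids):
    probs = np.array([dist[y] for dist, y in zip(step_distributions, target_ids)])
    return -np.mean(np.log(probs + 1e-12))

rng = np.random.default_rng(5)
vocab_size, m = 30000, 4
dists = [d / d.sum() for d in rng.random((m, vocab_size))]   # one distribution per step
targets = rng.integers(0, vocab_size, size=m)                # reference tokens y_1..y_m
print(nll_loss(dists, targets))
```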

  21. EXPERIMENTS, Data: Training: 536,255 − (5,000 + 5,000) = 526,255; Validation: 5,000; Test: 5,000

  22. Comparison Methods: ● Lead-1: extractive baseline that takes the first sentence ● LexRank: extractive approach based on the PageRank algorithm ● S2S+Att: abstractive sequence-to-sequence model with attention ● SEASS: selective network + S2S+Att for sentence summarization ● PGN: copy mechanism + S2S+Att for document summarization
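
A minimal sketch of the Lead-1 baseline; the regex-based sentence splitter is an assumed heuristic, not the evaluation code used in the paper.

```python
import re

def lead_1(review: str) -> str:
    """Lead-1 baseline: return the first sentence of the review as the summary."""
    sentences = re.split(r"(?<=[.!?])\s+", review.strip())
    return sentences[0] if sentences else ""

print(lead_1("The bed was very comfortable. Staff were friendly. Breakfast was average."))
# -> "The bed was very comfortable."
```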

  23. EXPERIMENTS: Results

  24. EXPERIMENTS: Aspect-level Coverage

  25. EXPERIMENTS: Aspect-level Coverage

  26. EXPERIMENTS: Different User-based Strategies: 1. user selection, 2. user prediction, 3. user memory prediction, 4. user memory generation

  27. EXPERIMENTS: User-specific vocabulary size

  28. EXPERIMENTS: User-based Selective Gate Visualization ● Output summary: excellent service, comfy bed ● True summary: excellent service, very comfortable bed

  29. EXPERIMENTS: Case study

  30. Conclusion: ● Propose a User-aware Sequence Network (USN) that incorporates user information into personalized review summarization. ● Propose four user-based strategies. ● Construct a new dataset (Trip). ● Future work: extend the model to the multi-review scenario.
