Affect-sensitive Dialogue Response Generation for Positive Emotion Elicitation
Nurul Fithria Lubis
Augmented Human Communication (AHC) Lab, Nara Institute of Science and Technology (NAIST)
Affective dialogue systems
• Growing number of dialogue system works and applications in tasks involving affect
• High potential of dialogue systems to address the emotional needs of users
  • Companion for the elderly [Miehle et al., 2017]
  • Distress clues assessment [DeVault et al., 2014]
  • Affect-sensitive tutoring [Forbes-Riley and Litman, 2012]
15 March 2019 Nurul Lubis
Emotion elicitation
• Traditional emotion works focus on expression and recognition; emotion elicitation instead aims to elicit a change of emotion in dialogue
• Using machine translation with a target emotion (Hasegawa et al., 2013)
• Using the system's affective personalities (Skowron et al., 2013)
• These have not yet considered the emotional benefit for the user
Research goal: Positive emotion elicitation
• We aim to draw on an overlooked potential of emotion elicitation to improve user emotional states
• A chat-based dialogue system with an implicit goal of positive emotion elicitation
[Figure: circumplex model of affect [Russell, 1980]]
Different responses elicit different emotions
  User:   I failed the test.
  System: Oh, again?
  User:   Yeah…  → negative emotional impact
  System: You will do better next time!
  User:   Thank you.  → positive emotional impact
[Figure: valence–arousal plots contrasting the negative and positive emotional impact of the two responses]
Positive emotion elicitation does NOT mean always responding with positive emotion
  A: How was your day?
  B: Terrible. Work did not go well.
  A: I'm glad to hear that!  (a "happy response" can lead to negative impact)

  A: How was your day?
  B: Terrible. Work did not go well.
  A: That's too bad.  (expressing negative emotion can lead to positive impact)

• There are situations where "happy responses" lead to negative impact, while expressing negative emotion can lead to positive impact
• The system should learn the proper strategy
Neural chat-based dialogue systems
• End-to-end modeling of chat dialogue
• RNN encoder-decoder [Vinyals et al., 2015]
• Hierarchical recurrent encoder-decoder (HRED) [Serban et al., 2016]
• Generating dialogue responses with emotional expression [Zhou et al., 2018]
• Not yet applied to emotion elicitation
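As a rough illustration of the HRED idea above (a sketch, not the authors' implementation), the hierarchy can be built from two plain RNNs: one folds each utterance's word vectors into an utterance vector, the other folds the utterance vectors into a dialogue context that would condition the response decoder. All sizes and weights here are arbitrary toy values.

```python
import numpy as np

def rnn_encode(vectors, W_h, W_x):
    """Plain tanh RNN: fold a sequence of vectors into a single state."""
    h = np.zeros(W_h.shape[0])
    for x in vectors:
        h = np.tanh(W_h @ h + W_x @ x)
    return h

rng = np.random.default_rng(0)
emb, hid = 8, 16  # toy embedding and hidden sizes
W_h_utt = rng.normal(size=(hid, hid)) * 0.1
W_x_utt = rng.normal(size=(hid, emb)) * 0.1
W_h_ctx = rng.normal(size=(hid, hid)) * 0.1
W_x_ctx = rng.normal(size=(hid, hid)) * 0.1

# A dialogue = list of utterances; each utterance = list of word embeddings.
dialogue = [[rng.normal(size=emb) for _ in range(5)] for _ in range(3)]

# Level 1: encode each utterance into a fixed-size vector.
utt_states = [rnn_encode(u, W_h_utt, W_x_utt) for u in dialogue]
# Level 2: encode the sequence of utterance vectors into a dialogue context;
# a decoder conditioned on this context would generate the next response.
context = rnn_encode(utt_states, W_h_ctx, W_x_ctx)
print(context.shape)
```

Emo-HRED, described next, extends this by also encoding the emotional context alongside the dialogue context.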
Proposed approach
Emotion-sensitive response generation: Emo-HRED
• Encodes the emotional context and considers it when generating a response
• Trained on responses that elicit positive emotion
Training Emo-HRED
Optimization
• Train on combined losses:
  • Negative log-likelihood (NLL) of the target response
  • Emotion prediction error: the emotion encoder targets the emotion label of the dialogue turn
• The final cost is used to optimize the entire network
• Adam optimizer
Pre-training and selective fine-tuning
• Emotion-annotated data is limited
• Start by pre-training HRED with large-scale conversational data, learning semantic and syntactic knowledge
• Selectively fine-tune Emo-HRED with the emotion-annotated data: only train the parts affected by the emotion context, to avoid over-fitting or destabilizing the network
Datasets
Existing data
• SubTle Database: large-scale conversational corpus from movie subtitles (5.5M triples); used for pre-training
• SEMAINE Database: small amount of conversation between a user and a listening agent in WoZ fashion (2K triples)
Positive-emotion-eliciting data
• SEMAINE-positive: augments the existing SEMAINE corpus with positive-emotion-eliciting responses, selected by human judgement of default system responses; used for fine-tuning
Evaluation
Objective evaluation: Perplexity
Pre-training: SubTle · Fine-tuning: SEMAINE-positive · Testing: SEMAINE-positive

  Model               Parameter update   Perplexity on SEMAINE-positive test set
  Baseline HRED       standard           121.44
  Baseline HRED       selective          100.94
  Proposed Emo-HRED   selective           42.26

Emotion information can be leveraged in response generation to reduce perplexity
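Perplexity, as reported in the table above, is the exponential of the average per-token negative log-likelihood the model assigns to the test set; lower means the model is less "surprised" by the reference responses. A quick sketch with made-up token probabilities:

```python
import math

# Model probability of each reference token (invented values).
token_probs = [0.2, 0.05, 0.1, 0.02]
avg_nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
perplexity = math.exp(avg_nll)
print(round(perplexity, 2))
```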
Subjective evaluation
• Evaluation via crowdsourcing: 100 test queries, 20 judgments each
• Likert scale 1 to 5 (higher is better): naturalness and positive emotional impact
• Mean scores: naturalness 3.26 (HRED) vs. 3.27 (Emo-HRED); positive emotional impact 3.22 (HRED) vs. 3.39 (Emo-HRED)
• The proposed model is perceived as more natural and elicits a significantly more positive emotion (p < 0.05)
Conclusion
Conclusion
• We proposed a dialogue response generator that elicits positive emotion
  • Considers the emotional context of the dialogue
  • Trained on a constructed corpus containing responses with positive emotional impact
• Subjective and objective evaluations show improvement over a system that ignores emotion information
  • More natural
  • Elicits a more positive emotional impact
• Future work
  • Collect and utilize more emotion-rich dialogue data
  • Richer dialogue context: other modalities, longer context
Thank you
Automatically retrieving responses with positive impact using an example-based dialogue system (EBDM) approach (Lubis et al., 2017), in Proc. IWSDS 2017
• Semantic similarity: cosine similarity between the query text and the example query
• Emotion correlation: Pearson's correlation of valence and arousal between the query and the example query
• Emotional impact: valence change in the example triple
• Pipeline: semantic similarity scoring → 10-best → emotion correlation scoring → 3-best → emotional impact scoring → best response
• Evaluation shows that the proposed EBDM is perceived as more natural and elicits a more positive impact
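The cascaded scoring above can be sketched as follows (a toy illustration, not the authors' code; the vectors, emotion traces, and database entries are invented, and the responses are borrowed from the earlier example slide):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two text vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def pearson(a, b):
    """Pearson's correlation between two valence/arousal traces."""
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical example database: (query text vector, query emotion trace,
# valence change observed in the example triple, stored response).
examples = [
    ([1.0, 0.0, 1.0], [0.1, 0.3, 0.2], -0.2, "Oh, again?"),
    ([1.0, 0.1, 0.9], [0.2, 0.4, 0.3],  0.5, "You will do better next time!"),
]
query_vec, query_emo = [1.0, 0.0, 1.0], [0.15, 0.35, 0.25]

# Cascade: semantic similarity -> 10-best -> emotion correlation -> 3-best
# -> emotional impact -> best response.
ranked = sorted(examples, key=lambda e: cosine(query_vec, e[0]), reverse=True)[:10]
ranked = sorted(ranked, key=lambda e: pearson(query_emo, e[1]), reverse=True)[:3]
best = max(ranked, key=lambda e: e[2])  # largest positive valence change
print(best[3])
```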