Learning to Ask Questions in Open-domain Conversational Systems with Typed Decoders

Yansen Wang¹∗, Chenyi Liu¹∗, Minlie Huang¹†, Liqiang Nie²
¹Conversational AI Group, AI Lab, Department of Computer Science, Tsinghua University
¹Beijing National Research Center for Information Science and Technology, China
²Shandong University
ys-wang15@mails.tsinghua.edu.cn; liucy15@mails.tsinghua.edu.cn; aihuang@tsinghua.edu.cn; nieliqiang@gmail.com
∗Authors contributed equally to this work. †Corresponding author: Minlie Huang.

Abstract

Asking good questions in large-scale, open-domain conversational systems is quite significant yet largely untouched. This task, substantially different from traditional question generation, requires questioning not only with various patterns but also on diverse and relevant topics. We observe that a good question is a natural composition of interrogatives, topic words, and ordinary words. Interrogatives lexicalize the pattern of questioning, topic words address the key information for topic transition in dialogue, and ordinary words play syntactic and grammatical roles in making a natural sentence. We devise two typed decoders (soft typed decoder and hard typed decoder) in which a type distribution over the three types is estimated and used to modulate the final generation distribution. Extensive experiments show that the typed decoders outperform state-of-the-art baselines and can generate more meaningful questions.

1 Introduction

Learning to ask questions (or, question generation) aims to generate a question for a given input. Deciding what to ask and how is an indicator of machine understanding (Mostafazadeh et al., 2016), as demonstrated in machine comprehension (Du et al., 2017; Zhou et al., 2017b; Yuan et al., 2017) and question answering (Tang et al., 2017; Wang et al., 2017). Raising good questions is essential to conversational systems because a good system can interact well with users by asking and responding (Li et al., 2016). Furthermore, asking questions is one of the important proactive behaviors that can drive dialogues to go deeper and further (Yu et al., 2016).

Question generation (QG) in open-domain conversational systems differs substantially from traditional QG tasks. The ultimate goal of this task is to enhance the interactiveness and persistence of human-machine interactions, whereas for traditional QG tasks, seeking information through a generated question is the major purpose. The response to a generated question will be supplied in the following conversation turns; it may be novel and need not occur in the input, unlike in traditional QG (Du et al., 2017; Yuan et al., 2017; Tang et al., 2017; Wang et al., 2017; Mostafazadeh et al., 2016). Thus, the purpose of this task is to spark novel yet related information that drives the interaction forward.

Due to these different purposes, this task is unique in two aspects: it requires questioning not only in various patterns but also about diverse yet relevant topics. First, there are various questioning patterns for the same input, such as yes-no questions and wh-questions with different interrogatives. Diversified questioning patterns make dialogue interactions richer and more flexible. By contrast, traditional QG tasks can be roughly addressed by syntactic transformation (Andrenucci and Sneiders, 2005; Popowich and Winne, 2013) or implicitly modeled by neural models (Du et al., 2017). In such tasks, the information to be questioned is pre-specified and usually determines the pattern of questioning: for instance, asking a who-question for a given person, or a where-question for a given location.

Second, this task requires addressing many more transitional topics of a given input, which is in the nature of conversational systems. For instance, for the input "I went to dinner with my friends", we may question about topics such as friend, cuisine, price, place, and taste. Thus, this task generally requires scene understanding to imagine and comprehend a scenario (e.g., dining at a restaurant) that can be interpreted by topics related to the input. In traditional QG tasks, however, the core information to be questioned is pre-specified and rather static, and what is mostly required is paraphrasing.

[Figure 1: Good questions in conversational systems are a natural composition of interrogatives, topic words, and ordinary words.]

Undoubtedly, asking good questions in conversational systems needs to address the above issues (questioning with diversified patterns, and addressing transitional topics naturally in a generated question). As shown in Figure 1, a good question is a natural composition of interrogatives, topic words, and ordinary words. Interrogatives indicate the pattern of questioning, topic words address the key information of topic transition, and ordinary words play syntactic and grammatical roles in making a natural sentence.

We thus automatically classify the words in a question into three types: interrogative, topic word, and ordinary word. We then devise two decoders, Soft Typed Decoder (STD) and Hard Typed Decoder (HTD), for question generation in conversational systems¹. STD deals with word types in a latent and implicit manner, while HTD does so more explicitly. At each decoding position, we first estimate a type distribution over word types. STD applies a mixture of type-specific generation distributions in which the type probabilities are the mixture coefficients. By contrast, HTD reshapes the type distribution with Gumbel-softmax and modulates the generation distribution by the type probabilities.

¹To simplify the task, as preliminary research, we consider the one-round conversational setting.
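To make the two decoders concrete, the following is a minimal sketch of one decoding step, not the authors' implementation. It assumes a PyTorch encoder-decoder whose vocabulary has been partitioned into the three word types; the class name `TypedDecoderStep`, the way the type mask is built, and the hyperparameters are all illustrative assumptions.

```python
# Illustrative sketch only: one decoding step of a soft/hard typed decoder.
# Assumes the vocabulary has been split into interrogatives, topic words,
# and ordinary words; this is not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_TYPES = 3  # 0: interrogative, 1: topic word, 2: ordinary word


class TypedDecoderStep(nn.Module):
    """Estimates a distribution over the three word types and uses it to
    modulate the word generation distribution at one decoding position."""

    def __init__(self, hidden_size: int, vocab_size: int,
                 soft: bool = True, tau: float = 0.67):
        super().__init__()
        self.soft = soft      # True: soft typed decoder (STD); False: hard (HTD)
        self.tau = tau        # Gumbel-softmax temperature used by HTD
        self.vocab_size = vocab_size
        self.type_proj = nn.Linear(hidden_size, NUM_TYPES)
        if soft:
            # STD: one type-specific generation distribution per word type.
            self.word_proj = nn.Linear(hidden_size, NUM_TYPES * vocab_size)
        else:
            # HTD: a single generation distribution, reshaped by the
            # (near one-hot) type probabilities.
            self.word_proj = nn.Linear(hidden_size, vocab_size)

    def forward(self, h: torch.Tensor, type_mask: torch.Tensor) -> torch.Tensor:
        """h: decoder state, shape (batch, hidden_size).
        type_mask: (NUM_TYPES, vocab_size) 0/1 float matrix saying which words
        belong to which type (e.g., a hand-built interrogative list, predicted
        topic words, and everything else as ordinary words).
        Returns a (batch, vocab_size) generation distribution."""
        type_logits = self.type_proj(h)                        # (batch, NUM_TYPES)

        if self.soft:
            # STD: mixture of type-specific distributions, with the type
            # probabilities as the mixture coefficients.
            type_probs = F.softmax(type_logits, dim=-1)
            word_logits = self.word_proj(h).view(-1, NUM_TYPES, self.vocab_size)
            word_probs = F.softmax(word_logits, dim=-1)        # per-type distributions
            return (type_probs.unsqueeze(-1) * word_probs).sum(dim=1)

        # HTD: sharpen the type distribution with Gumbel-softmax so it is close
        # to one-hot, then scale each word by the probability of its own type
        # and renormalize.
        type_probs = F.gumbel_softmax(type_logits, tau=self.tau, hard=False)
        word_probs = F.softmax(self.word_proj(h), dim=-1)      # (batch, vocab_size)
        per_word_type_prob = type_probs @ type_mask            # (batch, vocab_size)
        modulated = word_probs * per_word_type_prob
        return modulated / modulated.sum(dim=-1, keepdim=True)
```

In this sketch, STD keeps the types latent by softly mixing all three per-type distributions, whereas HTD pushes the type distribution toward a one-hot choice and explicitly reweights each word by the probability of the type it belongs to, mirroring the implicit/explicit contrast described above.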
Our contributions are as follows:

• To the best of our knowledge, this is the first study on question generation in the setting of conversational systems. We analyze the key differences between this new task and other traditional question generation tasks.

• We devise soft and hard typed decoders to ask good questions by capturing the different roles of different word types. Such typed decoders may be applicable to other generation tasks if word semantic types can be identified.

2 Related Work

Traditional question generation can be seen in task-oriented dialogue systems (Curto et al., 2012), sentence transformation (Vanderwende, 2008), machine comprehension (Du et al., 2017; Zhou et al., 2017b; Yuan et al., 2017; Subramanian et al., 2017), question answering (Qin, 2015; Tang et al., 2017; Wang et al., 2017; Song et al., 2017), and visual question answering (Mostafazadeh et al., 2016). In such tasks, the answer is known and is part of the input to the generated question. Moreover, these generation tasks are not required to predict additional topics, since all the information has been provided in the input. They are applicable in scenarios such as designing questions for reading comprehension (Du et al., 2017; Zhou et al., 2017a; Yuan et al., 2017) and justifying visual understanding by generating questions about a given image or video (Mostafazadeh et al., 2016).

In general, traditional QG tasks can be addressed by heuristic rule-based reordering methods (Andrenucci and Sneiders, 2005; Ali et al., 2010; Heilman and Smith, 2010), slot-filling with question templates (Popowich and Winne, 2013; Chali and Golestanirad, 2016; Labutov et al., 2015), or implicitly modeled by recent neural models (Du et al., 2017; Zhou et al., 2017b; Yuan et al., 2017; Song et al., 2017; Subramanian et al., 2017). These tasks generally do not require generating a question with various patterns: for a given answer and a supporting text, the question type is usually decided by the input.

Question generation in large-scale, open-domain dialogue systems is relatively unexplored. Li et al. (2016) showed that asking questions in task-oriented dialogues can offer useful feedback to facilitate learning through interactions. Several questioning mechanisms were devised with hand-crafted templates, but unfortunately they are not applicable to open-domain conversational systems. Similar to our goal, a visual QG task has been proposed to generate a question to interact with other people, given
