Article in Journal of the HCI Society of Korea · May 2020 · DOI: 10.17210/jhsk.2020.06.15.2.39
Visual presentation of mental healthcare chatbots for user experience

Seung Jin Chung*, Hyunju Lee**

Abstract

Today the digital healthcare market and interest in mental health chatbots are growing. Chatbot interactions are easily accessible, and chatbots can be used to support people's emotional stability and psychological well-being. Consequently, many mental healthcare chatbots have been designed to reduce the likelihood of mental health issues arising by improving self-metacognition through early intervention. However, only a few studies have discussed the specific factors through which mental healthcare chatbot design may affect user experience. Moreover, although visual presentation throughout a chatbot system shapes users' positive and negative experiences, most chatbot studies to date have focused on identity design rather than on graphical interfaces and non-verbal visual communication tools. While it is important to examine specific visual design elements, overall visual design requirements also deserve attention. Therefore, this study explored the user experience of mental health chatbots in terms of identity design, chatbot interface design, and visual communication tools. Participants' data were collected through pre/post preference evaluations, the System Usability Scale questionnaire, and semi-structured interviews on selected chatbot systems (i.e. Replika, Youper, Sayana, and Woebot). The collected data were qualitatively analysed, and considerations for designing mental healthcare chatbots are suggested as a result.

Keywords: Chatbot, Mental Healthcare, AI Bot, Chatbot Design, Emotion-Aware Conversation

* First author: PhD student, Dept. of Human Environment & Design, Yonsei University
** Corresponding author: Professor, Dept. of Human Environment & Design, Yonsei University; e-mail: hyunju@yonsei.ac.kr
■ Received: 10 February 2020 / Reviewed: 16 March 2020 / Accepted: 15 May 2020
1. Introduction

Mental health concerns are not issues that affect only particular groups of people; anyone living in a fast-paced society can suffer from difficulties such as burnout, chronic fatigue syndrome, depression, anxiety, and occasional sleep problems. Although many of these difficulties need to be addressed before they cause significant disruption in people's lives, social prejudice may deter sufferers from seeking treatment. It is therefore vital to reduce social discrimination toward people with mental health problems and to encourage individuals to cope with the normal stresses of life by monitoring their own mental well-being[1]. People sometimes feel uncomfortable acknowledging their mental health concerns, and mental health applications can ease this discomfort while also supporting mental health. Self-assessments suggested by chatbots can be effective in avoiding self-stigma concerns[2,3].

In addition, mental healthcare chatbots have been developed on the basis of basic psychological counselling principles. Psychological counselling generally begins by addressing cognition in the form of individual thoughts, emotions, and behaviours. For example, cognitive behavioural therapy is a representative treatment that aims to stop negative thinking or behaviour patterns by understanding the patient's thoughts and feelings. Similarly, mental healthcare chatbots have been developed to build trustworthy relationships with users.

Communication through appropriate visual presentation can help users recognise their emotions. Online environments are fundamentally visual, so visual design can be adjusted to increase user attention and improve users' perception and decision-making abilities[4]. Despite the importance of visual design, however, its scope in chatbot research has been limited to chatbot identity design. This study therefore aims to examine mental healthcare chatbot users' experiences through overall visual presentation. For a comprehensive discussion of this research objective, participants' data were collected and qualitatively analysed. Data were gathered through pre/post preference assessments, the System Usability Scale (SUS) questionnaire, and 1:1 interviews, and the analysis then examined what visual presentations affected user experience and how.

2. Literature Reviews

2.1. Mental Healthcare Chatbots

Conversational agents respond to users in natural language, and mental healthcare systems require empathic conversation. Mental healthcare chatbots are categorised by how users interact with them: text-based, voice-based, and visual-language-based. Text-based chatbots are most commonly found on mobile devices, but combining two or more interaction types is also frequent[5]. Text-based chatbots need to lead meaningful conversations according to context, and in the case of voice-based chatbots, varied tones of voice are needed[6]. For empathic conversation, researchers have explored ways in which chatbots can mimic human emotions, adjust their speech patterns to express shared understanding, offer new perspectives, and account for the situations and feelings communicated by users[7].

Above all, because message interaction is the main feature of chatbot systems, researchers consider chat conversations to have the greatest influence on user experience[8,9]. Go & Sundar[8] found that message interactivity can compensate for low anthropomorphic visualisation and weak identity cues. Furthermore, for mental healthcare chatbots, building a trustworthy relationship is significant for successful message interaction between a user and a chatbot[10]. Accordingly, strategies are linked to chatbots' conversation styles, and an attachment bond between mental healthcare chatbots and users can be secured through relational cues such as small talk, self-disclosure, empathy, humour, meta-relational talk, and continuity[11].

2.2. Chatbot User Experience Evaluations

Evaluating the effectiveness of mental healthcare chatbots requires a holistic approach. User experience (UX) encompasses cognitive (pragmatic) and affective/hedonic factors, including objective and subjective appraisals before, during, and after digital interactions[12]. Following the hierarchy of user needs, usability is a fundamental factor in understanding UX, as a higher-order need built on the more basic need for functionality[13]. Usability assessments of chatbots, however, involve distinctive factors such as conversational intelligence, chatbot personality, and chat interface[14]. In this regard, Holmes et al.[2] developed the Chatbot Usability Questionnaire (CUQ) to assess usability, on the grounds that Shneiderman's eight golden rules and Nielsen's ten usability heuristics may not be easily adapted to studying chatbots.

To understand UX, most researchers have collected related quantitative and qualitative data using several evaluation tools[15-18]. A longitudinal study by Winckler et al.[18] identified the following six UX dimensions based on the HCI literature: visual and aesthetic experience, emotion, stimulation, identification, meaning and value, and social relatedness and co-experience. They used several methods to assess UX, including thinking aloud, the Self-Assessment Manikin (SAM) questionnaire, the AttrakDiff questionnaire, the SUS questionnaire, and semi-structured interviews.
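Both this study and the work reviewed above rely on the System Usability Scale (SUS) questionnaire. As background, the standard SUS scoring procedure (ten items on a 1-5 Likert scale, yielding a 0-100 score) can be sketched as follows; this is generic reference code illustrating the conventional formula, not the authors' analysis pipeline, and the function name is illustrative:

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score.

    `responses` is a list of ten answers on a 1-5 Likert scale, in
    questionnaire order (items 1-10). Odd-numbered items are positively
    worded, even-numbered items negatively worded. Returns 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        # Odd items contribute (response - 1); even items (5 - response),
        # so each item contributes 0-4 regardless of wording direction.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    # Sum of contributions (0-40) is scaled to a 0-100 range.
    return total * 2.5
```

For example, a uniformly neutral response (all 3s) scores 50.0, while strong agreement with every positive item and strong disagreement with every negative item scores 100.0.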