Explaining smart systems to encourage (or discourage?) user interaction



  1. Explaining smart systems to encourage (or discourage?) user interaction. Dr Simone Stumpf, Centre for HCI Design. Simone.Stumpf.1@city.ac.uk | @DrSimoneStumpf

  2. Bio § End-user interactions with AI: explanations, user experience and trust § Projects § Researcher on the DARPA-funded "CALO" project, using a machine learning system to track and suggest appropriate task-based resources § Co-I on the NSF-funded "End-user debugging of machine-learned programs" project § PI for the "FREEDOM" project on smart heating system UIs § Co-I on the EPSRC "SCAMPI" project to develop smart home technology for self-managing quality-of-life plans for people with dementia and Parkinson's

  3. AI?

  4. The old AI is not the new AI § "Old" AI worked mainly on rule-based inferences that were "extracted" by a knowledge engineer, e.g. expert systems § "New" AI is typically based on machine learning using complex statistical inference, e.g. SVMs, deep learning § Usually the system learns a function or weights from (large) data sets so that it can provide appropriate output
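To make the contrast concrete, here is a minimal sketch (not from the talk; scikit-learn and the toy data are my assumptions) of "new" AI learning a function from labelled examples rather than following rules authored by a knowledge engineer:

```python
# Minimal sketch: the system learns weights from data, no hand-written rules.
from sklearn.svm import LinearSVC

# Toy labelled data set: inputs X, desired outputs y
X = [[0.0, 1.0], [1.0, 0.0], [0.9, 0.2], [0.1, 0.8]]
y = [0, 1, 1, 0]

model = LinearSVC()                  # an SVM, one of the "new" AI methods
model.fit(X, y)                      # the function is learned from the data
print(model.predict([[0.8, 0.3]]))   # -> [1]
```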

  5. HCI issues for AI § User experience, intelligibility, controllability § "Black boxes" don't communicate how they work § "Magic" to users, so they have very poor mental models of what is going on § What happens if the AI goes wrong?

  6. Explanatory debugging [Kulesza et al. 2015] § Debugging is trying to identify and correct mistakes in a system's program = controllability § An explanation-centric approach to help end users effectively and efficiently personalize machine learning systems = intelligibility § Note: explanations can be words or pictures § [Diagram: a feedback loop in which the system's explanations improve the user's mental model, and the user's feedback improves the system's behaviour]

  7. Principles of explanatory debugging § Explainability § Be iterative: small, consumable bites as users interact § Be sound: truthful to how the system works, to build trust § Be complete: include as much as possible of what the system uses § But don't overwhelm: trade off soundness and completeness against attention § Correctability § Be actionable: users can modify the system based on the explanations § Be reversible: users may make things worse and need a way to back out § Always honour feedback: don't disregard what the user tells the system § Incremental changes matter: system changes need to be built up iteratively

  8. Example system: EluciDebug

  9. Explainability and correctability
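The talk contains no code, but a hypothetical sketch can illustrate the kind of explainability and correctability EluciDebug provides. EluciDebug is built around a multinomial naive Bayes text classifier; the class below (all names and details are my own, not the paper's implementation) surfaces per-word evidence, honours user adjustments, and keeps them reversible:

```python
# Hypothetical EluciDebug-style correctable classifier: a tiny multinomial
# naive Bayes model whose word counts the user can inspect (explainability)
# and adjust or undo (correctability).
from collections import Counter, defaultdict
import math

class CorrectableNB:
    def __init__(self):
        self.counts = defaultdict(Counter)   # class -> word -> count
        self.docs = Counter()                # class -> number of documents
        self.history = []                    # (class, word, delta) for undo

    def train(self, words, label):
        self.counts[label].update(words)
        self.docs[label] += 1

    def adjust(self, label, word, delta):
        """Always honour feedback: shift a word's weight for a class."""
        self.counts[label][word] = max(0, self.counts[label][word] + delta)
        self.history.append((label, word, delta))

    def undo(self):
        """Be reversible: back out the most recent user adjustment."""
        label, word, delta = self.history.pop()
        self.counts[label][word] = max(0, self.counts[label][word] - delta)

    def explain(self, words, label):
        """Be sound and complete: report each word's evidence for a class."""
        total = sum(self.counts[label].values()) + 1
        return {w: (self.counts[label][w] + 1) / total for w in words}

    def predict(self, words):
        def score(label):
            total = sum(self.counts[label].values()) + 1
            prior = math.log(self.docs[label] / sum(self.docs.values()))
            return prior + sum(math.log((self.counts[label][w] + 1) / total)
                               for w in words)
        return max(self.docs, key=score)

nb = CorrectableNB()
nb.train("puck ice goal".split(), "hockey")
nb.train("bat pitch inning".split(), "baseball")
nb.adjust("hockey", "goal", +5)         # user: "goal" matters for hockey
print(nb.predict("goal ice".split()))   # -> hockey
nb.undo()                               # user changes their mind
```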

  10. Evaluation study § 77 participants split into two groups: 40 using EluciDebug, 37 using a version without explanations and advanced feedback § 20 Newsgroups data set (Hockey and Baseball): initial system training on 5 messages for each subject, 1,850 unlabelled messages to sort § 30 minutes to "make the system as accurate as possible" § Measures: accuracy, feedback given, mental model scores, perceived workload

  11. Results § Better accuracy: 85% for EluciDebug versus 77% at the end of the study § With less feedback: on average, EluciDebug users added 34.5 new features, removed 8.2 features, made 18.3 feature adjustments and interacted with 47 messages (while control users had to label an average of 182 messages) § No difference in workload § With better understanding: mental model score of 15.8 versus 10.4 § Correlation between mental model score and system accuracy at study end (i.e. the better you understand the system, the more accurate you can make it)

  12. Not all systems are the same § Not all systems should demand high user interaction § "Calm technology" [Weiser and Brown 1997] § "Constrained engagement" between the system and user [Yang and Newman 2013] § Not all explanations increase users' corrections of the system [Bussone et al. 2015] § Provide explanations to discourage user interactions?

  13. Traditional heating systems § [Diagram: the user controls the heat; components make heat and emit heat]

  14. Smart heating systems § [Diagram: the system controls the heat; components make heat and emit heat]

  15. How to discourage user interactions § Study on what users want to know [Skrebe and Stumpf 2017] § "Unexpected behaviour", e.g. preheating, demand response, overshooting temperature, etc. § Detailed reasons for decisions and benefits § Notify before action is taken (ideally) for added control § Design of UI: text first, graph on demand § Explanations fit with Explanatory Debugging (mostly) § [Screenshots: smart heating UI. A text notification reads "So you'll be comfortable <in the morning>, your home is preheating. It will take <X> hrs <Y> mins to reach <19°> by <18:45>, based on: the indoor and outdoor temperatures, how well your home holds its heat, your Comfort & Savings settings. <Plus, your electricity is currently at a lower rate, so preheating now is better value for you.> Preheating will continue until the start of the next IN period.", with Edit schedule / Show graph / Remove message actions. An on-demand graph shows indoor and outdoor temperature, occupancy (IN/OUT/ASLEEP) periods, and heat source cost and efficiency over the day]
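As a rough illustration of the "text first" design, a template-based generator could fill the explanation from the system's own decision state, keeping it sound. The sketch below uses hypothetical names and fields of my own devising, not the FREEDOM implementation:

```python
# Hypothetical template for the preheating explanation shown in the UI.
from dataclasses import dataclass

@dataclass
class PreheatDecision:
    target_temp: float       # degrees C the home should reach
    target_time: str         # start of the next IN (occupancy) period
    hours: int               # predicted time to reach the target
    minutes: int
    cheap_electricity: bool  # demand-response tariff state

def explain(d: PreheatDecision) -> str:
    text = (f"So you'll be comfortable, your home is preheating. "
            f"It will take {d.hours} hrs {d.minutes} mins to reach "
            f"{d.target_temp:.0f} degrees by {d.target_time}, based on: "
            f"the indoor and outdoor temperatures, how well your home "
            f"holds its heat, and your Comfort & Savings settings.")
    if d.cheap_electricity:
        text += (" Plus, your electricity is currently at a lower rate, "
                 "so preheating now is better value for you.")
    return text

print(explain(PreheatDecision(19.0, "18:45", 2, 30, True)))
```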

  16. Explanation evaluation study § 60 participants with simulated home heating scenarios § 4 conditions § Control: no explanation, normal heating UI only § Text-only: written explanation given, no graphical element § Graphical-only: graphical explanation given, no written element § Both: full explanation given, consisting of both graphical and written elements § Explanations increased understanding § Explanations did not increase trust in the system § Without explanations, participants were more concerned about the predictability of system actions § With any of the explanations, participants focused more on whether the system was doing the right thing

  17. Future work § Evaluation of smart heating UIs in a field trial § Publication on FREEDOM designs and evaluations § Workshop on Explainable Smart Systems (ExSS) at IUI 2018 § Research and dissemination of SCAMPI project results § Big questions: How do we design with AI as a new material? How can we craft better explanations for use in different AI systems? What are the impacts of explanations and user interactions with AI?
