  1. CMSC 20370/30370 Winter 2020 Evaluation – Qualitative Methods Case Study: Underserved Users Jan 15, 2020

  2. Quiz Time (5-7 minutes). Quiz on DreamGigs and Principles of Good Design

  3. Administrivia • GP0 due on Friday • GP project website space – Can host on personal web space offered by the department – Once we know all the groups, we will start scheduling the project presentations and specify the allotted time per presentation – You are expected to attend all project group presentation days • IA 2 – a paired assignment – is due next Friday – This is a design critique

  4. Today’s Agenda • Evaluating your design/prototype/system – Usability testing – Inspection methods – Qualitative techniques

  5.–7. USER-CENTERED DESIGN (cycle diagram: USER NEEDS, DESIGN/PROTOTYPE, IMPLEMENT, EVALUATE)

  8. Case Study: DreamGigs • Underserved job seekers • Based on interviews and a speed dating study of 10 initial design concepts • Created and evaluated 3 versions of DreamGigs – Used usability tests – Semi-structured interviews with job seekers and social workers • Used HCI empowerment framework to see how the prototype facilitates job seekers’ empowerment

  9. What do underserved job seekers need?

  10. How do job seekers react to DreamGigs?

  11. Does DreamGigs empower job seekers?

  12. These questions require evaluation

  13. TWO KINDS OF EVALUATION IN UCD (cycle diagram: USER NEEDS, DESIGN/PROTOTYPE, IMPLEMENT, EVALUATE)

  14. TWO KINDS OF EVALUATION IN UCD – FORMATIVE EVALUATION (marked on the cycle diagram)

  15. TWO KINDS OF EVALUATION IN UCD – SUMMATIVE EVALUATION (marked on the cycle diagram)

  16. Two kinds of evaluation • Formative – Helps us understand the problem and our users to inform our design • Summative – Helps us understand how well our design works and how to refine it

  17. Formative evaluation • What are the top usability issues? • What works well? • What does not work as well? • What are common errors? • Is the design improving over time?

  18. Summative • How does our design compare to similar products? • Is this design better than before? • How usable is this design? • What could be improved in this design?

  19. What will evaluation tell us? • Will the user like the product? • Is this product more efficient than past products? • How does this product compare to others? • What are the most pressing usability issues with this product? • What is the user experience when using the product overall? (Think of unboxing)

  20. Evaluation Planning • Determine the goals. • Explore the questions. • Choose the evaluation methods. • Identify the practical issues. • Decide how to deal with the ethical issues. • Evaluate, analyze, interpret and present the data.

  21. Case Study: DreamGigs • Any ethical issues here in terms of evaluation?

  22. Example Planning Questions • Why are we evaluating the design? • Who are the users/participants? • How many participants are needed? • What is the budget/timeline/resources? • What evaluation technique should we use? • What kind of data should we collect?

  23. How do you collect data? • Observe users • Ask users for their opinions – interviews/surveys/think aloud/log usage • Ask experts for their opinions • Test users’ performance

  24. So how do we evaluate our designs?

  25. First we need metrics for evaluation

  26. Usability metrics • Number of keystrokes • Time to complete a task • User satisfaction • Error rates • Physiological measures • Number of mouse clicks, etc. A usability metric reveals something about the interaction between the user and the system

  27. What are guiding principles for usability metrics? • Effectiveness – How well can the user complete the task? • Efficiency – What is the amount of effort to complete the task? • Satisfaction – How satisfied/dissatisfied was the user while completing the task? • Safety • Learnability • Memorability
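
To make the effectiveness/efficiency/satisfaction framing concrete, here is a minimal sketch of how such metrics could be computed from logged test sessions. The session records, field names, and rating scale are hypothetical examples, not data from the DreamGigs study.

```python
# Minimal sketch: computing basic usability metrics from logged test sessions.
# The session records and field names below are hypothetical examples.

from statistics import mean

sessions = [
    # one record per participant per task
    {"participant": "P1", "completed": True,  "seconds": 142, "errors": 2, "satisfaction": 4},
    {"participant": "P2", "completed": True,  "seconds": 98,  "errors": 0, "satisfaction": 5},
    {"participant": "P3", "completed": False, "seconds": 300, "errors": 6, "satisfaction": 2},
]

# Effectiveness: how many participants completed the task?
completion_rate = sum(s["completed"] for s in sessions) / len(sessions)

# Efficiency: average time on task, counting only successful attempts here.
avg_time = mean(s["seconds"] for s in sessions if s["completed"])

# Error rate and satisfaction (e.g., a 1-5 post-task rating).
avg_errors = mean(s["errors"] for s in sessions)
avg_satisfaction = mean(s["satisfaction"] for s in sessions)

print(f"Completion rate: {completion_rate:.0%}")
print(f"Avg time on task (successes): {avg_time:.0f}s")
print(f"Avg errors: {avg_errors:.1f}, avg satisfaction: {avg_satisfaction:.1f}/5")
```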

  28. Case Study: DreamGigs • Initially performed a usability test with 5 social workers and used a “think aloud” study • This means they asked users to use the storyboard and say what they are thinking at each step • Need to capture all this data • Also usually use a post-study questionnaire • Is this formative or summative? • What metrics could they have used?
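
The slides mention a post-study questionnaire without naming one; a common choice in usability tests is the System Usability Scale (SUS). Purely as an illustration (the responses are made up, and SUS may not be the questionnaire the DreamGigs authors actually used), the standard SUS scoring rule looks like this:

```python
# Minimal sketch of System Usability Scale (SUS) scoring.
# The responses below are invented; SUS is used here only as an example of a
# post-study questionnaire.

def sus_score(responses):
    """responses: list of 10 ratings on a 1-5 scale, in questionnaire order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded (score = r - 1),
        # even-numbered items are negatively worded (score = 5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scales the 0-40 raw score to 0-100

participants = {
    "P1": [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
    "P2": [3, 3, 4, 2, 3, 2, 4, 3, 3, 2],
}
for pid, resp in participants.items():
    print(pid, sus_score(resp))
```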

  29. Basic Usability/Lab Study

  30. What are user experience metrics? • Pleasurable • Rewarding • Fun • Provocative • Empowering • Enlightening

  31. Case Study: DreamGigs • Not just whether the tool will serve a purpose but also about… • How participants felt using it – E.g. seeing that you can explore jobs that you did not think of for your skill set – “expanding your horizons” • Used HCI empowerment framework for this part

  32. Other Evaluation Methods • Inspection based methods – Based on skills and expertise of evaluators (no users) • Empirical methods – Test with real users

  33. Inspection Methods • Known as Expert Review/discount usability methods 1. Heuristic evaluation (developed by Jakob Nielsen) 2. Cognitive Walkthroughs

  34. Heuristic evaluation • Assess interface based on predetermined criteria • Have small set of evaluators examine the interface and judge compliance with recognized usability principles • Different evaluators find different problems • Aggregate findings • Use findings to fix issues/redesign • Can be used throughout user-centered design process
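
As a rough illustration of the "aggregate findings" step, here is a minimal sketch that merges several evaluators' problem lists and ranks problems by mean severity and by how many evaluators found them. The evaluators, problems, and severity ratings are invented; only the heuristic names are standard.

```python
# Minimal sketch: aggregating heuristic evaluation findings from several
# evaluators. Problems and severity ratings (0-4) are invented examples.

from collections import defaultdict
from statistics import mean

findings = [
    # (evaluator, problem id, violated heuristic, severity 0-4)
    ("E1", "no-undo-on-skill-removal", "User control and freedom", 3),
    ("E2", "no-undo-on-skill-removal", "User control and freedom", 4),
    ("E2", "jargon-in-volunteer-labels", "Match between system and real world", 2),
    ("E3", "jargon-in-volunteer-labels", "Match between system and real world", 3),
    ("E3", "no-progress-feedback", "Visibility of system status", 2),
]

by_problem = defaultdict(list)
for evaluator, problem, heuristic, severity in findings:
    by_problem[(problem, heuristic)].append(severity)

# Rank problems by mean severity, then by how many evaluators found them.
ranked = sorted(
    by_problem.items(),
    key=lambda item: (mean(item[1]), len(item[1])),
    reverse=True,
)
for (problem, heuristic), severities in ranked:
    print(f"{problem} [{heuristic}]: found by {len(severities)} evaluator(s), "
          f"mean severity {mean(severities):.1f}")
```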

  35. Cognitive Walkthrough • Put yourself in the shoes of the user • Construct carefully designed tasks to perform on system spec/mockups • Walk through activities required to go from one screen to another (cognitive and operational) • Review actions needed for each task • Attempt to predict how users will behave and what they will encounter
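
One possible way to record such a walkthrough, sketched here as an illustration: each step of a task is checked against the commonly used walkthrough questions, and any negative answer is flagged as a potential breakdown. The task, steps, and answers below are invented.

```python
# Minimal sketch: recording a cognitive walkthrough step by step.
# The task, steps, and answers are invented; the four questions follow the
# commonly used walkthrough checklist.

QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect they want?",
    "If the correct action is taken, will the user see that progress is being made?",
]

walkthrough = {
    "task": "Add a skill to your profile",
    "steps": [
        {
            "action": "Open the skills screen from the home page",
            "answers": ["yes", "no - the entry point is buried in a menu", "yes", "yes"],
        },
        {
            "action": "Tap 'Add skill' and pick a skill from the list",
            "answers": ["yes", "yes", "yes", "yes"],
        },
    ],
}

# Flag every step where any of the four questions got a negative answer.
for step in walkthrough["steps"]:
    problems = [q for q, a in zip(QUESTIONS, step["answers"]) if a.startswith("no")]
    if problems:
        print(f"Potential breakdown at: {step['action']}")
        for q in problems:
            print(f"  - {q}")
```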

  36. Inspection Methods • Pros? – Faster • HE is 1-2 hours per evaluator vs days/weeks – HE + CW do not require interpreting user data – Better than no evaluation • Cons? – HE may miss problems + misidentify issues – User testing more accurate

  37. Field Study Example

  38. Case Study: DreamGigs • Field study? – Pros? – Cons?

  39. Other Popular Techniques

  40. Case Study: DreamGigs • How else could they have evaluated DreamGigs? • What limitations are there to the work as it is presented? • Any other questions on DreamGigs?

  41. Summary • Evaluation is a key part of user-centered design • Type depends on goals and system being tested • Choose depending on resources and what you want feedback on • Inspection methods do not require as many resources as user testing • They are “discount” usability methods but not without limitations • Field studies require a lot more work and use surveys, interviews, think alouds, and logs to gather data

  42. Coming up next class • Project team discussions • Come to class – Ensure that your group checks in with one of the TAs on your project progress – TAs have a short checkpoint form – Q&A with TAs • Turn in GP0

  43. Get in touch: Office hours: Fridays 2-4pm (Sign up in advance) or by appointment, JCL 355 Email: marshini@uchicago.edu
