CMSC 20370/30370, Winter 2020
Evaluation – Qualitative Methods Case Study: Underserved Users
Jan 15, 2020
Quiz Time (5-7 minutes)
• Quiz on DreamGigs and the principles of good design
Administrivia
• GP0 due on Friday
• GP project website space
  – Can host on personal web space offered by the department
  – Once we know all the groups, we will start scheduling the project presentations and specify the allotted time per presentation
  – You are expected to attend all project group presentation days
• IA 2, a paired assignment, is due next Friday
  – This is a design critique
Today’s Agenda
• Evaluating your design/prototype/system
  – Usability testing
  – Inspection methods
  – Qualitative techniques
USER-CENTERED DESIGN
[Diagram: the user-centered design cycle: User Needs -> Design/Prototype -> Implement -> Evaluate]
Case Study: DreamGigs
• Underserved job seekers
• Based on interviews and a speed-dating study of 10 initial design concepts
• Created and evaluated 3 versions of DreamGigs
  – Usability tests
  – Semi-structured interviews with job seekers and social workers
• Used the HCI empowerment framework to see how the prototype facilitates job seekers’ empowerment
What do underserved job seekers need?
How do job seekers react to DreamGigs?
Does DreamGigs empower job seekers?
These questions require evaluation
TWO KINDS OF EVALUATION IN UCD
[Diagram: the UCD cycle annotated with formative evaluation informing the design/prototype stage and summative evaluation assessing the implemented system]
Two kinds of evaluation
• Formative
  – Helps us understand the problem and our users to inform our design
• Summative
  – Helps us understand how well our design works and how to refine it
Formative evaluation
• What are the top usability issues?
• What works well?
• What does not work as well?
• What are common errors?
• Is the design improving over time?
Summative
• How does our design compare to similar products?
• Is this design better than before?
• How usable is this design?
• What could be improved in this design?
What will evaluation tell us?
• Will the user like the product?
• Is this product more efficient than past products?
• How does this product compare to others?
• What are the most pressing usability issues with this product?
• What is the user experience when using the product overall? (Think of unboxing)
Evaluation Planning
• Determine the goals.
• Explore the questions.
• Choose the evaluation methods.
• Identify the practical issues.
• Decide how to deal with the ethical issues.
• Evaluate, analyze, interpret, and present the data.
Case Study: DreamGigs
• Any ethical issues here in terms of evaluation?
Example Planning Questions
• Why are we evaluating the design?
• Who are the users/participants?
• How many participants are needed?
• What is the budget/timeline/resources?
• What evaluation technique should we use?
• What kind of data should we collect?
How do you collect data?
• Observe users
• Ask users for their opinions
  – Interviews/surveys/think-alouds/log usage (see the logging sketch below)
• Ask experts for their opinions
• Test users’ performance
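As one illustration of the "log usage" option, here is a minimal sketch of an event logger a prototype could embed. The file name, event vocabulary, and participant IDs are assumptions for illustration, not anything DreamGigs actually used:

```python
# Minimal usage-logging sketch (illustrative only). Appends timestamped
# interaction events to a CSV file for later analysis.
import csv
import time

LOG_PATH = "usage_log.csv"  # hypothetical output file

def log_event(participant_id: str, event: str, detail: str = "") -> None:
    """Append one interaction event (e.g., click, page view) to the log."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), participant_id, event, detail])

# Example: instrumenting a prototype's handlers during a study session
log_event("P01", "click", "search_jobs_button")
log_event("P01", "page_view", "skills_summary")
```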
So how do we evaluate our designs?
First we need metrics for evaluation
Usability metrics
• Number of keystrokes
• Time to complete a task
• User satisfaction
• Error rates
• Physiological measures
• Number of mouse clicks, etc.
A usability metric reveals something about the interaction between the user and the system.
What are guiding principles for usability metrics?
• Effectiveness
  – How well can the user complete the task?
• Efficiency
  – What is the amount of effort to complete the task?
• Satisfaction
  – How satisfied/dissatisfied was the user while completing the task?
• Safety
• Learnability
• Memorability
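To make effectiveness and efficiency concrete, here is a minimal Python sketch that derives them, along with the raw metrics from the previous slide, from per-task session records. All field names and data are hypothetical:

```python
# Sketch: deriving simple usability metrics from per-task session records.
# Real studies would pull these from instrumented logs or observer notes.
from statistics import mean

sessions = [
    # (participant, task_completed, seconds_on_task, error_count, clicks)
    ("P01", True, 95.0, 1, 14),
    ("P02", True, 142.5, 3, 22),
    ("P03", False, 180.0, 5, 31),
]

completion_rate = mean(1.0 if s[1] else 0.0 for s in sessions)  # effectiveness
avg_time = mean(s[2] for s in sessions)                         # efficiency
avg_errors = mean(s[3] for s in sessions)                       # error rate
avg_clicks = mean(s[4] for s in sessions)                       # interaction cost

print(f"Task completion rate: {completion_rate:.0%}")
print(f"Mean time on task:    {avg_time:.1f}s")
print(f"Mean errors per task: {avg_errors:.1f}")
print(f"Mean clicks per task: {avg_clicks:.1f}")
```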
Case Study: DreamGigs
• Initially performed a usability test with 5 social workers using a “think aloud” protocol
• This means they asked users to use the storyboard and say what they are thinking at each step
• Need to capture all this data
• Also usually use a post-study questionnaire (see the scoring sketch below)
• Is this formative or summative?
• What metrics could they have used?
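The slides mention a post-study questionnaire without naming one; the System Usability Scale (SUS) is a common choice in usability testing. A minimal scoring sketch, assuming the standard ten-item, five-point format (the sample responses are hypothetical):

```python
# Sketch: scoring the System Usability Scale (SUS), a common post-study
# questionnaire. Not necessarily what the DreamGigs study used.
def sus_score(responses):
    """responses: ten 1-5 Likert ratings for the standard SUS items."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (1st, 3rd, ...) are positively worded:
        # contribution = response - 1. Even-numbered items are negatively
        # worded: contribution = 5 - response.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # rescale to 0-100

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # hypothetical participant -> 85.0
```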
Basic Usability/Lab Study
What are user experience metrics?
• Pleasurable
• Rewarding
• Fun
• Provocative
• Empowering
• Enlightening
Case Study: DreamGigs
• Not just whether the tool will serve a purpose but also about…
• How participants felt using it
  – E.g., seeing that you can explore jobs that you did not think of for your skill set: “expanding your horizons”
• Used the HCI empowerment framework for this part
Other Evaluation Methods
• Inspection-based methods
  – Based on the skills and expertise of evaluators (no users)
• Empirical methods
  – Test with real users
Inspection Methods
• Also known as expert review or “discount usability” methods
1. Heuristic evaluation (developed by Jakob Nielsen)
2. Cognitive Walkthroughs
Heuristic evaluation
• Assess the interface based on predetermined criteria
• Have a small set of evaluators examine the interface and judge compliance with recognized usability principles
• Different evaluators find different problems
• Aggregate findings (see the sketch below)
• Use findings to fix issues/redesign
• Can be used throughout the user-centered design process
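A minimal sketch of the “aggregate findings” step, assuming Nielsen-style heuristic labels and the common 0-4 severity scale; the evaluators, issues, and ratings shown are hypothetical:

```python
# Sketch: merging heuristic-evaluation findings across evaluators and
# ranking them by mean severity to prioritize redesign work.
from collections import defaultdict
from statistics import mean

findings = [
    # (evaluator, heuristic, issue, severity 0-4)
    ("E1", "Visibility of system status", "No feedback after job search", 3),
    ("E2", "Visibility of system status", "No feedback after job search", 4),
    ("E2", "User control and freedom",    "Cannot undo skill selection",  2),
    ("E3", "Error prevention",            "Easy to submit empty profile", 3),
]

by_issue = defaultdict(list)
for evaluator, heuristic, issue, severity in findings:
    by_issue[(heuristic, issue)].append(severity)

# Highest mean severity first; also report how many evaluators found each.
for (heuristic, issue), sevs in sorted(
        by_issue.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"[{mean(sevs):.1f}] {heuristic}: {issue} "
          f"(reported by {len(sevs)} evaluator(s))")
```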
Cognitive Walkthrough
• Put yourself in the shoes of the user
• Construct carefully designed tasks to perform on system specs/mockups
• Walk through the activities required to go from one screen to another (cognitive and operational)
• Review the actions needed for each task
• Attempt to predict how users will behave and what they will encounter
Inspection Methods
• Pros?
  – Faster: HE takes 1-2 hours per evaluator vs. days/weeks for user testing
  – HE and CW do not require interpreting user data
  – Better than no evaluation
• Cons?
  – HE may miss problems and misidentify issues
  – User testing is more accurate
Field Study Example
Case Study: DreamGigs
• Field study?
  – Pros?
  – Cons?
Other Popular Techniques
Case Study: DreamGigs
• How else could they have evaluated DreamGigs?
• What limitations are there to the work as it is presented?
• Any other questions on DreamGigs?
Summary
• Evaluation is a key part of user-centered design
• The type of evaluation depends on your goals and the system being tested
• Choose depending on resources and what you want feedback on
• Inspection methods do not require as many resources as user testing
• They are “discount” usability methods but not without limitations
• Field studies require a lot more work and use surveys, interviews, think-alouds, and logs to gather data
Coming up next class
• Project team discussions
• Come to class
  – Ensure that your group checks in with one of the TAs on your project progress
  – TAs have a short checkpoint form
  – Q&A with TAs
• Turn in GP0
Get in touch
Office hours: Fridays 2-4pm (sign up in advance) or by appointment, JCL 355
Email: marshini@uchicago.edu