CS 102 Human-Computer Interaction, Lecture 10: Heuristic Evaluation (Monsoon 2015)
Course updates: The Ubicomp reading assignment is on the course web site, due next Monday, Oct 5, before class. User understanding phase: writeup due Monday, Oct 12. Hardware requirements for projects? Be sure to review the lecture slides and references. An experiment for us to receive anonymous feedback: https://goo.gl/Vre9dT
Recap
Human factors in CS: A human-centered design mindset, and skills like user observation, interviews, studies, and prototype-and-feedback, are essential. In the last 25 years, human factors have entered the loop in nearly all areas of CS, e.g. programming languages, computer architecture, operating systems, debugging, databases, networks, security, ML, … And of course commercial products as well!
Heuristic evaluation
Heuristic evaluation: A “discount” usability engineering method. A small set of evaluators is independently asked to find “bugs” in an interface (usually w.r.t. recognized usability principles). Different evaluators tend to find different problems. Question: how many evaluators should we use? [Heuristic Evaluation of User Interfaces]
Experimental setup: 4 interfaces with known problems (the first 2 were screenshots, the last 2 were live systems). [Heuristic Evaluation of User Interfaces]
Who finds what? Savings experiment (37 subjects). [Heuristic Evaluation of User Interfaces]
Problems found: Savings experiment (37 subjects). [Heuristic Evaluation of User Interfaces]
Cost to benefit. [How to conduct a Heuristic Evaluation]
Takeaways: 5 evaluators will find roughly 2/3 of the problems. The technique can be used even without a live system (e.g. on screenshots).
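The "5 evaluators find about 2/3" rule of thumb follows from Nielsen and Landauer's model, found(i) = N(1 − (1 − λ)^i), where λ is the fraction of problems a single evaluator finds. A minimal sketch in Python, assuming an illustrative λ of 0.2 chosen only so the numbers reproduce the takeaway above (the value is not from these slides):

```python
# Nielsen-Landauer model: fraction of usability problems found by i evaluators,
# assuming each evaluator independently finds a fraction lam of all problems.
def fraction_found(i, lam=0.2):      # lam = 0.2 is an illustrative assumption
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10, 15):
    print(f"{i:2d} evaluators -> {fraction_found(i):.0%} of problems found")
# With lam = 0.2, five evaluators find about 67%, i.e. the ~2/3 in the takeaway.
```

Note the diminishing returns: under this assumption, going from 5 to 10 evaluators adds only around 22 percentage points, which is why the cost-to-benefit curve favors a small panel.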
Which heuristics? Analysis of 249 usability problems in 11 projects; each problem was checked against 7 sets of guidelines (101 heuristics in total): the Xerox Star, Macintosh, and SunSoft usability guidelines, plus Molich and Nielsen, Holcomb and Thorp, Polson and Lewis, and Carroll and Rosson. [Enhancing the Explanatory Power of Usability Heuristics]
Broad categories: What are the broad sets of usability problems? 7 buckets:
• Error prevention
• Visibility of system status
• Recognition over recall
• Match between system and real world
• Flexibility and efficiency
• User control & freedom
• Consistency and standards
[Enhancing the Explanatory Power of Usability Heuristics]
Top 10 heuristics: Which heuristics catch how many problems? [Enhancing the Explanatory Power of Usability Heuristics]
Best practices: Typically 1-2 hour sessions. Either the evaluator writes up a report or is observed directly (it may be OK for the observer to provide help). The evaluator goes through the interface several times and compares its elements against a list of heuristics (usability principles). Heuristics are based on general + domain-specific guidelines (e.g. derived from competitive analysis). [How to conduct a heuristic evaluation]
Nielsen’s 10 heuristics
Heuristic #1: Visibility of system status
Heuristic #1: General guideline (see http://asktog.com/atc/principles-of-interaction-design/)
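A hypothetical illustration of heuristic #1, not taken from the slides: keep the user informed about what the system is doing, e.g. by reporting progress during a long operation instead of leaving the screen silent. File names and timings below are invented for the sketch.

```python
import time

def copy_files(files):
    """Report status while working so the user knows the system hasn't hung."""
    total = len(files)
    for done, name in enumerate(files, 1):
        time.sleep(0.1)                                  # stand-in for real work
        print(f"Copying {name} ... {done}/{total} ({done * 100 // total}%)")
    print("Done: all files copied.")

copy_files(["notes.txt", "slides.pdf", "data.csv"])
```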
Heuristic #2: Match between system and the real world. Example: a recent citizens' survey asking "How important is ...". Use an outsider to test your model of the real world.
Heuristic #3: User control and freedom, e.g. a cancel button. Counter-examples: "please don't close the browser window until the bank transaction is complete", "You weren't supposed to press that button on this page!"
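A hypothetical sketch of heuristic #3 (the class and method names are illustrative, not from the slides): keep an undo history so that no action is a dead end and the user stays in control.

```python
class TextEditor:
    """Tiny model of user control and freedom: every edit can be undone."""
    def __init__(self, text=""):
        self.text = text
        self.history = []            # previous states, enabling undo

    def append(self, more):
        self.history.append(self.text)
        self.text += more

    def undo(self):
        if self.history:             # undo is always safe, never an error
            self.text = self.history.pop()

doc = TextEditor("Hello")
doc.append(", world")
print(doc.text)   # Hello, world
doc.undo()
print(doc.text)   # Hello
```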
Heuristic #4: Consistency and standards. Use the same look & feel, use consistent terminology, and keep actions, widget locations, etc. predictable.
Heuristic #5: Error prevention. Mistakes (wrong mental model) vs. slips (errors in execution). Preventing slips: add constraints that make it difficult to commit them, offer suggestions/auto-complete, use good defaults, and be forgiving in syntax.
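A minimal sketch of "be forgiving in syntax" and "use good defaults", as an assumed example rather than anything from the slides: accept several common date spellings so a small slip in format does not turn into an error.

```python
from datetime import date, datetime

# Accept several common spellings instead of forcing one exact format.
ACCEPTED_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%d %b %Y")

def parse_date(text, default=None):
    """Return a date for any accepted spelling; fall back to a default on failure."""
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt).date()
        except ValueError:
            continue
    return default                    # good default instead of a hard error

print(parse_date("2015-10-05"))               # ISO style
print(parse_date("5 Oct 2015"))               # forgiving alternative
print(parse_date("next week", date.today()))  # falls back to the default
```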
Heuristic #5 (cont.): http://www.nngroup.com/articles/slips/
Heuristic #5 (cont.): For client-server apps, client-side checking is OK, but in no case should security or data integrity depend on client-side checks!
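A sketch of that point with hypothetical function names and limits: the client may pre-check input for fast feedback, but the server must repeat the check, because a tampered or bypassed client can send anything.

```python
def client_side_check(amount: str) -> bool:
    # Convenience only: gives the user immediate feedback in the UI.
    return amount.isdigit() and 0 < int(amount) <= 100_000

def server_side_transfer(amount: str) -> str:
    # Authoritative: never assume the client validated anything.
    if not amount.isdigit() or not (0 < int(amount) <= 100_000):
        return "rejected: invalid amount"
    return "transfer queued"

print(server_side_transfer("50000"))       # accepted
print(server_side_transfer("-1; DROP"))    # rejected even if a modified client sent it
```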
Heuristic #6: Recognition rather than recall. [How to conduct a heuristic evaluation]
Heuristic #6 (cont.): Recognition rather than recall, e.g. amazon.com. [How to conduct a heuristic evaluation]
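A hypothetical sketch of heuristic #6 (the item names are invented): let users recognize an item in a visible list, as Amazon's recently-viewed items do, instead of recalling an identifier from memory.

```python
recent_orders = ["#1042 USB keyboard", "#1038 HDMI cable", "#1021 Desk lamp"]

# Recognition: show the options so the user only has to pick one.
print("Reorder which item?")
for i, order in enumerate(recent_orders, 1):
    print(f"  {i}. {order}")

choice = 2                                   # e.g. the user picks option 2
print("Reordering:", recent_orders[choice - 1])
# Recall would instead require typing "#1038" from memory.
```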
Heuristic #7: Flexibility and efficiency of use. [How to conduct a heuristic evaluation]
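A hypothetical sketch of heuristic #7 (names invented): expose the same action through a discoverable menu for novices and a keyboard accelerator for frequent users.

```python
def save_document():
    print("Document saved.")

# Two routes to the same command: discoverable for novices, fast for experts.
menu_items = {"File > Save": save_document}
shortcuts = {"Ctrl+S": save_document}

menu_items["File > Save"]()   # novice path
shortcuts["Ctrl+S"]()         # expert accelerator
```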
Heuristic #8: Aesthetic and minimalist design. Good graphic design (a catch-all); keep instructions short.
Heuristic #9: Help users recognize, diagnose, and recover from errors. Think from the user's point of view: provide actionable advice, restate exactly what happened, and shift blame to yourself. The error has to be understandable by the user and/or operator; hide technical details (stack trace) until requested.
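A minimal sketch of heuristic #9 (the message text and function are illustrative, not from the slides): restate what happened in the user's terms, give an actionable next step, take the blame, and keep the stack trace hidden until requested.

```python
import traceback

def report_error(exc: Exception, show_details: bool = False) -> str:
    message = (
        "We couldn't save your changes because the connection was lost.\n"
        "Your edits are still here - please check your network and press 'Retry'."
    )
    if show_details:   # technical details only when the user asks for them
        message += "\n\nTechnical details:\n" + "".join(
            traceback.format_exception(type(exc), exc, exc.__traceback__)
        )
    return message

try:
    raise ConnectionError("socket timeout")
except ConnectionError as err:
    print(report_error(err))              # what most users see
    # report_error(err, show_details=True) would reveal the stack trace
```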
Heuristic #9 (cont.)
Heuristic #10: Provision of help and documentation. Help should be: searchable, context-sensitive, task-oriented, concrete, and short.
In-class exercise
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Preventing errors
6. Recognition over recall
7. Flexibility and efficiency
8. Aesthetic and minimalist design
9. Recover from errors
10. Help and Documentation