Today's lecture: User Testing (CSE 510)


1. Today's lecture
! User Testing
! Overview
! Examples
! Techniques
! Case study

User Testing
CSE 510
Richard Anderson, Ken Fishkin

User testing
! Why do you want to test users?
! Products
! Science
Notes: Product development vs. scientific studies – both are valid; don't knock development. Motivation: answer a scientific question, promote a product, develop a product, evaluate a product, understand a work process, market research. Emphasize the different goals and methods. Obvious point – the goals of testing influence methodology: products will be evaluated in the marketplace, while scientific results must stand on their own and aim to be broader than the particular artifact.

What is user testing?
Notes: Laundry list – interviews, observations, surveys, logs, measurements, video analysis, verbal protocols, experiments, artifact examination.

Damned if you do, damned if you don't
! You will be critical of almost every user study that you read
! And almost everyone will be critical of yours, too
"The food is awful and the portions are too small"

2. Why are user studies hard?
! Achieving statistical significance (a worked sketch follows this page)
! Confounding factors
! Often trying to measure a low order effect
! Users are not always easy to deal with
! Experimental design is not easy
! Large resource requirements

User studies for IO devices
! How would you evaluate the NYU Quikwriting system?
Notes: This is a speculative idea. What do you want to figure out? One obvious challenge is that the device is slow to learn. Can an expert use it effectively? Is it learnable? Why or why not?

User studies for Tablet PC grading tool (paperless grading)
! TAs annotate CS1 assignments using a Tablet PC
Notes: Quality of grading, efficiency of grading, design and use of the annotation system.

Ethical Considerations
! Do not harm the participants
Notes: Broad definition of harm – pain, embarrassment, discomfort. Do not rant about HSB. If HSB comes up in discussion: there have been serious problems, and HSB is often oriented towards medicine. Bring up the Hawthorne effect.

Informed consent
! Participant must be given full information
! Ability to opt out (without penalty)
! Free from coercion
Notes: Optional – HS form – "Is deception used in this study?" Levels of deception.

Privacy
! HSB very concerned about participant privacy
! Concerns about data linked to individuals
! Access to records
! Retention of information
Notes: Audio and video recordings, logged data.
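To make the statistical-significance bullet above concrete, here is a minimal sketch of how a comparison like the Quikwriting question might be analyzed. The participant counts, timing values, condition names, and the between-subjects design are all assumptions made for illustration; they are not from the lecture or any actual study.

# Hypothetical analysis sketch: comparing text-entry times for two conditions.
# All data values below are invented for illustration.
from scipy import stats

# Seconds to enter a fixed phrase, one value per participant per condition.
quikwriting_times = [41.2, 38.5, 45.0, 52.3, 47.8, 39.9, 44.1, 50.2]
soft_keyboard_times = [33.4, 36.1, 30.8, 35.5, 38.2, 31.9, 34.7, 37.0]

# Independent two-sample t-test for a between-subjects design.
t_stat, p_value = stats.ttest_ind(quikwriting_times, soft_keyboard_times)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

Even in this toy form, the slide's caveats apply: with a handful of participants a small effect rarely reaches significance, and learning effects or participant variability are confounds a simple t-test does not control for.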

3. Basic techniques
! Surveys
! Unstructured interviews
! Semi-structured interviews
Notes: Emphasize that other disciplines have done a lot of work on these. It's easy to get these wrong.

Mr. Wizard Testing
! Testing before building
Notes: Examples of mockups, paper-based studies, the classic Clippy study.

Verbal Protocols
! Need to know what users are thinking, not just what they are doing
! Ask users to talk while performing tasks
! tell us what they are thinking
! tell us what they are trying to do
! tell us questions that arise as they work
! tell us things they read
! Make a recording or take good notes

Thinking Aloud (cont.)
! Prompt the user to keep talking
! "tell me what you are thinking"
! Only help on things you have pre-decided
! keep track of anything you do give help on
! Recording
! use a digital watch/clock
! take notes, plus if possible
! record audio and video (or even event logs – a minimal logging sketch follows this page)
! make sure you can tell what they were doing

Design Experiments
! Qualitative
! Inform design of educational intervention
Notes: Done in a real setting.

Ethnography
! Immersive study
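As mentioned in the Thinking Aloud slide above, event logs can supplement audio and video. The sketch below shows one minimal way to timestamp observed actions, utterances, and pre-decided help so they can later be matched to what the participant was doing; the function and field names are illustrative assumptions, not part of the lecture material.

# Minimal think-aloud session logger: one timestamped row per observed event.
import csv
from datetime import datetime

def log_event(writer, participant, kind, detail):
    """Append one timestamped event (action, utterance, or help given)."""
    writer.writerow({
        "timestamp": datetime.now().isoformat(timespec="seconds"),
        "participant": participant,
        "kind": kind,      # e.g. "action", "utterance", "help_given"
        "detail": detail,
    })

with open("session_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "participant", "kind", "detail"])
    writer.writeheader()
    log_event(writer, "P01", "action", "opened the grading tool")
    log_event(writer, "P01", "utterance", "I'm not sure where the eraser is")
    log_event(writer, "P01", "help_given", "pointed to the toolbar (pre-decided help item)")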

4. Case study
! Classroom Feedback System
! Student devices give real-time feedback to the lecturer
! Feedback associated with slide content
Notes: Limit discussion of the system.

Questions
! Does this work?
! Does this improve the large lecture class?
! Is this information valuable for instructors?
! Will students give useful feedback?
Notes: Questions of interest – but avoid discussion of them. Why we did not look at "learning outcomes". Main point of discussion – the mechanics of classroom evaluation. A major undertaking, even for a small number of classes.

Methodology
! Design experiment
! Gather information from multiple sources
! Study the application in the real setting
! Use results to alter the design of the intervention
! Qualitative, not quantitative

The studies
! Pen and paper
! CSE 100, 4 classes observed, 1 with CFS
! CSE 142, 20 classes observed, 6 with CFS
! Classroom experiments, 10 students with laptops for feedback

Methodology
! Surveys
! Instructor interviews
! With tape recording and transcript
! Detailed classroom observations
! Instructor and student utterances
! System logging

Data analysis
! Survey tabulation
! Transcription
! Coding of observations
! Analysis of logs
! Correlating events (a sketch of this step follows this page)
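The "correlating events" step in the Data analysis slide can be illustrated with a small sketch that lines up timestamped CFS system-log entries with coded classroom observations. The file names, column names, and the 30-second window below are assumptions for illustration, not the project's actual analysis code.

# Hypothetical correlation pass: find system-log events near each coded observation.
import csv
from datetime import datetime, timedelta

def load_events(path):
    """Read a CSV with an ISO-format 'time' column into a list of dicts."""
    with open(path, newline="") as f:
        return [{**row, "time": datetime.fromisoformat(row["time"])}
                for row in csv.DictReader(f)]

system_log = load_events("cfs_log.csv")          # e.g. student feedback clicks
observations = load_events("observations.csv")   # e.g. coded instructor utterances (with a 'code' column)

WINDOW = timedelta(seconds=30)
for obs in observations:
    nearby = [ev for ev in system_log if abs(ev["time"] - obs["time"]) <= WINDOW]
    if nearby:
        print(f"{obs['time']}  {obs['code']}  <- {len(nearby)} feedback event(s) within 30 s")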

5. Results
! Raw data analysis
! Basic class had a low interaction rate
! The system received modest usage – but did increase the communication rate
! Detailed analysis
! Identified specific episodes of interaction
! Cases where the instructor used feedback
! Discovery of usage patterns
! Feedback lag
! Anticipatory feedback

Next Week
! February 17. President's Day, no class
! February 19. Tangible Interfaces
! Hiroshi Ishii and Brygg Ullmer, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms". CHI '97 Conference Proceedings, March 1997.
! Roy Want, Kenneth Fishkin, Anuj Gujar, and Beverly Harrison, "Bridging Physical and Virtual Worlds with Electronic Tags". CHI '99 Conference Proceedings, April 1999.
