

  1. BIO / PRESENTATION / PAPER. T18, September 30, 2004, 11:30 AM. Refining Requirements with Test Cases. Tanya Martin-McClellan, Texas Mutual Insurance Company. Better Software Conference & EXPO, September 27-30, 2004, San Jose, CA, USA

  2. Tanya Martin-McClellan Tanya Martin-McClellan is a Senior QA Specialist with Texas Mutual Insurance Company. She has played many roles in software development: tester, technical writer, project manager, product designer, technical support, and trainer. In those roles, she has seen just about everything that can go wrong do so. She collects requirements horror stories for entertainment and moral support. ☺ In her spare time, she blogs on LiveJournal as “happytester” and plays Dance Dance Revolution. Contact information: (512) 505-6275, tmartinm@texasmutual.com, 221 W 6th St, Ste 300, Austin, TX 78701.

  3. Refining Requirements with Test Cases Tanya Martin-McClellan Senior QA Specialist Texas Mutual Insurance Company

  4. Why are requirements important? “The hardest single part of building a software system is deciding what to build. No other part of the work so cripples the resulting system if done wrong. No other part is more difficult to rectify later.” No Silver Bullet: Essence and Accidents of Software Engineering, Frederick P. Brooks, Jr., Addison-Wesley

  5. The challenge If we want to ensure that the requirements are as solid as they can be before we start testing, we have to start challenging them earlier. But how much earlier? [Timeline figure: Project Begins → Requirements Agreed On → Design and Prototype → Construction → Testing Begins → Project Ends. Simplified waterfall methodology for reference.]

  6. Learning objectives • Spot ambiguity in requirements • Spot unaddressed requirement issues • Communicate issues effectively through test cases

  7. Assumptions • You have written requirements to work from • You get to plan your tests prior to the testing phase • You share your test designs with the rest of the project team • Your projects use a methodology that is known, documented, or otherwise accessible to you.

  8. Spotting ambiguity

  9. Weasel Words

  10. Requirement Short Checklist ☐ Does it specify how the user will interact with the application? ☐ Does it specify an expected system response? ☐ Does it cover all expected exception handling and alternate paths? ☐ Could you write a test case that would measure whether the requirement was met?
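To illustrate the last checklist item, here is a minimal sketch of what a measurable test case might look like. The requirement, the function names, and the error code E-102 are all hypothetical stand-ins invented for this example, not part of the presentation.

```python
# Hypothetical requirement: "When the user submits a claim with a blank
# policy number, the system responds with error message E-102."
# submit_claim() is an invented stand-in for the system under test.

def submit_claim(policy_number: str) -> str:
    """Stand-in for the system under test."""
    if not policy_number.strip():
        return "E-102: Policy number is required."
    return "OK"

def test_blank_policy_number_rejected():
    # User interaction is specified: submit with a blank field (checklist item 1)
    response = submit_claim("   ")
    # Expected system response is explicit and measurable (checklist items 2 and 4)
    assert response.startswith("E-102")

def test_valid_policy_number_accepted():
    # The alternate (happy) path is also specified (checklist item 3)
    assert submit_claim("TX-12345") == "OK"
```

If a requirement cannot be pinned down to a concrete, checkable assertion like these, that is usually a sign the requirement itself is still ambiguous.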

  11. Spotting unaddressed requirements

  12. Why would there be unaddressed requirements? • Failures of system or process • Failures of training • Specific failures

  13. How to miss functional requirements • Keep the users out of it • Lose track of requirements between gathering them and construction • Don’t think things through to their logical conclusions

  14. How to miss training and documentation requirements • Don’t ask about training or documentation • Don’t ask who the audience is • Don’t ask what the goal is

  15. How to miss usability requirements • Keep user input out of this – leave it to the architect or the developers • Don’t have a prototype • Don’t listen to prototype feedback if you do have a prototype • Don’t plan any usability testing • If you do plan usability testing, don’t account for time to make changes based on the results or retesting

  16. How to miss performance requirements • Don’t talk about it with users or IT operations • Consider it to be secondary to functionality • Don’t plan any performance testing in the project plan • Don’t account for additional cycles of improvement and retesting if you do

  17. How to miss security requirements • Consider it secondary to functionality • Keep the users out of it • Ignore the possibility of deliberate misuse • Ignore your confidential or protected data

  18. How to miss regulatory requirements • Don’t talk to your legal department about the project • Don’t read any legislative updates that relate to your industry

  19. How to spot what’s missing • Use a checklist of questions • Think every requirement through to its logical conclusion. Are the requirements suggesting something that isn’t explicitly stated?

  20. Communicate issues effectively through test cases

  21. Two ways to do this • Use a notes, risks, assumptions, or other related section of your document to communicate where the gaps are. • Make an assumption about what the answer should be to your question. Write your test case based on that assumption, and document the assumption.
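The second approach above can be sketched as a test case that records its own assumption, so the gap surfaces during review. In this illustration the requirement "repeated failed login attempts will lock out user" never defines "repeated"; the threshold of 3, and the function itself, are invented for the example.

```python
# ASSUMPTION (documented for the requirements review): the spec does not
# define "repeated", so this test assumes 3 consecutive failed attempts.
LOCKOUT_THRESHOLD = 3

def is_locked_out(failed_attempts: int) -> bool:
    """Stand-in for the system under test; not from the presentation."""
    return failed_attempts >= LOCKOUT_THRESHOLD

def test_lockout_after_repeated_failures():
    # If the team intended a different threshold, this test (and its
    # documented assumption) is what prompts that conversation.
    assert not is_locked_out(2)   # below the assumed threshold: still allowed
    assert is_locked_out(3)       # at the assumed threshold: locked out
```

Writing the assumption into the test, rather than leaving it in someone's head, is what turns the test case into a requirements-refinement tool.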

  22. The final, vital step • Review your test cases with the project team. • Discuss the issues, questions, and assumptions that are part of your test cases. • Listen carefully to the responses you receive.

  23. Thank you! Y’all stop by and see me when you’re in Austin, y’hear?

  24. Generic Test Case List (DRAFT)
  Environment & Configuration
  1. Application can be installed
  2. Application can be reinstalled
  3. Application can be uninstalled
  4. Application can be upgraded
  5. Closing the session unlocks all locked records
  6. More than one user may not update the same record at the same time
  7. No unexpected crashes/environment is stable
  8. No unexpected error messages are displayed
  9. Searching for a record does not cause a record lock
  10. Session times out within N minutes
  11. Sessions closed without saving/submitting records do not lock records
  12. Sessions closed without saving/submitting records do not update the data source
  13. User attempting to update a record that is being updated by another user receives an appropriate error message
  14. Valid user can open N sessions of the application
  15. Valid user can open no more than Y sessions of the application
  16. Viewing a record in display mode does not cause a record lock
  17. When a user selects a cancel or back-out option, no updates occur to the record he/she is editing
  Security
  18. Audit trail can be accessed
  19. Audit trail contains sufficient information to determine who changed what information in the application, and when it was done
  20. Invalid login attempts are logged
  21. Invalid user cannot log in
  22. Repeated failed login attempts will lock out user
  23. User without access to function X receives error when attempting function X
  24. User without access to menu option X cannot see menu option X when logged in
  25. Valid user can log in
  Basic Functions
  26. Entry areas accept valid input of the expected data type when cut and pasted
  27. Entry areas accept valid input of the expected data type when entered manually
  28. Entry areas allow all valid values as input
  29. Entry areas appropriately limit the size of inputs
  30. Error message provides sufficient information to both instruct the customer on next steps and assist IT in diagnosing the cause of the error
  31. Error message text matches the error situation
  32. Field labels are accurate (match the fields they control)
  33. Navigation options work consistently
  34. No unexpected data is displayed
  35. Online Help for page matches the page from which it is launched
  36. Online Help launches correctly
  37. Online Help matches the application from which it is launched
  38. Position-to option positions to the first record that matches what was entered

  25. Generic Test Case List (DRAFT), continued
  39. Search results display results that match the passed parameters
  40. Values entered into entry areas are saved appropriately
  41. API accepts input as documented
  42. API provides output as documented
  43. API handles error conditions as documented
  Usability & Standards
  44. “Look and feel” is consistent within the application
  45. “Look and feel” of the application is consistent with other TMI products
  46. Accessibility standards are followed
  47. Application can be navigated using only the keyboard
  48. Application can be navigated using only the mouse
  49. Error message text is spelled correctly
  50. Error message text is understood by the customer
  51. Field labels are legible
  52. Field labels are spelled correctly or abbreviated to generally accepted standards
  53. Field labels are understood by the target audience
  54. Field labels clearly tie to entry areas
  55. Navigation options are clearly labeled
  56. Number of records displayed per page matches the company standard
  57. Online Help has no spelling or grammar errors
  58. Online Help is business-appropriate
  59. Online Help is understood by the customer
  60. Users understand navigation options
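A rough sketch of how two entries from the list above ("Session times out within N minutes" and "Entry areas appropriately limit the size of inputs") might become automated checks. The Session class, the 15-minute value of N, and the 50-character field limit are all invented stand-ins for illustration; they are not from the presentation.

```python
# Stand-in for the system under test: a session that expires after N
# idle minutes. N = 15 is an assumed value, not from the source list.
SESSION_TIMEOUT_MINUTES = 15

class Session:
    def __init__(self):
        self.idle_minutes = 0

    def tick(self, minutes: int) -> None:
        """Simulate idle time passing."""
        self.idle_minutes += minutes

    @property
    def expired(self) -> bool:
        return self.idle_minutes >= SESSION_TIMEOUT_MINUTES

def test_session_times_out_within_n_minutes():  # list item 10
    s = Session()
    s.tick(SESSION_TIMEOUT_MINUTES)
    assert s.expired

# Stand-in entry area with an assumed 50-character limit (list item 29).
MAX_NAME_LENGTH = 50

def accept_name(value: str) -> str:
    """Rejects oversized input, as the entry area should."""
    if len(value) > MAX_NAME_LENGTH:
        raise ValueError("input too long")
    return value

def test_entry_area_limits_input_size():  # list item 29
    accept_name("x" * MAX_NAME_LENGTH)  # boundary value is accepted
    try:
        accept_name("x" * (MAX_NAME_LENGTH + 1))
        assert False, "oversized input should be rejected"
    except ValueError:
        pass  # one past the boundary is rejected, as required
```

Note that both tests force a concrete value for the placeholders (N, the field size) that the generic list leaves open; pinning those values down, or flagging them as assumptions, is exactly the refinement conversation the presentation advocates.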
