Test Automation on Large Agile Projects: It's Not a Cakewalk

  1. AW12 Concurrent Session, 11/7/2012, 3:45 PM. "Test Automation on Large Agile Projects: It's Not a Cakewalk." Presented by: Scott Schnier, Agilex Technologies. Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073 ∙ 888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com

  2. Scott Schnier, Agilex Technologies. Scott Schnier is currently a senior agile practice manager at Agilex Technologies, working with other courageous and passionate people to bring agile development practices to the Federal Government. He has held positions of software engineer, director of development, mentor, architect, director of quality assurance, project manager, program manager, agile coach, ScrumMaster, and proxy product owner. Scott's varied work experience has taken him through small startups, Fortune 500 firms, and government contracting. A founding member of the Cincinnati chapter of ALN, Scott is currently active in the Washington, DC chapter. Scott takes special pleasure in, and has a passion for, helping people work better together.

  3. Test Automation on Large Agile Projects: It's Not a Cakewalk. Scott Schnier, Agilex Technologies, Scott.Schnier@Agilex.com. The Story: Test Automation • growth and division of work and responsibility • support of Agile values • lessons learned and victories. There are no "best practices."

  4. It's not a cakewalk. The Setting: • Core group of Agile/Scrum practitioners • Many staff new to Agile/Scrum • Customer willing to try something new, frustrated with past failures • Government contract, multiple vendors • Driven by legislation.

  5. Geography [figure]. Scrum Team Evolution [figure: scrum teams numbered 1 through 13 plus teams labeled D, R, and S]: after 2 ½ years.

  6. Test Automation: Why? Keep development on track.

  7. Capture knowledge. Automate repetitive work.

  8. "Working software over comprehensive documentation": how do you know it still works today? Why Automate Tests? • Move discovery of defects to the left (earlier) • Respond to emergent requirements • Capture intellectual property (test skill) • Enable test engineers to focus on creative work, not repetitive testing tasks • Make specifications executable and a trusted way to understand the impact of change.

  9. Test Vision: • The scrum team is the value creation engine • Tests are best created in the scrum team • Test is a skill, not a role • Need to support test while making the scrum team primary. Issues on the Journey: • Test debt accumulation • Specialized testing tools contribute to segmentation of responsibility • People who do functional testing straddle more than one team.

  10. Managing Test Debt: • Organization • Definition of "Done." How do we get test debt? "How many points is that story?" "Oh… 8 points, plus testing."

  11. Test Debt Avoidance: • The size of a story is more accurate with an integrated discussion covering all of the test and product work. Regression test debt: • The story is complete, but 30% of the regression tests are broken.

  12. Definition of "Done": • To be complete, tests must be done and running green on the continuous integration pipeline. • If we make an exception, then "test fixing" stories should be estimated and placed in the backlog so product owners can agree to the exception. Testing work straddling teams.

  13. Traditional SDLC workstreams [figure]. A more agile organization [figure].

  14. Where is testing done? • In the scrum team • Recall the definition of done • What happens to the regression tests that accumulate? Organization for Test Management [diagram of options]:
      • Partitioned, with rotating triage/leadership and an end-of-sprint handoff
      • Completely partitioned, with autonomous scrum teams
      • Scrum teams plus a System Test Integration (STI) team and a dedicated maintenance team.

  15. Lessons: • Organize test sets to support scrum team affinity • With more than ~5 scrum teams, an integration or "DevOps" team is necessary and good • Listen for and stamp out opportunities for debt to accumulate • A Test Community of Practice is valuable and takes work to be effective. Key challenge: • Design organization and responsibility so that test debt does not accumulate.

  16. Shared responsibility.
      Functional scrum team: ◦ Develops tests ◦ Executes acceptance tests ◦ Promotes tests to regression ◦ Monitors selective regression projects ◦ Performs impact analysis of new functions/fixes.
      System Test Integration team: ◦ Maintains test framework/standards ◦ Executes regression tests (all) ◦ Creates and maintains reusable test components ◦ Helps functional scrum teams with massive regression breakages.
      Development workflow:
      1. Update personal workspace
      2. Implement change (story/task, defect, test)
      3. Test locally: unit, integration, smoke
      4. Update personal workspace
      5. Resolve conflicts
      6. Commit change to repository and test pipeline
      7. Monitor key test projects
      8. Revert or fix any problems

  17. A slice of life: Skype chat snippet.
      [7/2/2012 12:23:28 PM] Dan: Trunk-Dev is broken
      [7/2/2012 12:23:45 PM] Dan: anyone working for a fix?
      [7/2/2012 12:24:11 PM] Dan: …. ERROR: …. ….
      [7/2/2012 12:27:39 PM] Steve: I'm working with Mike on resolving it
      [7/2/2012 12:30:10 PM] Dan: Thanks
      Continuous Integration status [figure].

  18. Continuous Integration:
      • Release build: integration and unit tests (1000s) (commit package)
      • Rapid smoke test and Regression Lite (20 min) (commit package)
      • Smoke test on each functional test server (>= daily)
      • Regression Lite on other browsers (>= daily)
      • Regression Heavy (2 hours) (>= daily)
      • All other automated regression tests, 1000s (daily)
      • Semi-automated (100s) and manual tests (10s) (at least each delivery)
      • Ad-hoc testing, 10s of hours (each delivery) ◦ human intuition, UI (CSS and other risks)
      (A sketch of how such test tiers might be tagged follows this item.)
      Ongoing challenge: As the system gets larger, individuals are less likely to feel responsible for, or capable of, fixing broken tests. Start with one, two, then three functional scrum teams. Months later a team forms that specializes on a particular component of the system. A few more months and another component team arises…. It becomes harder to maintain the social norms of "stop and fix." The technical challenges also increase as system complexity grows. The need for integration and the risk of integration problems also grow at the same time.
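      A minimal sketch of how test tiers such as Smoke, Regression Lite, and Regression Heavy might be tagged and selected per CI job. The deck does not show how the tiers are implemented, so plain JUnit 4 categories are assumed here, and the class, test, and tier names are illustrative.

        // Tier tagging sketch, assuming JUnit 4 categories; names are illustrative.
        import org.junit.Test;
        import org.junit.experimental.categories.Categories;
        import org.junit.experimental.categories.Categories.IncludeCategory;
        import org.junit.experimental.categories.Category;
        import org.junit.runner.RunWith;
        import org.junit.runners.Suite.SuiteClasses;

        // Marker interfaces act as category labels for the CI tiers.
        interface Smoke {}
        interface RegressionLite {}
        interface RegressionHeavy {}

        class OrderSearchTest {
            @Category(Smoke.class)
            @Test
            public void searchPageLoads() { /* fast sanity check */ }

            @Category(RegressionLite.class)
            @Test
            public void searchReturnsKnownOrder() { /* runs in seconds */ }

            @Category(RegressionHeavy.class)
            @Test
            public void searchAcrossLargeDataSet() { /* runs in minutes */ }
        }

        // The 20-minute "Regression Lite" CI job would run only this suite.
        @RunWith(Categories.class)
        @IncludeCategory(RegressionLite.class)
        @SuiteClasses({OrderSearchTest.class})
        public class RegressionLiteSuite {}

      Each CI job then points at the suite for its tier, so the 20-minute Regression Lite budget stays independent of the heavier tiers.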

  19. Challenges of Scale: • People are socially more distant • Technical skills become more focused • Accountability becomes more elusive • One mistake can impact more people, which makes actions more conservative and slows velocity. Wisdom: • When the number of teams exceeds 7 +/- 2, you need a system-level team focused on test assets/regression • A Test Community of Practice is essential • Organize test sets with an affinity for teams or system components • As the program gets larger, it needs a team with gentle authority to ensure consistency • Ultimately, with > 10 teams, you will need to consider multilevel integration • Performance/stress testing is done by a separate team.

  20. Managing Tool Specialists: • Common tool platform for test developers and product developers • Increase the pool of people who can create or fix a test. Why a "new" framework? Conejo Test Framework motivators: ◦ Eliminate barriers to "everyone is a tester" ◦ Enable data driving for more resilient tests (see the sketch after this item) ◦ Integrate multi-modal testing (Web UI, component, web services, manual) into one coherent framework ◦ Support the workflow from acceptance test development to regression testing to obsolescence ◦ Integrate all test assets.
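      A minimal sketch of what "data driving" a test can look like. The Conejo framework's own data-driving mechanism is not shown in the deck, so a plain JUnit 4 Parameterized test is assumed, and the scenario data and the searchCatalog helper are hypothetical stand-ins for calls into the application under test.

        // Data-driven test sketch, assuming JUnit 4 Parameterized; data and helper are hypothetical.
        import static org.junit.Assert.assertTrue;

        import java.util.Arrays;
        import java.util.Collection;

        import org.junit.Test;
        import org.junit.runner.RunWith;
        import org.junit.runners.Parameterized;
        import org.junit.runners.Parameterized.Parameters;

        @RunWith(Parameterized.class)
        public class BuyerSearchDataDrivenTest {

            // Each row is one scenario; adding a scenario means adding data, not code,
            // which keeps the test resilient as requirements emerge.
            @Parameters(name = "{index}: search {0} expects at least {1} results")
            public static Collection<Object[]> scenarios() {
                return Arrays.asList(new Object[][] {
                    {"widgets", 1},
                    {"gadgets", 1},
                    {"no-such-item", 0},
                });
            }

            private final String query;
            private final int minResults;

            public BuyerSearchDataDrivenTest(String query, int minResults) {
                this.query = query;
                this.minResults = minResults;
            }

            @Test
            public void searchReturnsExpectedMinimum() {
                int results = searchCatalog(query);
                assertTrue(results >= minResults);
            }

            // Stubbed so the sketch compiles; a real test would drive the Web UI or a service.
            private int searchCatalog(String query) {
                return "no-such-item".equals(query) ? 0 : 1;
            }
        }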

  21. Test Design Goals: • Minimize collateral code; focus on the test target • Enable tests to respond quickly to changes in the Application Under Test • Easy to understand when it breaks (reuse common patterns, canonical test classes). (A sketch of a canonical, layered test follows this item.)
      Test types:
                                      Unit               Integration        Functional
      Scope                           Class              Component(s)       System
      Persistence                     No                 Maybe              Yes
      Author                          Self               Anyone             Not the author
      Tests system interface          No                 No                 Yes
      Traceable to epic/story/defect  No                 Maybe              Yes
      Execution                       Pre-release build  Pre-release build  Post-release build*
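      A minimal sketch of a "canonical" test that keeps collateral code out of the test method by pushing UI plumbing into a small interface class, so a change to the Application Under Test is absorbed in one place. Selenium WebDriver and JUnit 4 are assumed, and the page, URL, and locators are illustrative rather than taken from the deck.

        // Canonical test sketch, assuming Selenium WebDriver and JUnit 4; page, URL, and locators are illustrative.
        import static org.junit.Assert.assertEquals;

        import org.junit.After;
        import org.junit.Before;
        import org.junit.Test;
        import org.openqa.selenium.By;
        import org.openqa.selenium.WebDriver;
        import org.openqa.selenium.firefox.FirefoxDriver;

        public class OrderStatusTest {

            private WebDriver driver;

            @Before
            public void openBrowser() {
                // Assumes a local Firefox/geckodriver; a shared framework would normally supply the driver.
                driver = new FirefoxDriver();
                driver.get("http://aut.example.local/orders"); // hypothetical address of the application under test
            }

            @After
            public void closeBrowser() {
                driver.quit();
            }

            @Test
            public void shippedOrderShowsShippedStatus() {
                // The test reads as intent; the locators live in the interface class below.
                String status = new OrderPage(driver).searchFor("12345").statusText();
                assertEquals("Shipped", status);
            }

            /** Interface class: when the UI changes, only these locators change, not the tests. */
            static class OrderPage {
                private final WebDriver driver;

                OrderPage(WebDriver driver) { this.driver = driver; }

                OrderPage searchFor(String orderNumber) {
                    driver.findElement(By.id("orderNumber")).sendKeys(orderNumber);
                    driver.findElement(By.id("searchButton")).click();
                    return this;
                }

                String statusText() {
                    return driver.findElement(By.id("orderStatus")).getText();
                }
            }
        }

      Reusing a small set of interface classes like this is one way to make a broken test easy to read and to localize the fix when the UI changes.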

  22. Test Architecture [layered diagram, top to bottom]: Tests; Utility Classes; Interface Classes; Se Test Framework; JUnit; Application Under Test.
      Test Execution:

        @Test
        @TestHeaderInfo(description = "Test Buyer Sequence", …
            functionalArea = {"Buyer", "Order"})
        public void testBuyerSequence() {
            // Test code goes here....
        }

        @Test
        @TestHeaderInfo(description = "Test Seller Sequence", …
            functionalArea = {"Seller"})
        public void testSellerSequence() {
            // Test code goes here....
        }

        mvn test -Dconejo.filter.functionalAreas="Seller"
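      A hedged sketch of how a functional-area filter like the mvn property above could be honored at run time. The real Conejo implementation is not shown in the deck, so the base class, the annotation definition, and the property handling below are assumptions built from standard JUnit 4 pieces (the TestName rule and Assume).

        // Filter sketch, assuming JUnit 4; the annotation here approximates the one on the slide.
        import java.lang.annotation.ElementType;
        import java.lang.annotation.Retention;
        import java.lang.annotation.RetentionPolicy;
        import java.lang.annotation.Target;
        import java.util.Arrays;
        import java.util.Collections;
        import java.util.HashSet;
        import java.util.Set;

        import org.junit.Assume;
        import org.junit.Before;
        import org.junit.Rule;
        import org.junit.rules.TestName;

        @Retention(RetentionPolicy.RUNTIME)
        @Target(ElementType.METHOD)
        @interface TestHeaderInfo {
            String description();
            String[] functionalArea() default {};
        }

        public abstract class ConejoTestBase {

            @Rule
            public TestName testName = new TestName();

            @Before
            public void applyFunctionalAreaFilter() throws Exception {
                String wanted = System.getProperty("conejo.filter.functionalAreas");
                if (wanted == null || wanted.isEmpty()) {
                    return; // no filter requested, run every test
                }
                Set<String> wantedAreas = new HashSet<>(Arrays.asList(wanted.split(",")));
                TestHeaderInfo info = getClass()
                        .getMethod(testName.getMethodName())
                        .getAnnotation(TestHeaderInfo.class);
                boolean matches = info != null
                        && !Collections.disjoint(wantedAreas, Arrays.asList(info.functionalArea()));
                // A failed assumption marks non-matching tests as skipped rather than failed.
                Assume.assumeTrue(matches);
            }
        }

      With a base class along these lines, tests like those on the slide would extend it, and the mvn command above would skip everything outside the Seller area.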

  23. Product and "Anti-Product" [diagram: Product, DBA, Operations, Architecture, Test]. It's not a cakewalk.
