

  1. Winning the Battle against Automated Testing Elena Laskavaia March 2016

  2. Quality. Foundation of Quality: People, Process, Tools

  3. Development vs Testing:
  • Developers don’t test
  • Testers don’t develop
  • Testers don’t have to be skilled
  • Separate Developers and Testers
  • Make the Test team responsible for quality

  4. One Team: Quality is a team responsibility

  5. The Process: when quality is bad, let's add more steps to the process

  6. Story about the broken Trunk:
  • Thousands of developers
  • Continuous stability is a must
  • “Trunk is broken” too often
  • Huge show stopper for R&D
  • People did root-cause analysis
  • Came up with an Improved Process

  7. “Improved” Pre-Commit Process:
  • Pull/Update all source
  • Clean compile all
  • Re-build and re-deploy the whole system
  • Manually execute sanity test cases
  • Repeat for all hardware variants

  8. Trunk is still broken. Why?
  • Process is too complex
  • Process is too boring
  • Process is too time consuming
  • Developers are lazy
  • Developers don’t know about the process
  • Process was not followed
  • Environment / Hardware limitations

  9. Automated Pre-Commit Testing

  10. Pre-Commit Tests with the Source Management System: Push → Checks → Fix → master
  • Peer reviews
  • Robot checks

  11. Automation Hack: “Let's slap on some automation!”
  • Randomly pick a tool
  • Spend 6 months developing a testing framework
  • Need a person to run it for every build
  • Oops, our tester quit; who knows how to run it?
  • It does not work at all now!
  • Oh well, we don’t have any more budget or time, let's go back to manual testing

  12. Continuous Testing: Continuous Quality

  13. Cost of Automation:
  • Cost of tools
  • User training
  • Cost of integration and customization
  • Writing test cases
  • Executing test cases
  • Maintaining test cases

  14. Jump Start:
  • Make one team responsible
  • Set up continuous integration
  • Add pre-commit hooks
  • Establish a simple self-verifying process
  • Add one automated test

  15. Key Principles of successful automated testing

  16. Gate Keeper: the test system must guard the gate

  17. 100% Success: 100% of tests must pass. Zero tolerance.

  18. No Random Tests:
  • Remove such tests from automation
  • Use repeaters to keep intermittent tests (see slide 40)
  • Be prepared for the noise
  • Modify the AUT to remove sources of randomness for tests

  19. Fully Automated:
  • No monkeys pushing buttons to start the testing
  • No monkeys watching automated UI testing
  • Hooks on code submission (pre-commit, fast)
  • Hooks on build promotion (overnight)

  20. Fast and Furious:
  • Feedback for pre-commit <= 10 min
  • Overnight is the absolute maximum
  • More tests degrade the system response time
  • Not all tests are born equal! Use tagging and filtering
  • Distribute or run in parallel
  • No sleeps (see the polling sketch below)
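
  A minimal sketch of the “no sleeps” rule: instead of a fixed Thread.sleep, poll for the condition the test is actually waiting for, with a hard deadline. The helper name waitUntil, the 50 ms poll interval, and the server.isUp() call in the usage comment are illustrative assumptions, not from the deck:

    import java.util.function.BooleanSupplier;

    public final class Wait {
        // Poll until the condition holds or the deadline passes.
        public static void waitUntil(BooleanSupplier condition, long timeoutMs)
                throws InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!condition.getAsBoolean()) {
                if (System.currentTimeMillis() > deadline) {
                    throw new AssertionError("condition not met within " + timeoutMs + " ms");
                }
                Thread.sleep(50); // short bounded poll, not an arbitrary fixed sleep
            }
        }
    }

    // usage in a test (server is hypothetical): Wait.waitUntil(() -> server.isUp(), 5000);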

  21. Timeouts:
  • Make sure tests are not hanging!
  • Use timeouts (see the sketch below)
  • Use external monitors to kill hanging runs
  • Do not overestimate timeouts
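
  One way to enforce this, assuming JUnit 4 (which the rest of the deck uses): a per-test timeout declared on @Test. Slide 33 shows the rule-based variant that covers a whole class.

    import org.junit.Test;

    public class TimeoutSketchTest {
        // JUnit fails this test if it runs longer than 5 seconds
        @Test(timeout = 5000)
        public void finishesQuickly() {
            long sum = 0; // stand-in for real work
            for (int i = 0; i < 1000; i++) {
                sum += i;
            }
        }
    }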

  22. Test Scripts are Programs:
  • Automated test cases are programs
  • Treat them as source code
  • They must be in text form
  • They must go into the same version control system
  • Subject to code inspection, coding standards, build checks, etc.

  23. Unit Tests: part of the process
  • Mandatory with commit
  • Use servers to run
  Easy to write:
  • Use a mocking framework (see the Mockito sketch below)
  • Use a UI bot
  • Use test generators
  • Inline data sources
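
  Slide 30 names Mockito as the mocking framework; a minimal sketch of the idea (the test class and the mocked List are illustrative, not from the deck):

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import java.util.List;
    import org.junit.Test;

    public class MockingSketchTest {
        @Test
        public void stubsAndVerifiesACollaborator() {
            // mock a collaborator instead of constructing the real thing
            @SuppressWarnings("unchecked")
            List<String> mockedList = mock(List.class);

            // stub: define what the mock returns
            when(mockedList.get(0)).thenReturn("first");
            assertEquals("first", mockedList.get(0));

            // verify the interaction actually happened
            verify(mockedList).get(0);
        }
    }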

  24. Unit vs Integration:
  • Unit tests cannot cover all
  • Test the actual installed AUT
  • Run the program as a user would
  • Use the same language for unit and integration testing

  25. Pick and Choose: you should not automate everything. Candidates:
  • Difficult-to-set-up cases
  • Rare but important scenarios
  • Check lists
  • Module is actively developed
  • Long maintenance expected

  26. Self-Verification: test the test system? Automatically check:
  • Code submission is properly formatted (has bug id, etc.; see the sketch below)
  • Code submission has unit tests
  • Total number of tests has increased
  • Performance has not declined
  • Code coverage has not declined
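
  A hedged sketch of one such robot check, validating that a commit message carries a bug id; the "Bug NNNN" convention and the class name are assumptions for illustration:

    import java.util.regex.Pattern;

    public final class CommitMessageCheck {
        // assumed convention: commit messages reference "Bug 123456"
        private static final Pattern BUG_ID = Pattern.compile("Bug\\s+\\d+");

        public static boolean hasBugId(String commitMessage) {
            return BUG_ID.matcher(commitMessage).find();
        }
    }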

  27. Failed Battles

  28. Tools we used or evaluated that failed:
  • WinRunner: after 3 months of writing tests, realized that it won’t work on Linux
  • WindowTester: was pretty good until it was bought and it stopped launching with new Eclipse
  • Jubula (4 years ago): database-based, no text form for tests, no integration
  • Squish: slow, not debuggable, blocks on Python
  • RCPTT: domain-specific language support

  29. Working Solution

  30. Working Solution:
  • Unit testing: JUnit
  • Source Control and Code Review: Git/Gerrit
  • Static Analysis Tools: FindBugs
  • Build System: Maven/Tycho
    • maven-surefire-plugin (for unit tests)
    • maven-failsafe-plugin (for integration tests)
    • findbugs-plugin for static analysis
  • Continuous Integration Server: Jenkins
    • Gerrit Trigger plugin: pre-commit builds and voting
    • FindBugs plugin: reports and status
    • JUnit plugin: reports and status
  • GUI testing: SWTBot
  • JUnit mocking: Mockito
  • Code Coverage: EclEmma
  • Custom: lots of custom libraries, frameworks and bots
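
  Since SWTBot is the deck's GUI-testing tool, a minimal sketch of how an SWTBot test drives the Eclipse workbench; the menu path and shell title are assumptions for illustration:

    import org.eclipse.swtbot.eclipse.finder.SWTWorkbenchBot;
    import org.eclipse.swtbot.swt.finder.junit.SWTBotJunit4ClassRunner;
    import org.junit.Test;
    import org.junit.runner.RunWith;

    @RunWith(SWTBotJunit4ClassRunner.class)
    public class GuiSmokeTest {
        @Test
        public void opensTheNewProjectWizard() {
            SWTWorkbenchBot bot = new SWTWorkbenchBot();
            // drive the UI through widget finders, the way a user would
            bot.menu("File").menu("New").menu("Project...").click();
            bot.shell("New Project").activate();
            bot.button("Cancel").click();
        }
    }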

  31. Tips and Tricks

  32. Auto-Bots: checks that can be added to every test
  • App crashed during a test
  • Test timeout exceeded
  • App generated unexpected log
  • Temp files were not cleaned up
  • Resource or memory leaks
  • Runtime error detection

  33. AutoBots: JUnit Rules

  public class SomeTest {
      // checks that we don’t leave a tmp file behind
      // (this is a custom rule, not base JUnit)
      @Rule
      public TmpDirectoryCheck tmpRule = new TmpDirectoryCheck();

      @Test
      public void testSomething() {
      }
  }

  // base class with timeouts
  public abstract class TestBase {
      public @Rule Timeout globalTimeout = Timeout.seconds(1); // 1 second
  }
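
  The deck does not show TmpDirectoryCheck itself; a hedged sketch of what such a rule could look like, comparing the temp directory before and after the test (the details are assumptions):

    import java.io.File;
    import java.util.HashSet;
    import java.util.Set;
    import org.junit.rules.TestRule;
    import org.junit.runner.Description;
    import org.junit.runners.model.Statement;

    public class TmpDirectoryCheck implements TestRule {
        @Override
        public Statement apply(final Statement base, Description description) {
            return new Statement() {
                @Override
                public void evaluate() throws Throwable {
                    Set<String> before = listTmpFiles();
                    base.evaluate(); // run the actual test
                    Set<String> leftovers = listTmpFiles();
                    leftovers.removeAll(before);
                    if (!leftovers.isEmpty()) {
                        throw new AssertionError("Test left temp files behind: " + leftovers);
                    }
                }
            };
        }

        private static Set<String> listTmpFiles() {
            String[] names = new File(System.getProperty("java.io.tmpdir")).list();
            Set<String> result = new HashSet<String>();
            if (names != null) {
                for (String name : names) {
                    result.add(name);
                }
            }
            return result;
        }
    }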

  34. Jenkins Split Verifiers: to speed up verification for pre-commit hooks, set up multiple jobs which trigger on the same event (i.e. a patch is submitted):
  • Regression testing Linux
  • Regression testing Windows
  • Static Analysis
  Each job votes verify +1 on the patch.

  35. Inline Data Sources: Comments in Java (the commented-out C++ snippet above the test method is the input data the test harness parses)

  // template<typename T2>
  // struct B : public ns::A<T2> {};
  // void test() {
  //     B<int>::a;
  // }
  public void testInstanceInheritance_258745() {
      getBindingFromFirstIdentifier("a", ICPPField.class);
  }

  36. Code Coverage:
  • Run tests with code coverage, but not during the pre-commit check (unless it has validation hooks)
  • Good tool for unit test design (IDE)
  • Never ask for 100% coverage
  • Code Coverage -> Select Tests: based on the changed code, exclude tests that do not cover the changes

  37. Static Analysis:
  • Can be run independently
  • Has to be a gatekeeper
  • Spend time to tune it (remove all noisy checkers)
  • Push it to the desktop (if running as you type: instantaneous feedback!)
  • Use alternative UX on the desktop (i.e. code formatter)
  Jenkins Plugin: Code Reviewer posts defects from static analysis as reviewer comments on the patch.

  38. Tagging and Filtering: JUnit Categories

  // tag the test class with categories
  @Category({ PrecommitRegression.class, FastTests.class })
  public class SomeClassTest {
      @Test
      public void someTest() {
      }
  }

  // in the Maven pom.xml
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <groups>com.example.PrecommitRegression</groups>
        </configuration>
      </plugin>
    </plugins>
  </build>
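
  The category types referenced above are plain marker interfaces; a minimal sketch, assuming the com.example package from the pom snippet (one file per interface in practice):

    // PrecommitRegression.java
    package com.example;

    public interface PrecommitRegression {
    }

    // FastTests.java
    package com.example;

    public interface FastTests {
    }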

  39. Runtime Filtering: JUnit Assume

  // skip the test entirely if not running in OSGi
  @Before
  public void setUp() {
      Assume.assumeTrue(Activator.isOsgiRunning());
  }

  40. Intermittent Test: JUnit Rule. You can create a runner or define a rule which repeats a test if it fails. JUnit itself does not define either; you have to add it yourself (2 classes, 62 lines of code).

  public class SomeClassTest {
      public @Rule IntermittentRule irule = new IntermittentRule();

      // repeat this up to 3 times if it is failing
      @Intermittent(repetition = 3)
      @Test
      public void someTest() {}
  }
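
  The two helper classes are not shown in the deck; a hedged sketch of what they could look like (the names follow the usage above, the details are assumptions):

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import org.junit.rules.TestRule;
    import org.junit.runner.Description;
    import org.junit.runners.model.Statement;

    // class 1: the annotation carrying the repetition count
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Intermittent {
        int repetition() default 2;
    }

    // class 2: the rule that retries annotated tests
    public class IntermittentRule implements TestRule {
        @Override
        public Statement apply(final Statement base, final Description description) {
            final Intermittent spec = description.getAnnotation(Intermittent.class);
            if (spec == null) {
                return base; // not annotated: run once, as usual
            }
            return new Statement() {
                @Override
                public void evaluate() throws Throwable {
                    Throwable lastFailure = null;
                    for (int attempt = 0; attempt < spec.repetition(); attempt++) {
                        try {
                            base.evaluate();
                            return; // passed on this attempt
                        } catch (Throwable t) {
                            lastFailure = t; // remember and retry
                        }
                    }
                    throw lastFailure; // all attempts failed
                }
            };
        }
    }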

  41. The End: One Team. Simple Process. Right Tools.
