  1. Automated Testing Laboratory for Embedded Linux Distributions . Paweł Wieczorek October 11, 2016 Samsung R&D Institute Poland

  2. Agenda . 1. Introduction 2. Motivation 3. Automation opportunities with our solutions 4. Future plans 5. Conclusion

  3. Introduction .

  4. Automated Testing Laboratory .

  5. Actual Automated Testing Laboratory .

  6. Automated Testing Laboratory – MinnowBoard Turbot .

  7. Automated Testing Laboratory – Odroid U3+ .

  8. Automated Testing Laboratory – HiKey .

  9. Automated Testing Laboratory – Supporting hardware .

  10. Automated Testing Laboratory – SD MUX .

  11. SD MUX .

  12. Motivation .

  13. Change life cycle .

  14. Change acceptance .

  15. Release engineering .

  16. Primary tools . Open Build Service, Jenkins

  17. Release Engineer role . 1. Release engineer investigates build failures (if any) 2. Release engineer checks whether new images introduce any regressions 3. Release engineer approves inclusion of verified changes to the main repository

  18. Release Engineer headache . • Complete image testing on multiple devices takes much time: t_total = t_download + n_targets × (t_flash + t_test) • Monotonous – involves repeating the same set of actions • Requires focus – processing similar results calls for an observant person
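
A quick back-of-the-envelope illustration of that formula (the timings below are hypothetical, not figures from the talk):

```python
# Illustrative only: hypothetical timings, not measurements from the presentation.
t_download = 10          # minutes to fetch a new image from OBS
t_flash    = 15          # minutes to write the image to a target's SD card
t_test     = 30          # minutes to run the test suite on one device
n_targets  = 3           # e.g. MinnowBoard Turbot, Odroid U3+, HiKey

t_total = t_download + n_targets * (t_flash + t_test)
print(f"Manual verification of one image: {t_total} minutes")  # 145 minutes
```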

  19. Release Engineer dilemma . 1. Can we test images less frequently? 2. Can we run fewer tests on new images? 3. Can we assume that successfully built packages work properly?

  20. Release Engineer credo . 1. Resolve an issue as soon as it is discovered 2. Look for a solution, not just a workaround 3. Don't release software that was never run on an actual device

  21. Room for improvement – AUTOMATE . • Complete image testing on multiple devices takes much time: t_total = t_download + n_targets × (t_flash + t_test) • Monotonous – involves repeating the same set of actions • Requires focus – processing similar results calls for an observant person

  22. Automation opportunities with our solutions .

  23. Automation tasks categories . • Software • Infrastructure – internal, external • Hardware


  28. Automation tasks examples . • Software – polling OBS for new images, getting new images from OBS • Internal infrastructure – controlling hosts and targets • External infrastructure – publishing test results • Hardware – flashing target devices with new images

  29. Software – polling OBS and getting new images . • OBS lacks event mechanism • Human-readable naming conventions require parsing • New image discovery is run on multiple levels • Scheduling tasks Jenkins • Queueing tasks 21/42

  30. Internal infrastructure – reliable communication with devices . • OpenSSH – depends on other services, requires network connection • Serial console – lower rate of data transfer, less flexible than alternatives • Default choice: SDB (Smart Development Bridge)
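
For example, driving a target over SDB can be as simple as shelling out to the sdb command-line tool and its adb-style subcommands (devices, push, shell); the serial number and script name below are made up:

```python
# Minimal sketch of controlling a target over SDB by wrapping the sdb CLI.
# The device serial and the test script name are hypothetical examples.
import subprocess

def sdb(*args, serial=None, timeout=60):
    cmd = ["sdb"]
    if serial:
        cmd += ["-s", serial]          # address one specific target
    cmd += list(args)
    return subprocess.run(cmd, capture_output=True, text=True,
                          timeout=timeout, check=True).stdout

if __name__ == "__main__":
    print(sdb("devices"))                                        # list attached targets
    sdb("push", "smoke-test.sh", "/tmp/", serial="0000d855")     # copy a test script
    print(sdb("shell", "sh /tmp/smoke-test.sh", serial="0000d855"))
```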

  31. Internal infrastructure – configuration management . • Testlab-handbook on its own is not enough • All changes in configuration are tracked in Testlab-host • Improved deployments • No more snowflakes!

  32. External infrastructure – results publishing . • Publishing test results • Sharing test environment information • Providing data for future reuse • Requirements: easily available, with possibility for future reuse, preferably using existing services • Solution: MediaWiki edited by Pywikibot
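
A minimal Pywikibot sketch of such publishing, assuming Pywikibot is already configured (user-config.py) for the target wiki; the page title and wikitext layout are hypothetical:

```python
# Sketch: publish test results to a MediaWiki page via Pywikibot.
# Page title and wikitext structure below are hypothetical examples.
import pywikibot

def publish_results(image_name, results_table):
    site = pywikibot.Site()                       # wiki taken from user-config.py
    page = pywikibot.Page(site, f"Testlab/Results/{image_name}")
    page.text = ("== Automated test results ==\n"
                 f"Image: {image_name}\n\n"
                 f"{results_table}\n")
    page.save(summary="Automated test results update")
```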

  33. Hardware – flashing target devices with new images . • Current interface focused on user interaction • Designed for single target device per host • Architecture-specific procedure

  34. Hardware – SD MUX . (Diagram: power switch, board control, memory card, target SDB/card connection, host card connection, host SDB/card access)

  41. Controlling SD MUX .
  $ sdmuxctrl --help
  Usage: sdmuxctrl command
    -l, --list
    -i, --info
    -o, --show-serial
    -r, --set-serial=STRING
    -t, --init
    -u, --status
    (...)
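
A thin Python wrapper around this CLI is enough to plug it into the automation; the sketch below uses only the options visible in the help output above and leaves the options hidden behind "(...)" untouched:

```python
# Thin wrapper over the sdmuxctrl CLI, using only the options shown in the
# help output above. Selecting a specific SD MUX and switching the card
# between host and target rely on options elided behind "(...)", so they
# are not reproduced here.
import subprocess

def sdmuxctrl(*args):
    return subprocess.run(["sdmuxctrl", *args],
                          capture_output=True, text=True, check=True).stdout

print(sdmuxctrl("--list"))    # enumerate connected SD MUX boards
print(sdmuxctrl("--status"))  # report current multiplexer state (assumption based on the option name)
```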

  42. Former workflow . Requires release engineer's interaction

  43. SD MUX workflow . Fully automated process
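
A sketch of what that fully automated cycle could look like; switch_card_to_host(), switch_card_to_target() and power_cycle() are hypothetical stand-ins for the actual SD MUX control calls, and the device path and wait time are examples only:

```python
# Sketch of the automated flash-and-test cycle enabled by SD MUX.
# The three stubs below stand in for real sdmuxctrl / power-switch calls.
import subprocess
import time

def switch_card_to_host():
    pass  # hypothetical: expose the SD card to the host via SD MUX

def switch_card_to_target():
    pass  # hypothetical: hand the SD card back to the target board

def power_cycle():
    pass  # hypothetical: toggle the lab power switch for the target

def run_tests():
    return True  # e.g. push and run a test script over SDB, as sketched earlier

def flash_and_test(image_path, card_dev="/dev/sdX"):
    switch_card_to_host()
    subprocess.run(["dd", f"if={image_path}", f"of={card_dev}",
                    "bs=4M", "conv=fsync"], check=True)   # write the new image
    switch_card_to_target()
    power_cycle()                                         # boot from the new image
    time.sleep(60)                                        # crude boot wait; a real lab would poll
    return run_tests()
```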

  44. SD MUX – schematics .

  45. SD MUX – open-source . https://git.tizen.org/cgit/tools/testlab/sd-mux

  46. Future plans .

  47. What is next? . • Pre-test cases development • More detailed monitoring of differences between tested images • Improved failure management • Improved resource management • System distribution

  48. Conclusion .

  49. Summary . 1. No need to reinvent the wheel in modern automation 2. Custom hardware can simplify tasks 3. Automation pays off in the long term

  50. Questions?

  51. Thank you! Paweł Wieczorek p.wieczorek2@samsung.com Samsung R&D Institute Poland

  52. Further reading • https://wiki.tizen.org/wiki/Laboratory • https://wiki.tizen.org/wiki/SD_MUX • https://git.tizen.org/cgit/tools/testlab

  53. Pictures used • https://wiki.tizen.org/w/images/9/95/Testlab.JPG • http://openbuildservice.org/images/obs-logo.png • https://wiki.jenkins-ci.org/download/attachments/2916393/logo.png • https://wiki.tizen.org/w/images/5/57/Tizen_Build_Process.gif • https://by-example.org/wp-content/uploads/2015/08/openssh-logo.png • https://pixabay.com/en/terminal-console-shell-cmd-dos-153150/ • https://pixabay.com/en/gears-options-settings-silhouette-467261/ • https://commons.wikimedia.org/wiki/File:Notification-icon-MediaWiki-logo.svg
