AUTOMATE GOOD TESTS
Pattern summary
Automate only the tests that bring the most Return on Investment (ROI).
Category
Design
Context
This pattern is useful whenever you have to decide what to automate, so apply it not only when starting test automation from scratch, but also every time you want to extend your automation.
Description
Automate only the tests that bring the most return on investment. Of course, first of all you must know your test cases (remember Dorothy Graham's warning: "Automating chaos just gives faster chaos"); you may need to rework them first.
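To get a feel for what "most return on investment" means in practice, a rough break-even calculation can help. The following is only a back-of-the-envelope sketch in Python; all the numbers are invented for illustration and will differ per project:

  # Back-of-the-envelope ROI sketch; every figure below is an assumed example value.
  build_cost_hours       = 16.0   # one-off effort to automate the test
  maintenance_per_run    = 0.1    # average upkeep per automated run
  manual_cost_per_run    = 1.5    # effort to execute the same test manually
  automated_cost_per_run = 0.05   # effort to trigger and check the automated run

  saving_per_run = manual_cost_per_run - automated_cost_per_run - maintenance_per_run
  break_even_runs = build_cost_hours / saving_per_run
  print(f"Automation pays off after about {break_even_runs:.0f} runs")  # roughly 12 runs

With numbers like these, a smoke test that runs daily breaks even within a couple of weeks, while a test that runs twice a year never does; that is the kind of comparison this pattern asks you to make.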
Implementation
Good candidates for automation are:
- Smoke tests: since they are run very often, automating them pays off fast
- Regression tests: as manual tests they are often boring and time-consuming and so are either performed poorly or not at all. They are well suited for automation since they usually test stable parts of the SUT and so the expected maintenance effort is reasonably small.
- Identical tests for different environments: automating such tests also pays off fast, since most of the effort is usually spent automating the first environment (see the sketch after this list).
- Complex tests: if manual testing is too difficult, it will not be performed.
- Tests that require machine precision
- Repetitive, boring and time-consuming processing: set-ups for manual testing can also be automated
- Interfaces / integration with other software / hardware
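As referenced in the "identical tests for different environments" item above, here is a minimal sketch, assuming Python with pytest and the requests library; the environment names, URLs and the /health endpoint are all invented for the example:

  # Minimal sketch: one smoke test parameterized over several environments.
  # The environments and the /health endpoint are assumptions for illustration only.
  import pytest
  import requests

  ENVIRONMENTS = {
      "test":    "https://test.example.com",
      "staging": "https://staging.example.com",
      "prod":    "https://www.example.com",
  }

  @pytest.mark.parametrize("env", ENVIRONMENTS)
  def test_application_is_reachable(env):
      # The same check runs unchanged against every environment, so the effort
      # spent automating the first environment is reused for all the others.
      response = requests.get(f"{ENVIRONMENTS[env]}/health", timeout=10)
      assert response.status_code == 200

Adding a fourth environment is then just one more entry in the dictionary, which is why such tests tend to pay for themselves quickly.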
Recommendations
- Remember the 80/20 rule: 20% of the application accounts for around 80% of the usage. Automate the 20% only.
- Tests that cover critical areas in the Software Under Test (SUT) should be automated first.
- Not every manual test is suitable for automation; start with the most repetitive ones.
- Not every regression test should be automated from the start. It makes sense to begin with the less complicated ones and proceed later to the more difficult.
- Don't automate only “strategic” applications; also automate the stuff that's boring to execute and could easily be automated [1].
- If more than one team is doing automation, make sure that they don't automate the same tests.
Issues addressed by this pattern
HIGH ROI EXPECTATIONS
TOO EARLY AUTOMATION
Experiences
Bryan Bakker* says: I have participated in several projects where the reliability of the basic functions of a product was an important attention point. These basic functions worked most of the time, but not always... When test automation started on these projects, I began by automating reliability tests: some very simple and basic tests, executed over and over again to discover reliability issues:
- Very easy to automate these test cases
- Almost impossible to execute manually
- Numerous reliability issues that had already been reported by clients could be reproduced with this approach, making it possible for developers to analyze and fix them
In these projects the right tests to automate were reliability tests.
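A minimal sketch of what such a repeated reliability test could look like, assuming Python; start_product() and stop_product() are hypothetical placeholders for whatever basic SUT function is being exercised:

  # Hedged sketch: repeat a basic operation many times and record every failure,
  # so that intermittent reliability issues become visible and reproducible.
  import logging

  def start_product():
      # Hypothetical placeholder for the SUT's basic start function.
      pass

  def stop_product():
      # Hypothetical placeholder for the SUT's basic stop function.
      pass

  def test_repeated_start_stop(iterations=1000):
      failures = 0
      for i in range(iterations):
          try:
              start_product()
              stop_product()
          except Exception:
              failures += 1
              logging.exception("Iteration %d failed", i)
      assert failures == 0, f"{failures} of {iterations} iterations failed"

The test itself is trivial; the value comes from the number of repetitions, which is exactly what makes it nearly impossible to execute manually.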
As a secondary result, the automation approach proved its value, and time and money became available to automate more tests (regression, smoke, performance).
Note: Bryan Bakker, along with colleagues Rob de Bie, Rene van den Eertwegh and Peter Wijhhoven, published a book in 2015 called "Finally … Reliable Software! A practical approach to design for reliability". This book explains reliability theory in an accessible way and uses an example case study to show how to apply it.
Vincent Wijnen noticed that by automating the most boring or tedious test cases, the number of testers leaving a project was noticeably reduced! Having to train fewer new colleagues, because their work became more interesting again, is also part of the ROI of test automation.
The value of work becoming more meaningful because boring and tedious test cases are automated is impossible to express in an Excel sheet, but it matters a great deal to a team!
Another way to put it is that ROI is not only defined by strategic goals within a project, nor by KPI-like measurements such as speed, efficiency and cost reduction. It is also defined by the way work becomes more interesting when the mundane stuff is automated, which in turn could mean fewer testers leaving and less frustration and stress (maybe even reducing sick leave!). It is much broader than that.
As a comparison I often use the conveyor-belt in a factory: it might not speed up anything, because picking stuff up by hand and moving it about can be just as fast. But it does make work a lot less annoying, because carrying stuff isn't fun! Regression testing isn't fun! Doing stuff that hardly requires any brainpower isn't fun. Anything that makes that better has a massive ROI in my book.
If you have also used this pattern and would like to contribute your experience to the wiki, please go to Feedback to submit your experience or comment.
[1] Suggested at SIGIST Workshop, 5 Dec 2013