REFACTOR THE TESTWARE
Testware must be refactored regularly, just as application code is.
'James Tony': Tests have technical debt too! It may be harder to measure than the technical debt of the code being tested, but cutting corners when writing tests can lead to a build-up of technical debt that can slow you down to a crawl in the future.
Use this pattern for long-lasting automation projects in order to keep maintenance costs low. For short-lived solutions you will not need it.
Refactoring means that scripts and test data should be checked regularly:
- are the automated test cases still valid and useful?
- can the automation scripts be improved?
- can similar scripts be merged?
- is the data up to date?
- is the documentation up to date?
- is the technical implementation as efficient as it can be? For example: wait times, communication methods, etc. See the article mentioned under Experiences below.
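Fixed wait times are a common efficiency problem in the last checklist item: a script that always sleeps for the worst case wastes time on every run. A minimal sketch of the refactoring, assuming a Python-based test harness (the function and parameter names are invented for illustration):

```python
import time

def wait_until(condition, timeout=10.0, poll_interval=0.2):
    """Poll a condition instead of sleeping a fixed worst-case time.

    Returns True as soon as the condition holds, so the script only
    waits as long as it actually needs to; returns False on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_interval)
    return False

# Before refactoring: always pays the full worst-case wait
#   time.sleep(10)
# After refactoring: returns as soon as the application is ready
#   wait_until(lambda: page_is_loaded(), timeout=10)
```

Most GUI automation tools offer an equivalent built-in (e.g. explicit waits) that should be preferred over hand-rolled polling where available.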
After you have determined what should be refactored, schedule when to execute the updates and check that they get done:
- KILL THE ZOMBIES
- Improve the scripts (KEEP IT SIMPLE)
- Upgrade test data to the current release
- Remove or merge duplicate scripts so that you only have to maintain one version.
- Check that you are using the RIGHT INTERACTION LEVEL
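Merging duplicate scripts, as the steps above recommend, usually means extracting the common flow and parameterising the differences so only one script has to be maintained. A minimal sketch, assuming a simple data-driven harness (the `login` function and the test data values are hypothetical stand-ins for the real application interaction):

```python
def login(username, password):
    # Placeholder for the real application interaction;
    # returns True when login succeeds.
    return username == "valid_user" and password == "secret"

# Before: three near-identical scripts, each maintained separately.
# After: one data-driven script covering all the variants.
login_cases = [
    ("valid_user", "secret", True),   # happy path
    ("valid_user", "wrong", False),   # bad password
    ("unknown", "secret", False),     # unknown user
]

def run_login_tests():
    """Run every login variant through the single shared script."""
    results = []
    for username, password, expected in login_cases:
        results.append(login(username, password) == expected)
    return results
```

Adding a new variant now means adding one data row, not maintaining another copy of the script.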
Refactoring is often postponed when a project is running out of time. Be careful not to build up too much technical debt.
Issues addressed by this pattern
Dot: I came across an article on LinkedIn by Bhushit Joshipura, called "Efficient Running of Test Automation". He describes a series of technical "fine-tuning" steps that enabled him to reduce the execution time of a set of automated tests from 4.5 hours to less than 1.5 hours. A good example of technical refactoring. You can read it here:
If you have also used this pattern and would like to contribute your experience to the wiki, please go to Feedback to submit your experience or comment.