Process Issues

From Test Automation Patterns
Revision as of 13:00, 30 April 2018 by Seretta (talk | contribs) (Contour lines added)
Process issues are the test automation problems that occur when the test automation process has not yet been established or has not yet reached the necessary maturity. The table below gives a short list of the issues; clicking on an issue shows more detail and the patterns needed to resolve it.
The Process Issues Mind Map gives an overview of the Process Issues together with the top level of resolving patterns, both the Most Recommended and other Useful patterns.

AUTOMATION DECAY: Automation is not properly maintained and gradually decays into shelfware.
BUGGY SCRIPTS: Automation scripts are not adequately tested themselves.
DATA CREEP: There are countless data files with different names but identical or nearly identical content.
INADEQUATE COMMUNICATION: Testers don't know what automation could deliver and the test automation team doesn't know what testers need; or developers don't understand, don't know, or don't care about the effect of their changes on the automation.
INADEQUATE DOCUMENTATION: Test automation testware (scripts, data, etc.) is not adequately documented.
INADEQUATE REVISION CONTROL: Test automation scripts are not consistently paired with the correct version of the Software Under Test (SUT).
INSUFFICIENT METRICS: Metrics are collected rarely, inconsistently, or not at all, and their collection is cumbersome.
LATE TEST CASE DESIGN: Automated test cases are designed and written only after the Software Under Test (SUT) has been implemented.
NO INFO ON CHANGES: Development changes are not communicated to the test automators, or not in good time.
NON-TECHNICAL-TESTERS: Testers are unable to write automated test cases because they are not adept with the automation tools.
SCRIPT CREEP: There are too many scripts, and it is not clear whether they are still in use.
STALLED AUTOMATION: Automation has been tried, but it never "got off the ground".
TEST DATA LOSS: Test data isn't secured, so it has to be generated again and again.
TOOL-DRIVEN AUTOMATION: Test cases are automated using the features of a test automation tool "as is".
UNFOCUSED AUTOMATION: What to automate is selected ad hoc.

Main Page
Back to Test Automation Issues
Forward to Management Issues