Design Patterns
Design patterns show how to design test automation testware so that it is efficient and easy to maintain.
The table below lists the design patterns with a short description. The Design Patterns Mind Map shows which other patterns are used by each design pattern.
| Pattern | Description |
|---|---|
| ABSTRACTION LEVELS | Build testware that has one or more abstraction layers. |
| AUTOMATE GOOD TESTS | Automate only the tests that bring the most return on investment (ROI). |
| AUTOMATE THE METRICS | Automate metrics collection. |
| CAPTURE-REPLAY | Capture a manual test with an appropriate tool and replay it to run the test again. |
| CHAINED TESTS | Automate the tests so that they run in a predefined sequence. |
| COMPARISON DESIGN | Design the comparison of test results to be as efficient as possible, balancing dynamic and post-execution comparison and using a mixture of sensitive and robust/specific comparisons. |
| DATA-DRIVEN TESTING | Write the test cases as scripts that read their data from external files (see the sketch below the table). |
| DATE INDEPENDENCE | Write your test cases to be date independent. |
| DEFAULT DATA | Use default data to simplify data input. |
| DESIGN FOR REUSE | Design reusable testware. |
| DOMAIN-DRIVEN TESTING | Develop a domain-specific language with which testers can write their automated test cases. |
| DON'T REINVENT THE WHEEL | Use available know-how, tools and processes whenever possible. |
| FRESH SETUP | Before executing, each test prepares its initial conditions from scratch. Tests don't clean up afterwards. |
| INDEPENDENT TEST CASES | Make each automated test case self-contained. |
| KEEP IT SIMPLE | Use the simplest solution you can imagine. |
| KEYWORD-DRIVEN TESTING | Tests are driven by keywords that represent the actions of a test and include input data and expected results (see the sketch below the table). |
| MAINTAINABLE TESTWARE | Design your testware so that it does not have to be updated for every little change in the Software Under Test (SUT). |
| MODEL-BASED TESTING | Derive test cases from a model of the SUT, typically using an automated test case generator. |
| ONE CLEAR PURPOSE | Each test has only one clear purpose. |
| READABLE REPORTS | Design the reports to be easy for the recipient to read and understand. |
| RIGHT INTERACTION LEVEL (3) | Be aware of the interaction level of the test approach on the SUT and its risks (intrusion). |
| SENSITIVE COMPARE | Expected results are sensitive to changes beyond the specific test case. |
| SHARED SETUP | Data and other conditions are set up for all tests before the automated test suite begins. |
| SINGLE PAGE SCRIPTS (2) | Develop an automation script for each window or page (see the sketch below the table). |
| SPECIFIC COMPARE | Expected results are specific to the test case, so changes to objects not processed in the test case do not affect the test results. |
| TEMPLATE TEST | Define template test cases as a standard from which you can derive all kinds of test case variations. |
| TEST AUTOMATION FRAMEWORK | Use a test automation framework. |
| TEST SELECTOR | Implement your test cases so that you can turn various selection criteria on or off to decide whether a given test is included in an execution run (see the sketch below the table). |
| TESTWARE ARCHITECTURE | Design the structure of your testware so that your automators and testers can work as efficiently as possible. |
| THINK OUT-OF-THE-BOX | The best automation solutions are often the most creative. |
| TOOL INDEPENDENCE | Separate the technical implementation that is specific to the tool from the functional implementation. |
| VERIFY-ACT-VERIFY (4) | The action under test is surrounded by two verifications that check the initial and the final state (see the sketch below the table). |
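To illustrate DATA-DRIVEN TESTING, here is a minimal pytest sketch. It assumes a hypothetical data file `discounts.csv` (columns `order_total`, `customer_type`, `expected_total`) next to the test script and a stand-in SUT function `calculate_discount`; neither comes from the pattern wiki, they only show the mechanics of keeping test data outside the code.

```python
# Hypothetical data-driven sketch: the test data lives in discounts.csv, e.g.
#   order_total,customer_type,expected_total
#   100,gold,90.00
#   100,standard,100.00
import csv
from pathlib import Path

import pytest


def calculate_discount(order_total: float, customer_type: str) -> float:
    """Stand-in for the SUT: applies a discount rate per customer type."""
    rates = {"standard": 0.0, "silver": 0.05, "gold": 0.10}
    return round(order_total * (1 - rates[customer_type]), 2)


def load_rows(csv_path: Path):
    """Read inputs and expected results from the external data file."""
    with csv_path.open(newline="") as handle:
        return list(csv.DictReader(handle))


# Every CSV row becomes one test case; adding a case means editing data, not code.
@pytest.mark.parametrize("row", load_rows(Path(__file__).parent / "discounts.csv"))
def test_discount_from_csv(row):
    actual = calculate_discount(float(row["order_total"]), row["customer_type"])
    assert actual == float(row["expected_total"])
```

New test cases are then added by appending rows to the CSV file rather than writing code, which is what makes the approach attractive to non-programming testers.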
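A KEYWORD-DRIVEN TESTING sketch under similar assumptions: the keyword names, the tiny in-memory "account" application and the hard-coded step table are all illustrative; in practice the step table would be read from a spreadsheet or text file maintained by testers.

```python
# Hypothetical keyword-driven sketch: each test step is a keyword plus
# arguments, and an interpreter maps keywords to implementation functions.

# The "application" under test: a trivial in-memory account.
account = {"balance": 0}

def open_account(start_balance):
    account["balance"] = int(start_balance)

def deposit(amount):
    account["balance"] += int(amount)

def check_balance(expected):
    assert account["balance"] == int(expected), (
        f"expected {expected}, got {account['balance']}"
    )

# Keyword table: maps the words testers write to the code that performs them.
KEYWORDS = {
    "open account": open_account,
    "deposit": deposit,
    "check balance": check_balance,
}

# A test written purely as keywords with input data and expected results.
test_steps = [
    ("open account", "100"),
    ("deposit", "50"),
    ("check balance", "150"),
]

def run(steps):
    for keyword, *args in steps:
        KEYWORDS[keyword](*args)  # look up and execute the keyword

if __name__ == "__main__":
    run(test_steps)
    print("all steps passed")
```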
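For SINGLE PAGE SCRIPTS (the PAGE OBJECT pattern in the Selenium community), a sketch assuming Selenium WebDriver, a hypothetical login page with element ids `username`, `password` and `login-button`, and a `driver` instance supplied elsewhere (for example by a pytest fixture):

```python
# Hypothetical page-object sketch: one class per page, so locator and
# navigation changes are fixed in one place instead of in every test.
from selenium.webdriver.common.by import By


class LoginPage:
    URL = "https://example.test/login"  # placeholder URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, username: str, password: str):
        # All knowledge of this page's locators stays inside this class.
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "login-button").click()

    def error_message(self) -> str:
        return self.driver.find_element(By.CSS_SELECTOR, ".error").text


# A test now reads in terms of the page, not of raw locators.
def test_rejects_wrong_password(driver):
    page = LoginPage(driver).open()
    page.login("alice", "wrong-password")
    assert "Invalid credentials" in page.error_message()
```

Because all locators for the page live in one class, a change to the login page means updating `LoginPage` only, not every test that logs in.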
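One possible TEST SELECTOR mechanism, shown here with pytest markers (other frameworks offer tags or categories instead); the marker names `smoke`, `regression` and `slow` are made up for the example and should be registered in `pytest.ini` or `pyproject.toml` to avoid warnings.

```python
# Hypothetical test-selector sketch: each test is tagged with markers, and the
# command line decides which selection criteria apply to a given run.
import pytest


@pytest.mark.smoke
def test_login_page_loads():
    assert True  # placeholder check


@pytest.mark.regression
@pytest.mark.slow
def test_full_order_workflow():
    assert True  # placeholder check
```

Selection then happens at execution time, for example `pytest -m smoke` for a quick run or `pytest -m "regression and not slow"` for a fuller one.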
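Finally, a VERIFY-ACT-VERIFY sketch: the single action under test is bracketed by a check of the initial state and a check of the final state, so a wrong starting state is reported as such rather than as a failure of the action itself. The tiny "inventory" SUT is an illustrative stand-in.

```python
# Hypothetical verify-act-verify sketch against a trivial in-memory SUT.
inventory = {"widgets": 10}


def ship(item: str, quantity: int) -> None:
    """Action under test: shipping reduces stock."""
    inventory[item] -= quantity


def test_shipping_reduces_stock():
    # VERIFY: the precondition the test relies on actually holds.
    assert inventory["widgets"] == 10, "unexpected initial stock"

    # ACT: perform exactly one action.
    ship("widgets", 3)

    # VERIFY: the final state matches the expected result.
    assert inventory["widgets"] == 7, "stock not reduced as expected"


if __name__ == "__main__":
    test_shipping_reduces_stock()
    print("verify-act-verify example passed")
```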
Home
Back to Test Automation Patterns
Back to Management Patterns
Forward to Execution Patterns
[1] Suggested by Thorsten Schönfelder
[2] This pattern is known as PAGE OBJECT in the Selenium Community and has been suggested by Lisa Crispin
[3] Suggested by Bryan Bakker
[4] Suggested by the testing team at BREDEX