Design Patterns

From Test Automation Patterns

Design patterns show how to design the test automation testware so that it will be efficient and easy to maintain.

The table below gives a list of the design patterns with a short description. The Design Patterns Mind Map shows which other patterns are used by the design patterns.

ABSTRACTION LEVELS: Build testware that has one or more abstraction layers.
AUTOMATE GOOD TESTS: Automate only the tests that bring the most Return on Investment (ROI).
AUTOMATE THE METRICS: Automate metrics collection.
CAPTURE-REPLAY: Capture a manual test with an appropriate tool and replay it to run the test again.
CHAINED TESTS: Automate the tests so that they run in a predefined sequence.
COMPARISON DESIGN: Design the comparison of test results to be as efficient as possible, balancing Dynamic and Post-Execution Comparison, and using a mixture of Sensitive and Robust/Specific comparisons.
DATA-DRIVEN TESTING: Write the test cases as scripts that read their data from external files (sketched below).
DATE INDEPENDENCE: Write your test cases to be date independent (sketched below).
DEFAULT DATA: Use default data to simplify data input.
DESIGN FOR REUSE: Design reusable testware.
DOMAIN-DRIVEN TESTING: Develop a domain-specific language for testers to write their automated test cases with.
DON'T REINVENT THE WHEEL: Use available know-how, tools and processes whenever possible.
FRESH SETUP: Each test prepares its initial conditions from scratch before executing; tests don't clean up afterwards (sketched below).
INDEPENDENT TEST CASES: Make each automated test case self-contained.
KEEP IT SIMPLE: Use the simplest solution you can imagine.
KEYWORD-DRIVEN TESTING: Tests are driven by keywords that represent actions of a test and that include input data and expected results (sketched below).
MAINTAINABLE TESTWARE: Design your testware so that it does not have to be updated for every little change in the Software Under Test (SUT).
MODEL-BASED TESTING: Derive test cases from a model of the SUT, typically using an automated test case generator.
ONE CLEAR PURPOSE: Each test has only one clear purpose.
READABLE REPORTS: Design the reports to be easy for the recipient to read and understand.
RIGHT INTERACTION LEVEL (3): Be aware of the interaction level of the test approach on the SUT and its risks (intrusion).
SENSITIVE COMPARE: Expected results are sensitive to changes beyond the specific test case.
SHARED SETUP: Data and other conditions are set for all tests before beginning the automated test suite.
SINGLE PAGE SCRIPTS (2): Develop an automation script for each window or page (sketched below).
SPECIFIC COMPARE: Expected results are specific to the test case, so changes to objects not processed in the test case don't affect the test results.
TEMPLATE TEST: Define template test cases as a standard from which you can derive all kinds of test case variations.
TEST AUTOMATION FRAMEWORK: Use a test automation framework.
TEST SELECTOR: Implement your test cases so that you can turn on various selection criteria for whether or not a given test is included in an execution run (sketched below).
TESTWARE ARCHITECTURE: Design the structure of your testware so that your automators and testers can work as efficiently as possible.
THINK OUT-OF-THE-BOX: The best automation solutions are often the most creative.
TOOL INDEPENDENCE: Separate the technical implementation that is specific to the tool from the functional implementation.
VERIFY-ACT-VERIFY (4): The action to test is surrounded by two verifications that check the initial and the final state (sketched below).
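
The following short Python sketches illustrate a few of the patterns above. They are simplified examples rather than part of the original pattern pages, and every function, file and locator name in them is invented for illustration.

For DATA-DRIVEN TESTING, a minimal sketch assuming pytest: the test case is a script, and its inputs and expected results come from an external CSV file (hypothetical name discount_cases.csv) instead of being hard-coded.

# data_driven_test.py -- DATA-DRIVEN TESTING sketch (pytest assumed).
# discount_price and discount_cases.csv are hypothetical; the CSV is assumed
# to have the header line: price,percent,expected
import csv
import pytest

def discount_price(price, percent):
    # Stand-in for the real function or SUT call under test.
    return round(price * (100 - percent) / 100, 2)

def load_rows(path="discount_cases.csv"):
    # Read all test data rows from the external file.
    with open(path, newline="") as f:
        return [(float(r["price"]), float(r["percent"]), float(r["expected"]))
                for r in csv.DictReader(f)]

@pytest.mark.parametrize("price,percent,expected", load_rows())
def test_discount(price, percent, expected):
    assert discount_price(price, percent) == expected

Adding a new test case is then a matter of adding a CSV row, not writing code.
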
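For DATE INDEPENDENCE, a sketch in which the test derives its dates from the current date rather than hard-coding them, so it still passes when run next month; is_overdue stands in for the real SUT behaviour.

# date_independence.py -- DATE INDEPENDENCE sketch: no hard-coded calendar dates.
# is_overdue is a hypothetical SUT function used only for illustration.
from datetime import date, timedelta

def is_overdue(due_date, today=None):
    today = today or date.today()
    return due_date < today

def test_invoice_due_yesterday_is_overdue():
    # Derive the date from "today" instead of writing e.g. date(2018, 4, 30).
    yesterday = date.today() - timedelta(days=1)
    assert is_overdue(yesterday)

def test_invoice_due_tomorrow_is_not_overdue():
    tomorrow = date.today() + timedelta(days=1)
    assert not is_overdue(tomorrow)
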
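For FRESH SETUP, a sketch using a pytest fixture that builds the precondition from scratch for every test and deliberately has no teardown; the in-memory account is a stand-in for whatever data the real SUT needs.

# fresh_setup.py -- FRESH SETUP sketch: every test gets newly created
# preconditions and nothing is cleaned up afterwards.
# The in-memory account is a hypothetical stand-in for the SUT's data.
import pytest

@pytest.fixture
def account():
    # Build the precondition from scratch for each test (no teardown code).
    return {"owner": "test-user", "balance": 100}

def test_deposit_increases_balance(account):
    account["balance"] += 25
    assert account["balance"] == 125

def test_withdrawal_decreases_balance(account):
    # Unaffected by the previous test: the fixture rebuilt the account.
    account["balance"] -= 40
    assert account["balance"] == 60
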
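For KEYWORD-DRIVEN TESTING, a sketch of a tiny keyword interpreter: each test step is a keyword naming an action plus its data, and a dispatch table maps keywords to implementation functions. The keywords and the calculator are invented for illustration.

# keyword_driven.py -- KEYWORD-DRIVEN TESTING sketch.
# Keywords ("enter", "add", "check result") and the calculator are hypothetical.

class Calculator:
    def __init__(self):
        self.value = 0

def kw_enter(calc, number):
    calc.value = number

def kw_add(calc, number):
    calc.value += number

def kw_check_result(calc, expected):
    assert calc.value == expected, f"expected {expected}, got {calc.value}"

# Dispatch table: keyword name -> implementation function.
KEYWORDS = {"enter": kw_enter, "add": kw_add, "check result": kw_check_result}

def run_keyword_test(steps):
    calc = Calculator()
    for keyword, argument in steps:   # each step: (keyword, data)
        KEYWORDS[keyword](calc, argument)

# A test expressed purely as keywords and data:
run_keyword_test([("enter", 2), ("add", 3), ("check result", 5)])

In a real framework the keyword steps would typically be read from a spreadsheet or similar file so that testers can write tests without programming.
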
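For SINGLE PAGE SCRIPTS (PAGE OBJECT, see footnote 2), a sketch of one class that owns all interaction with a single page, assuming Selenium 4's WebDriver API; the URL and element locators are invented.

# login_page.py -- SINGLE PAGE SCRIPTS / PAGE OBJECT sketch (Selenium assumed).
# The URL and the element locators are hypothetical; adjust them to the real page.
from selenium.webdriver.common.by import By

class LoginPage:
    """All knowledge about the login page lives in this one script/class."""
    URL = "https://example.test/login"

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, user, password):
        self.driver.find_element(By.ID, "username").send_keys(user)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "login-button").click()

# A test only talks to the page object, never to raw locators, e.g.:
#   LoginPage(driver).open().login("alice", "secret")
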
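For TEST SELECTOR, a sketch that uses pytest markers as the selection criterion; the marker names smoke and regression are only examples.

# test_selector.py -- TEST SELECTOR sketch using pytest markers (assumed).
# Marker names are examples; register them in pytest.ini to avoid warnings.
import pytest

@pytest.mark.smoke
def test_service_starts():
    assert True

@pytest.mark.regression
def test_detailed_report_totals():
    assert True

# Selection is then switched on from the command line, e.g.:
#   pytest -m smoke             # run only the smoke tests
#   pytest -m "not regression"  # exclude the long regression tests
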
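For VERIFY-ACT-VERIFY (footnote 4), a sketch showing the three-part structure: verify the initial state, perform the single action under test, verify the final state. The Cart class stands in for the SUT.

# verify_act_verify.py -- VERIFY-ACT-VERIFY sketch.
# The Cart class is a hypothetical stand-in for the SUT; only the structure matters.

class Cart:
    def __init__(self):
        self.items = []
    def add(self, item):
        self.items.append(item)

def test_adding_an_item():
    cart = Cart()

    # Verify the initial state before acting.
    assert cart.items == []

    # Act: the single action under test.
    cart.add("book")

    # Verify the final state after acting.
    assert cart.items == ["book"]
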

Main Page
Back to Test Automation Patterns
Back to Management Patterns
Forward to Execution Patterns



[1] Suggested by Thorsten Schönfelder
[2] This pattern is known as PAGE OBJECT in the Selenium Community and has been suggested by Lisa Crispin
[3] Suggested by Bryan Bakker
[4] Suggested by the testing team at BREDEX