DATA-DRIVEN TESTING


Pattern summary

Write the test cases as scripts that read their data from external files

Category

Design

Context

One of the most widely used patterns for developing modular automation scripts that support long-lasting automation

Description

Write the test cases as scripts that read their data from external files. In this way you have only one script to drive the tests, but by changing the data you can create any number of test cases. The advantage is that if you have to update the script because of a change in the Software Under Test (SUT), you usually don't have to change your data, so maintenance takes little effort.

Implementation

You write a script with variables whose content is read sequentially from a file such as a spreadsheet. Every line in the file delivers the data for a different test case.
An easy way to implement this pattern is to use CAPTURE-REPLAY to capture the tests initially. The captured test will have constant data (i.e. specific test inputs for every field). You can then replace these constants with variables whose values are read from an external file.
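As an illustration only, here is a minimal sketch in Python, assuming a hypothetical CSV data file login_tests.csv and a stubbed-out SUT call (none of these names come from the pattern itself):

import csv

def login_to_sut(username, password):
    # Stand-in for driving the real SUT; replace with real automation calls.
    return "ok" if password else "rejected"

def run_login_test(username, password, expected_result):
    # The single driver script: one call per data row, i.e. per test case.
    actual = login_to_sut(username, password)
    assert actual == expected_result, (
        f"login({username!r}) returned {actual!r}, expected {expected_result!r}")

# Every line in the file delivers the data for a different test case.
# Hypothetical header: username,password,expected_result
with open("login_tests.csv", newline="") as data_file:
    for row in csv.DictReader(data_file):
        run_login_test(row["username"], row["password"], row["expected_result"])

Adding a new test case is then just a matter of adding a new line to the data file; the script itself stays untouched.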

Potential problems

If your data is spread over more than one data file, you must make sure that the script and the data files are correctly matched.
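A minimal sketch of such a consistency check, assuming CSV data files and a driver script that declares the column names it expects (file names and column names here are hypothetical):

import csv

EXPECTED_COLUMNS = {"username", "password", "expected_result"}   # columns the driver script reads

def check_data_file(path):
    # Fail fast if a data file does not supply every column the script expects.
    with open(path, newline="") as data_file:
        header = set(csv.DictReader(data_file).fieldnames or [])
    missing = EXPECTED_COLUMNS - header
    if missing:
        raise ValueError(f"{path} is missing columns: {sorted(missing)}")

for data_file_path in ["login_tests.csv", "login_tests_regression.csv"]:
    check_data_file(data_file_path)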

Issues addressed by this pattern

BRITTLE SCRIPTS
REPETITIOUS TESTS

Experiences

Derek Bergin has contributed some good advice on managing test data from his experience - thanks!
Derek says: Selection of test data
A case can be made for a three-tier system when selecting test data.

For ‘smoke tests’, simply providing known good data is probably sufficient – after all, you are just trying to prove that this build isn’t fundamentally broken.

For regression tests, each variable should have successive entries that fall into the following categories: typical, just short of the limit, on the limit, over the limit. The limit can be a field size, an input value, etc. This is still just testing along a single axis, though …
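A minimal sketch of how these four categories could be generated as data-file rows, assuming a numeric field with a known upper limit (the field name 'amount' and the limit of 100 are invented for illustration):

def boundary_values(limit):
    # typical, just short of the limit, on the limit, over the limit
    return {"typical": limit // 2,
            "just_short_of_limit": limit - 1,
            "on_limit": limit,
            "over_limit": limit + 1}

# e.g. an 'amount' field whose maximum accepted value is 100 (hypothetical)
for category, value in boundary_values(100).items():
    print(f"amount,{value},{category}")   # one data-file line per category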

For more complete testing you should attempt to have multiple failures happening at once. Using pairwise testing you should fairly easily be able to set up a test sequence that covers each combination of failure points. Warning – this level of testing can take some time to run. I have had some very interesting cases of fault-recovery routines ‘colliding’ when faced with this type of testing, and it’s the sort of thing that drives support crazy when it’s encountered in the field.
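Pairwise test sets are usually produced with a dedicated tool, but a minimal greedy sketch shows the idea: keep adding the full combination that covers the most not-yet-covered value pairs until every pair appears in at least one test case. The failure points and values below are invented for illustration:

from itertools import combinations, product

def pairwise_cases(parameters):
    # Greedy all-pairs: every pair of values from any two parameters
    # appears together in at least one generated test case.
    names = list(parameters)
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va, vb in product(parameters[a], parameters[b])}
    cases = []
    while uncovered:
        # Pick the full combination that covers the most still-uncovered pairs.
        best = max((dict(zip(names, values))
                    for values in product(*(parameters[n] for n in names))),
                   key=lambda case: sum((((a, case[a]), (b, case[b])) in uncovered)
                                        for a, b in combinations(names, 2)))
        cases.append(best)
        uncovered -= {((a, best[a]), (b, best[b])) for a, b in combinations(names, 2)}
    return cases

# Hypothetical failure points to combine:
for case in pairwise_cases({"network": ["ok", "timeout"],
                            "disk": ["ok", "full"],
                            "input": ["valid", "over_limit"]}):
    print(case)   # each dict becomes one line in the data file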

Data Management
Once you move away from the smoke-test level of testing, it becomes important to be able to manage the data sets you are using. Failure-combination data sets can be large and may well be specific to a particular build and its limits. Similarly, you may well have customer-specific data sets which have to be validated at User Acceptance Testing. Your choice of test tools and framework should be informed by this requirement. At the very least, you should expect the ability to manipulate the data in a spreadsheet-like grid and to link the data files to a specific test cycle/revision.
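One lightweight way to link data files to a specific test cycle/revision is simply to key them by cycle in the directory layout; a minimal sketch, with invented paths:

from pathlib import Path

DATA_ROOT = Path("testdata")   # hypothetical layout: testdata/<cycle>/<suite>.csv

def data_file_for(cycle, suite):
    # Resolve the data set that belongs to a specific test cycle/revision.
    path = DATA_ROOT / cycle / f"{suite}.csv"
    if not path.exists():
        raise FileNotFoundError(f"no data set for {suite!r} in cycle {cycle!r}: {path}")
    return path

# e.g. the UAT run for a customer-specific build would use its own data set:
# data_file_for("build-2.3.1-uat", "login_tests")  ->  testdata/build-2.3.1-uat/login_tests.csv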

Example
Seretta:

In my company we use a variation of KEYWORD-DRIVEN TESTING, but with an easy trick we get the same advantages as with DATA-DRIVEN TESTING: for any one (DRIVER-)script we can write any number of (DATA-)scripts. We achieved this by replacing the data in the (DRIVER-)script with variables. To show how this works, here are some extracts from the scripts that we use to test our own test automation framework:

In the DRIVER-script we have for instance:
…….
GOTO,FTestSuite
INPUT,ComboBox,cboPriority,<Priority>
INPUT,ComboBox,cboTestType,<TestType>
SELECT,Button,<ButtonDRIVER>
GOTO,<SelectDirDRIVER>
INPUT,edtDirectory,<DRIVERDirName>
SELECT,Button,<ConfirmSelectionDRIVER>
…….

The words in angle brackets represent the variables.
In one of the corresponding DATA-scripts you would find the following data:

…….
<Priority>,High
<TestType>,Automatic
<ButtonDRIVER>,btnDRIVERDir
<SelectDirDRIVER>,TSelectDirDlg
<DRIVERDirName>,c:\General\Data\ScriptData
<ConfirmSelectionDRIVER>,btnOK
…….
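As an illustration of how such a substitution step could be implemented (this is a sketch, not Seretta's actual framework; the file names and parsing rules are assumptions based on the extracts above):

import re

def load_data_script(path):
    # Read "<Variable>,value" lines from a DATA-script into a dictionary.
    values = {}
    with open(path) as data_script:
        for line in data_script:
            line = line.strip()
            if line.startswith("<"):
                name, value = line.split(",", 1)
                values[name] = value
    return values

def expand_driver_script(driver_path, values):
    # Replace every <Variable> placeholder in the DRIVER-script with its value.
    with open(driver_path) as driver_script:
        for line in driver_script:
            yield re.sub(r"<[^>]+>", lambda match: values[match.group(0)], line.rstrip("\n"))

# Hypothetical file names; each expanded line would then go to the keyword interpreter:
# for keyword_line in expand_driver_script("driver.txt", load_data_script("data_high_priority.txt")):
#     execute_keyword(keyword_line)

With this in place, running the same DRIVER-script against a different DATA-script produces a different test case without touching the DRIVER-script at all.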

If you have also used this pattern and would like to contribute your experience to the wiki, please go to Feedback to submit your experience or comment.
