Introduction
MettleCI provides a comprehensive and sophisticated automated Unit Testing capability. The focus of MettleCI Unit Testing is to create and publish to Git a set of artefacts (test specifications and associated test data) that define, and can be used to demonstrate, your job's correct functional behaviour. These artefacts can be propagated to downstream environments, where they can be used to verify your job's consistent behaviour on DataStage platforms which may not have the same configuration, connectivity, or even DataStage version as the project in which the original job was developed.
The Unit being tested by MettleCI's Unit Test capability is an individual DataStage job. Unit testing is not intended to replace performance testing, nor does it support connecting to source or target databases to validate SQL queries. Broader-scoped testing activities (typically involving database interaction) are the role of MettleCI's End-to-End tests, which are created, managed, and executed separately from Unit Tests.
Test data can be derived using a number of methods, each of which may be used in combination with one another:
- Entered manually into the Excel-like test data file tables in MettleCI Workbench
- Provided as CSV files which are imported into the relevant test data file using MettleCI Workbench
- Fabricated using MettleCI Workbench's Data Fabrication capabilities (illustrated in the diagram above)
- Captured from your existing unit test data sources using MettleCI's Data Interception capabilities (see below)
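For the CSV import route, the file simply needs a header row matching the column metadata of the link it will be imported against. A hypothetical example for a source link carrying customer records (the column names and values below are illustrative only, not taken from any real job):

```csv
CUSTOMER_ID,FIRST_NAME,LAST_NAME,BALANCE
1001,Jane,Doe,250.00
1002,John,Smith,-13.50
1003,Ana,Silva,0.00
```

Rows imported this way populate the same Excel-like test data tables you would otherwise edit manually in MettleCI Workbench.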
Before using MettleCI automated Unit Testing, you'll need to configure your environment for Unit Testing.
Creating an Automated Unit Test
Unit tests can be created for individual jobs using the MettleCI Workbench, or en masse (for all jobs in a project) using an automated build pipeline. When MettleCI creates a unit test, using either method, it executes the following process for each job:
- Interrogate your job definition in the DataStage repository and identify the job's source and target stages.
- Read the metadata definition of each source stage output link and each target stage input link. Note that each source stage may supply multiple output links, and each target stage may accept multiple input links.
- Create an empty unit test data file for each source and target link, with appropriate columns and metadata.
- Read your job's list of parameters.
- Create a unit test specification which references all of your job's parameters as well as each newly-created unit test data file.
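Conceptually, the resulting test specification ties the job's parameters to the generated test data files: input data is substituted at each source link, the job is run, and the output at each target link is compared against expected data. A hypothetical sketch of such a specification (stage names, parameter names, and field layout are illustrative only and may not match MettleCI's actual spec syntax):

```yaml
# Illustrative only: names and structure are hypothetical, not MettleCI's actual syntax.
given:                                              # data substituted at each source stage link
  - stage: seqCustomers                             # hypothetical source stage
    link: lnkOutput
    path: CustomerLoad-seqCustomers-lnkOutput.csv   # generated test data file
when:
  job: CustomerLoad                                 # the DataStage job under test
  parameters:
    pDatabaseName: TESTDB                           # parameters read from the job definition
then:                                               # expected data at each target stage link
  - stage: dbTarget                                 # hypothetical target stage
    link: lnkInput
    path: CustomerLoad-dbTarget-lnkInput.csv
```

Because the specification and its data files are plain text, they version cleanly in Git alongside the job itself.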
This animation illustrates the conceptual steps MettleCI goes through to create your Unit Test:
See how this is done from the MettleCI Workbench in Creating a Unit Test.
Executing an Automated Unit Test
(Execution description here)
See how this is done from the DataStage Designer in Executing a Unit Test.
Capturing Unit Test Data
(Interception Description Here)
See how this is done from the DataStage Designer in Intercepting Unit Test Data.
Fabricating Unit Test Data
See how this is done from within the MettleCI Workbench in Fabricating Unit Test Data.