DataStage Git Repository Structure
Owned by John McKeever
The Git repository created and automatically maintained by MettleCI is structured like this:
.
├── datastage/                          # This structure is automatically managed by MettleCI
│   ├── Config_One.apt                  # The APT config files for your project
│   ├── Config_Two.apt                  #
│   ├── ...
│   ├── DSParams                        # The DSParams file for your project
│   ├── Jobs/                           # | The structure under the root 'datastage' folder directly reflects
│   │   ├── Extract/                    # | the structure in the root of your DataStage repository. Every time
│   │   │   ├── ExtractJob_One.isx      # | you check in DataStage assets the required folders are created for
│   │   │   ├── ExtractJob_Two.isx      # | you in your Git repository.
│   │   │   └── ...
│   │   ├── Load/                       #
│   │   │   ├── LoadJob_One.isx         #
│   │   │   ├── LoadJob_Two.isx         #
│   │   │   └── ...
│   │   ├── Transform/                  #
│   │   │   ├── TransformJob_One.isx    #
│   │   │   ├── TransformJob_Two.isx    #
│   │   │   └── ...
│   │   └── ...
│   └── Parameter Sets/                 # The 'Parameter Sets' folder contains Parameter Set Value files
│       ├── SourceDB1/                  # | following the structure used by the DataStage Engine:
│       │   └── source1                 # | '/datastage/Parameter Sets/<Parameter Set>/<Value File>'
│       ├── SourceDB2/                  # |
│       │   └── source2                 # |
│       ├── TargetDB/                   # |
│       │   └── target                  # |
│       ├── pGlobal/                    # |
│       │   └── global                  # |
│       ├── SourceDB1.isx               # | The Parameter Set export structure under the root 'datastage' folder
│       ├── SourceDB2.isx               # | directly reflects the structure of your DataStage repository.
│       ├── TargetDB.isx                #
│       └── pGlobal.isx                 #
├── filesystem/                         # The content of the 'filesystem' directory is transferred to the
│   ├── deploy.sh                       # | DataStage engine by MettleCI, and deploy.sh is invoked on the
│   ├── datasets/                       # | DataStage engine to move other directories and files (such as
│   └── scripts/                        # | scripts) to the appropriate locations for use by the ETL solution
│       ├── 10-restore-backups.sql      #
│       ├── 20-create-test-data.sql     #
│       └── ...
├── pipelines/                          # A collection of unsupported pipeline definitions covering a range of use cases
│   ├── azure/                          # Pipeline definitions for Azure DevOps
│   │   ├── devops/                     # Pipeline definitions covering the DevOps use case (see description in files)
│   │   │   ├── devops.yml              #
│   │   │   ├── hotfix_ci.yml           #
│   │   │   └── hotfix_deploy.yml       #
│   │   ├── templates/                  # Reusable pipeline components used by the example pipelines (see description in files)
│   │   │   ├── compliance-template.yml
│   │   │   ├── deploy-template.yml
│   │   │   └── unittest-template.yml
│   │   └── ...
│   ├── bitbucket/                      # Example Build and Deploy Plan pipeline definitions for Atlassian Bitbucket
│   │   └── ...
│   ├── github/                         # Example Actions pipeline definitions for GitHub
│   │   └── ...
│   ├── gitlab/                         # Example pipeline definitions for GitLab
│   │   └── ...
│   └── jenkins/                        # Example pipeline definitions for Jenkins
│       └── ...
├── unittest/                           # This flat structure is automatically managed by MettleCI
│   ├── ExtractJob_One/                 # Each job gets its own folder in the unittest folder
│   │   ├── dsDataSource1.csv           # A test data input file
│   │   ├── dsDataSource2.csv           # A test data input file
│   │   ├── dsDataSource3.csv           # A test data input file
│   │   ├── dsDataTarget1.csv           # A test data output file
│   │   ├── ExtractJob_One_Test.yaml    # A test specification, associating files with your job's links
│   │   ├── ExtractJob_One_OtherTest.yaml   # A test specification, associating files with your job's links
│   │   └── ExtractJob_One_AnotherTest.yaml # A test specification, associating files with your job's links
│   ├── ExtractJob_Two/                 #
│   │   └── {similar to above}          #
│   ├── LoadJob_One/                    #
│   │   └── {similar to above}          #
│   ├── LoadJob_Two/                    #
│   │   └── {similar to above}          #
│   ├── TransformJob_One/               #
│   │   └── {similar to above}          #
│   ├── TransformJob_Two/               #
│   │   └── {similar to above}          #
│   └── ...
├── varfiles/                           # This flat structure is automatically managed by MettleCI
│   ├── var.ci                          # Variable override files provide environment-specific values for
│   ├── var.dev                         # | each target deployment environment.
│   ├── var.prod                        # | Search the MettleCI documentation for 'variable override files'.
│   ├── var.qa                          # | These files cover the CI, DEV, PROD, QA, and UAT environments
│   └── var.uat                         # | as an example
└── README.md
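Each Value File under 'Parameter Sets' (such as 'source1' above) is a plain-text file in the format the DataStage engine stores on disk, typically one name=value pair per parameter in the set. The following sketch is purely illustrative; the parameter names and values are invented, so check an actual Value File on your engine for the exact format:

```text
DBHost=devdb01.example.com
DBName=SOURCE_DEV
DBUser=etl_dev
```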
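The test specification YAML files in each 'unittest' job folder associate test data files with the job's links. The authoritative schema is defined by MettleCI (search its documentation for unit testing); the fragment below is an illustrative sketch only, with invented stage and link names:

```yaml
# Illustrative only; consult the MettleCI unit testing documentation
# for the authoritative schema. Stage and link names are hypothetical.
given:                        # input files injected into the job's source links
  - stage: "ReadSource"
    link: "lnkInput"
    path: "dsDataSource1.csv"
when:
  job: "ExtractJob_One"       # the job under test
then:                         # expected output captured from the job's target links
  - stage: "WriteTarget"
    link: "lnkOutput"
    path: "dsDataTarget1.csv"
```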
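The deploy.sh script is authored by your team; MettleCI transfers the 'filesystem' directory to the engine and invokes the script there. A minimal sketch of what its body might do, assuming it runs from within the transferred directory (the target path is a hypothetical example, and the demo setup below only stands in for the transferred content):

```shell
#!/bin/sh
# Illustrative deploy.sh sketch; target paths are hypothetical.
set -eu

# Demo setup standing in for the transferred 'filesystem' directory:
WORK="$(mktemp -d)"
mkdir -p "$WORK/scripts"
printf 'SELECT 1;\n' > "$WORK/scripts/10-restore-backups.sql"
cd "$WORK"

# What a minimal deploy.sh body might do: copy the checked-in scripts
# to the location the ETL solution reads them from.
TARGET_SCRIPT_DIR="$WORK/opt/etl/scripts"   # hypothetical destination
mkdir -p "$TARGET_SCRIPT_DIR"
cp -R scripts/. "$TARGET_SCRIPT_DIR"/
echo "Deployed scripts to $TARGET_SCRIPT_DIR"
```

A real deploy.sh would similarly place the contents of 'datasets' and any other transferred folders wherever your jobs expect to find them.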
Related content
- Creating and Preparing a Git Repository for DataStage Assets
- Working with DataStage and Git Branches
- Committing DataStage Assets to Git
- Deploying DataStage Binaries
- Development Model Options & Recommendations
- Setting up a GitLab Project/Repository
© 2015-2024 Data Migrators Pty Ltd.