The Git repository created and automatically maintained by MettleCI is structured as follows:
.
├── datastage/ # This structure is automatically managed by MettleCI
│ ├── Config_One.apt # The apt config files for your project
│ ├── Config_Two.apt #
│ ├── ...
│ ├── DSParams # The DSParams for your project
│ ├── Jobs/ # | The structure under the root 'datastage' folder directly reflects
│ │ ├── Extract/ # | the structure in the root of your DataStage repository. Every time
│ │ │ ├── ExtractJob_One.isx # | you check in DataStage assets the required folders are created for
│ │ │ ├── ExtractJob_Two.isx # | you in your Git repository.
│ │ │ └── ...
│ │ ├── Load/ #
│ │ │ ├── LoadJob_One.isx #
│ │ │ ├── LoadJob_Two.isx #
│ │ │ └── ...
│ │ ├── Transform/ #
│ │ │ ├── TransformJob_One.isx #
│ │ │ ├── TransformJob_Two.isx #
│ │ │ └── ...
│ │ └── ...
│ └── Parameter Sets/ # The 'Parameter Sets' folder contains Parameter Set Value files
│   ├── SourceDB1/ # | following the structure used by the DataStage Engine:
│   │ └── source1 # | '/datastage/Parameter Sets/<Parameter Set>/<Value File>'
│   ├── SourceDB2/ # |
│   │ └── source2 # |
│   ├── TargetDB/ # |
│   │ └── target # |
│   ├── pGlobal/ # |
│   │ └── global # |
│   ├── SourceDB1.isx # | The Parameter Set export structure under the root 'datastage' folder
│   ├── SourceDB2.isx # | directly reflects the structure of your DataStage repository.
│   ├── TargetDB.isx #
│   └── pGlobal.isx #
├── filesystem/ # The content of the file system directory is transferred to the
│ ├── deploy.sh # | DataStage engine by MettleCI and deploy.sh is invoked on the DataStage
│ ├── datasets/ # | engine to move other directories and files (such as scripts) to the
│ └── scripts/ # | appropriate locations for use by the ETL solution
│   ├── 10-restore-backups.sql #
│   ├── 20-create-test-data.sql #
│   └── ...
├── pipelines/ # A collection of unsupported pipeline definitions covering a range of use cases
│ ├── azure/ # Pipeline definitions for Azure DevOps
│ │ ├── devops/ # Pipeline definitions covering the DevOps use case (see description in files)
│ │ │ ├── devops.yml #
│ │ │ ├── hotfix_ci.yml #
│ │ │ └── hotfix_deploy.yml #
│ │ ├── templates/ # Reusable pipeline components used by the example pipelines (see description in files)
│ │ │ ├── compliance-template.yml #
│ │ │ ├── deploy-template.yml #
│ │ │ └── unittest-template.yml #
│ │ └── ...
│ ├── bitbucket/ # Example Build and Deploy Plan pipeline definitions for Atlassian Bitbucket
│ │ └── ...
│ ├── github/ # Example Actions pipeline definitions for GitHub
│ │ └── ...
│ ├── gitlab/ # Example pipeline definitions for GitLab
│ │ └── ...
│ └── jenkins/ # Example pipeline definitions for Jenkins
│   └── ...
├── unittest/ # This flat structure is automatically managed by MettleCI
│ ├── ExtractJob_One/ # Each job gets its own folder in the unittest folder
│ │ ├── dsDataSource1.csv # A test data input file
│ │ ├── dsDataSource2.csv # A test data input file
│ │ ├── dsDataSource3.csv # A test data input file
│ │ ├── dsDataTarget1.csv # A test data output file
│ │ ├── ExtractJob_One_Test.yaml # A test specification, associating files to your job's links
│ │ ├── ExtractJob_One_OtherTest.yaml # A test specification, associating files to your job's links
│ │ ├── ExtractJob_One_AnotherTest.yaml # A test specification, associating files to your job's links
│ ├── ExtractJob_Two/ #
│ │ ├── {similar to above} #
│ ├── LoadJob_One/ #
│ │ ├── {similar to above} #
│ ├── LoadJob_Two/ #
│ │ ├── {similar to above} #
│ ├── TransformJob_One/ #
│ │ ├── {similar to above} #
│ ├── TransformJob_Two/ #
│ │ ├── {similar to above} #
│ └── ...
├── var.ci # | Variable override files provide environment-specific values for
├── var.dev # | each target deployment environment.
├── var.prod # | Search the MettleCI documentation for 'variable override files'.
├── var.qa # | These files cover the CI, DEV, PROD, QA, and UAT environments
└── var.uat # | as an example.
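The comments beside filesystem/ in the tree above describe a simple contract: MettleCI transfers the whole filesystem/ directory to the DataStage engine and then invokes deploy.sh there. The sketch below illustrates that flow against a stand-in directory tree; the DEPLOY_TARGET location and the copy steps are illustrative assumptions, not MettleCI's actual deploy.sh.

```shell
#!/bin/sh
# Illustrative sketch only: MettleCI runs deploy.sh from the transferred
# filesystem/ directory. All destination paths here are assumed examples.
set -eu

# Stand-in for the transferred filesystem/ content described in the tree.
mkdir -p filesystem/scripts filesystem/datasets
printf 'SELECT 1;\n' > filesystem/scripts/10-restore-backups.sql
printf 'SELECT 2;\n' > filesystem/scripts/20-create-test-data.sql

# What a minimal deploy.sh might do: move scripts and datasets to the
# locations the ETL solution expects (DEPLOY_TARGET is hypothetical).
DEPLOY_TARGET="${DEPLOY_TARGET:-./deploy-target}"
mkdir -p "$DEPLOY_TARGET/scripts" "$DEPLOY_TARGET/datasets"
cp -p filesystem/scripts/*.sql "$DEPLOY_TARGET/scripts/"
cp -pR filesystem/datasets/. "$DEPLOY_TARGET/datasets/"

# The numeric prefixes (10-, 20-, ...) give a natural execution order
# when the deployed scripts are later run sequentially.
ls "$DEPLOY_TARGET/scripts" | sort
```

Note the `10-`/`20-` prefix convention from the tree: sorting the deployed filenames yields the intended run order without any extra orchestration.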