
Structure

The Jenkins DevOps pipeline is defined in a “Jenkinsfile” (with the default filename Jenkinsfile) and has the overall structure summarised below (most detail omitted). Note that paired curly braces - { } - denote the start and end of a scoped section. Comments can be created in a block (/* comment like this */) or inline (// comment like this).

Code Block
pipeline {                              // Outermost scope
    agent { ... }                       // What are the default agent(s) upon which Stages execute?
    parameters { ... }                  // What values are required if the pipeline is invoked interactively?
    environment { ... }                 // What (global) variables are needed?
    stages {                            // Operational container
        stage {                         // First Stage (a sequence of executable steps)
            agent { ... }               // Which agent is used to execute this Stage?
            environment { ... }         // What variables are needed? (local to this Stage)
            steps {
                { step }                // Executable step
            }
        }                               // End of first Stage
        stage { ... }                   // Second Stage
        ...
    }                                   // End of Stage definitions
}                                       // End of Pipeline

To understand these sections in more detail, refer to the Jenkins documentation; we present only the MettleCI-specific things you need to know. Note that Jenkins offers some flexibility in where these sections are placed and, in some cases, allows repetition at different scopes. For example, a single Stage can optionally contain another stages {...} container which defines further sub-Stages.

Sections before “stages”

Agent section

The agent section denotes which class of machines your pipeline can be executed on. For a CI/CD pipeline you need only specify this once, since all projects are at the same release level. For an upgrade pipeline you may want to specify the agents within each stage, so that different stages run on different release levels. In the example below, the label should match the label you gave your agent on the MettleCI host when you set it up. Pipelines will stall if no running agent with the (matching set of) label(s) is found.

Code Block
    agent {
        label 'mettleci:datastage11.7.1'
    }
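For the upgrade-pipeline case mentioned above, the per-stage form might look like the following sketch. The stage names and the first agent label are illustrative assumptions, not part of the supplied sample:

```groovy
pipeline {
    agent none                                         // no default agent; each Stage picks its own
    stages {
        stage('Export from old engine') {
            agent { label 'mettleci:datastage11.5' }   // hypothetical label for the older release
            steps { /* ... */ }
        }
        stage('Import to new engine') {
            agent { label 'mettleci:datastage11.7.1' } // label used elsewhere in this document
            steps { /* ... */ }
        }
    }
}
```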


Input Values

Parameters section

The parameters section enumerates the parameters to the pipeline, together with their default values. The four parameters used in the sample, and their meanings, are discussed in Deploying a Jenkins Pipeline. Note that parameter values are immutable: once a pipeline is invoked with a set of parameter values, those values cannot be changed.

Info

Note that the sample Jenkins pipelines supplied with MettleCI do not make use of parameters as all pipeline values are supplied as Node Properties.



Code Block
    parameters {
        string(name: 'domainName', defaultValue: 'demo4-svcs.datamigrators.io:59445', description: 'DataStage Service Tier')
        string(name: 'serverName', defaultValue: 'demo4-engn.datamigrators.io', description: 'DataStage Engine Tier')
        string(name: 'projectName', defaultValue: 'wwi_jenkins_ds117', description: 'Logical (unsuffixed) Project Name')
        string(name: 'environmentId', defaultValue: 'ci', description: 'Environment Identifier')
    }

When run interactively, the user will be prompted for these at pipeline start time like so:

...

Environment Section

The environment section allows you to derive and set further useful variables. The sample pipeline composes the suffixed DataStage project name and copies one of the parameters to an all-uppercase variable for convenience later.

Code Block
    environment {
        DATASTAGE_PROJECT = "${params.projectName}_${params.environmentId}"
        ENVIRONMENT_ID = "${params.environmentId}"
    }

These will be visible in the execution environment as environment variables, and thus will be available to batch files, external commands, and the like without needing to explicitly pass them. Environment variables can be further manipulated in the same or subsequent environment {} section(s).
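As an illustration (this step is not part of the sample pipeline), a later bat step could read the derived values using ordinary Windows %VAR% syntax, with no explicit passing:

```groovy
steps {
    // Values set in an environment {} section are exported to each step's
    // process environment, so cmd.exe can expand them directly.
    bat label: 'Show deployment target',
        script: 'echo Deploying %DATASTAGE_PROJECT% for environment %ENVIRONMENT_ID%'
}
```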

Stages Section

The stages section is a container for an arbitrary number of individual stage sections, each of which contains the executable steps. Stages are normally run in sequence, but can be configured to run in parallel. In the sample pipeline there are three: “Deploy”, “Test”, and “Promote”.

Stage section notes

Each stage section begins with a label: a quoted, arbitrary text string which is displayed on the pipeline diagram when viewing the pipeline in Blue Ocean.

Within the stage are two major subsections, steps{} and post{}, which contain the actions to be performed during normal execution and the actions to be performed at the end of the stage, respectively.

Use of the parallel { } construct allows nesting of stages within stages.
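A minimal sketch of that nesting (the sub-stage names follow the sample's Test stage; the step bodies are elided):

```groovy
stage('Test') {
    parallel {                          // both nested Stages start together
        stage('Static Analysis') {
            steps { /* compliance steps */ }
        }
        stage('Unit Tests') {
            steps { /* unit test steps */ }
        }
    }                                   // 'Test' completes when both branches finish
}
```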

Deploy stage

This is the first stage in the pipeline and it is tasked with deploying changed project assets to the CI (continuous integration) project. The withCredentials block provides appropriate credentials (using the Credentials plugin) to the steps it encloses; change mci-user if you are using a different user. Each bat step invokes the MettleCI CLI to perform one of the deployment tasks; refer to the MettleCI CLI Command Reference for details of what each step is doing. The label, chosen to be descriptive of the step’s task, becomes the step text in the Blue Ocean display of the pipeline. If a step aborts, that ends the steps part of the stage. After the steps conclude, whether by error or by reaching the end, the post section is evaluated. Although there are other conditions such as changed, fixed, etc., using always ensures the section is executed every time, to process the test results (here, the results of the compilations) and then clean up.



Code Block
stages {
    stage("Deploy") {
        steps {
            withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {

                bat label: 'Create DataStage Project', script: "${env.METTLE_SHELL} datastage create-project -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword}"
                bat label: 'Substitute parameters in DataStage config', script: "${env.METTLE_SHELL} properties config -baseDir datastage -filePattern \"*.sh\" -filePattern \"DSParams\" -filePattern \"Parameter Sets/*/*\" -filePattern \"*.apt\" -properties var.${params.environmentId} -outDir config"
                bat label: 'Transfer DataStage config and filesystem assets', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -transferPattern \"filesystem/**/*,config/*\" -destination \"${env.DATASTAGE_PROJECT}\""
                bat label: 'Deploy DataStage config and file system assets', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\deploy.sh\""
                bat label: 'Deploy DataStage project', script: "${env.METTLE_SHELL} datastage deploy -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -assets datastage -parameter-sets \"config\\Parameter Sets\" -threads 8 -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

            } // end of withCredentials section
        } // end of steps in this stage
        post {
            always {
                junit testResults: 'log/**/mettleci_compilation.xml', allowEmptyResults: true
                withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                    bat label: 'Cleanup temporary files', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\cleanup.sh\""
                }
            } // end of things to "always" perform
        } // end of post steps section
    } // end of deploy stage

Test stage and parallel Static Analysis and Unit Tests substages

After a successful deployment the pipeline will perform some tests: compliance (static analysis) and the unit tests. These can be performed in parallel as there is no interdependence, so we use the parallel{} construct to nest the stage invocations. As before, refer to the MettleCI CLI Command Reference for details of what each step is doing.

Compliance testing is relatively simple, requiring only a single step to run the rules (with no cleanup), followed by collection of the results. Unit tests are a bit more involved, with upload of the specs, execution of the jobs in test mode, and downloading of the reports, followed again by collection of the results.

Code Block
    stage("Test") {
        parallel {
            stage('Static Analysis') {
                steps {
                    bat label: 'Perform static analysis', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report.xml\" -junit -rules compliance\\WORKBENCH -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
                } // end of Static Analysis steps
                post {
                    always {
                        junit testResults: 'compliance_report.xml', allowEmptyResults: true
                    }
                } // end of Static Analysis post steps actions
            } // end of Static Analysis (sub)stage
            stage('Unit Tests') {
                steps {
                    withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                        bat label: 'Upload unit test specs', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"unittest\" -transferPattern \"**/*\" -destination \"/opt/dm/mci/specs/${env.DATASTAGE_PROJECT}\""
                        bat label: 'Execute unit tests for changed DataStage jobs', script: "${env.METTLE_SHELL} unittest test -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -specs unittest -reports test-reports -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
                        bat label: 'Retrieve unit test results', script: "${env.METTLE_SHELL} remote download -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"/opt/dm/mci/reports\" -transferPattern \"${env.DATASTAGE_PROJECT}/**/*.xml\" -destination \"test-reports\""
                    }
                } // end of Unit Tests steps
                post {
                    always {
                        junit testResults: 'test-reports/**/*.xml', allowEmptyResults: true
                    }
                } // end of Unit Tests post steps actions
            } // end of Unit Tests (sub)stage
        } // end of parallel stage container construct
    } // end of Test (super)stage

Some notes:
The sample assumes the compliance rules are in the same repository. If you wish to source the rules from a different repository and, further, to run separate warn/fail sets of tests while presenting the results in one report, the Static Analysis stage would instead look like the following (refer to the Jenkins documentation to understand the parameters of the checkout plugin invocation):

Code Block
stage('Static Analysis') {
    steps {
        checkout([
            $class: 'GitSCM',
            branches: [[name: 'refs/heads/master']],
            doGenerateSubmoduleConfigurations: false,
            extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'compliance2']],
            submoduleCfg: [],
            userRemoteConfigs: [[ credentialsId: 'lar_at_demo_mettleci_i0', url: 'http://lar@demo.mettleci.io/bitbucket/scm/cr/compliance-rules.git' ]]
        ])

        bat label: 'Perform static analysis for non fatal violations', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report_warn.xml\" -junit -test-suite \"warnings\" -ignore-test-failures -rules compliance2\\WARN -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
        bat label: 'Perform static analysis for fatal violations', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report_fail.xml\" -junit -test-suite \"failures\" -rules compliance2\\FAIL -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
    }
    post {
        always {
            junit testResults: 'compliance_report_*.xml', allowEmptyResults: true
        }
    }
}

Promotion stage with three parallel deployment substages

The sample pipeline has three promotion deployments, each of which is essentially the same as the initial deploy but directed at a different project. The sample assumes that all three deployments are to the same DataStage server, but the parameter blocks are at the individual promotion level, so you can change these as necessary. You can also reduce or increase the number of promotions (by deleting or copying code blocks) as necessary. The promotions can be performed in parallel as there is no interdependence, so we again use the parallel{} construct to nest the stage invocations. As before, refer to the MettleCI CLI Command Reference for details of what each step is doing. The “innards” of each promotion stage, except the first, are elided in the following code snippet to focus on the structure.

The input{} construct is used to gather parameters and obtain approval from the user for steps which should not run automatically.

As in the Test stage, we take advantage of the ability to nest stages using the parallel{} construct. The second and third nested stages are similar to the first.

Code Block
 stage("Promote") {
            parallel {
                stage('1. Quality Assurance') {
                    
                    input {
                        message "Should we promote to QA?"
                        ok "Yes"
                        parameters {
                            string(name: 'domainName', defaultValue: 'test1-svcs.datamigrators.io:59445', description: 'DataStage Service Tier')
                            string(name: 'serverName', defaultValue: 'TEST1-ENGN.DATAMIGRATORS.IO', description: 'DataStage Engine Tier')
                            string(name: 'projectName', defaultValue: 'jenkins', description: 'Logical Project Name')
                            string(name: 'environmentId', defaultValue: 'qa', description: 'Environment Identifier')
                        }
                    }
                    environment {
                        DATASTAGE_PROJECT = "${projectName}_${environmentId}"
                        ENVIRONMENT_ID = "${environmentId}"
                    }
                    steps {

                        withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {

                            bat label: 'Create DataStage Project', script: "${env.METTLE_SHELL} datastage create-project -domain ${domainName} -server ${serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword}"

                            bat label: 'Substitute parameters in DataStage config', script: "${env.METTLE_SHELL} properties config -baseDir datastage -filePattern \"*.sh\" -filePattern \"DSParams\" -filePattern \"Parameter Sets/*/*\" -properties var.${environmentId} -outDir config"
                            bat label: 'Transfer DataStage config and filesystem assets', script: "${env.METTLE_SHELL} remote upload -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -transferPattern \"filesystem/**/*,config/*\" -destination \"${env.BUILD_TAG}\""
                            bat label: 'Deploy DataStage config and file system assets', script: "${env.METTLE_SHELL} remote execute -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\deploy.sh\""

                            bat label: 'Deploy DataStage project', script: "${env.METTLE_SHELL} datastage deploy -domain ${domainName} -server ${serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -assets datastage -parameter-sets \"config\\Parameter Sets\" -threads 8 -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                        }
                    }
                    post {
                        always {
                            withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                                bat label: 'Cleanup temporary files', script: "${env.METTLE_SHELL} remote execute -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\cleanup.sh\""
                            }                          
                        }
                    }
                }
                stage('2. Performance') {
                  // details omitted, see first stage
                }  
                stage('3. Production') {
                  // details omitted, see first stage
                } 
            }  // end of parallel{} stage container construct
        }  // end of "promote" stage
    } // end of stages in pipeline
// after all stages have finished do cleanup
	post {      
		cleanup {
			deleteDir()
        }
	}
}
                    

If we had wished to have the first (“Testing”) stage run without prompting the user, we would change it to look as follows (steps omitted for brevity):

Code Block
    stage('1. Testing') {
        // input{} section omitted entirely
        environment {
            // since there is no user to set values, we must set them ourselves
            // assume the same server and project as set globally,
            // but the environment ID (project suffix) varies
            environmentId = 'qa'
            // hard-coded environmentId; DATASTAGE_PROJECT calculated as before
            DATASTAGE_PROJECT = "${projectName}_${environmentId}"
        }
        steps {
            // omitted for brevity, identical to above
        }
        post {
            // omitted for brevity, identical to above
        }
    }

Many other variations are possible; consult the Jenkins documentation for more nuances.
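For example (an illustrative sketch, not part of the supplied sample), a when{} directive could restrict the Promote stage so that it only runs for builds of a particular branch:

```groovy
stage('Promote') {
    when {
        branch 'master'                 // skip promotion entirely on other branches
    }
    steps { /* promotion steps as before */ }
}
```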