...

The Jenkins DevOps pipeline is defined in a “Jenkinsfile” (by default, a file named Jenkinsfile) and has the following structure (most detail omitted). Paired curly braces { } denote the start and end of a scoped section.

Code Block
pipeline {                    /* outermost scope                    */
    agent { ... }             /* where will this run?               */
    parameters { ... }        /* what is it being fed?              */
    environment { ... }       /* what other variables are needed?   */
    stages {                  /* operational container              */
        stage("...") { ... }  /* first sequence of steps            */
        stage("...") { ... }  /* next                               */
        ...
    }                         /* end of stages                      */
}

To understand these sections in more detail, refer to the Jenkins documentation; here we present only the MettleCI-specific things you need to know. Note that Jenkins offers flexibility in where some of these sections are placed: certain sections can appear at either the pipeline scope or within an individual stage, some can be repeated at a given scope level, and most accept additional directives beyond those shown here.
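
For example (this sketch is for illustration only and is not part of the sample pipeline; the agent label shown is a placeholder), an environment block can be declared at both the pipeline scope and within an individual stage, and the agent can be overridden for a single stage:

Code Block
pipeline {
    agent any                          /* default agent for every stage                */
    environment {
        GLOBAL_VAR = 'visible to all stages'
    }
    stages {
        stage('Example') {
            agent { label 'windows' }  /* placeholder label; overrides pipeline agent  */
            environment {
                STAGE_VAR = 'visible to this stage only'
            }
            steps {
                echo "${env.GLOBAL_VAR} / ${env.STAGE_VAR}"
            }
        }
    }
}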

...

The stages section is a container for an arbitrary number of individual stages, which are normally run in sequence. In the sample pipeline there are three: “Deploy”, “Test”, and “Promote”.
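
Stripped of its contents, the stages section of the sample pipeline therefore has the following shape (stage bodies elided):

Code Block
stages {
    stage("Deploy")  { ... }     /* deploy changed assets to the CI project        */
    stage("Test")    { ... }     /* parallel Static Analysis and Unit Tests        */
    stage("Promote") { ... }     /* parallel promotions to the downstream projects */
}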

Stage

...

section notes

Each stage section begins with a label: an arbitrary quoted text string that is displayed on the pipeline diagram when viewing the pipeline in Blue Ocean.

...

Use of the parallel { } construct allows nesting of stages within stages.
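
As an illustration only (the stage names here are placeholders), a parallel block nests stages, each with its own label, inside an enclosing stage like this:

Code Block
stage("Enclosing stage") {
    parallel {                           /* run the nested stages concurrently */
        stage("First activity")  { ... }
        stage("Second activity") { ... }
    }
}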

Deploy

...

stage

This is the first stage in the pipeline and is tasked with deploying changed project assets to the CI (continuous integration) project. withCredentials provides appropriate credentials (using the credentials plugin) to the rest of the steps; change mci-user if you are using a different user. Each bat step invokes the MettleCI CLI to perform one of the deployment tasks; refer to the MettleCI CLI Command Reference for details of what each step is doing. Each step's label, chosen to be descriptive of the step's task, becomes the step text in the Blue Ocean display of the pipeline. If a step aborts, that ends the steps part of the stage. After the steps conclude, whether by error or by reaching the end, the post section is evaluated. Although there are other post conditions such as changed, fixed, and so on, using always ensures the section is executed in every case to process test results (here, the results of the compilations) and then clean up.

Code Block
 stage("Deploy") {

            steps {
                withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {

                    bat label: 'Create DataStage Project', script: "${env.METTLE_SHELL} datastage create-project -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword}"
                    
                    bat label: 'Substitute parameters in DataStage config', script: "${env.METTLE_SHELL} properties config -baseDir datastage -filePattern \"*.sh\" -filePattern \"DSParams\" -filePattern \"Parameter Sets/*/*\" -filePattern \"*.apt\" -properties var.${params.environmentId} -outDir config"

                    bat label: 'Transfer DataStage config and filesystem assets', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -transferPattern \"filesystem/**/*,config/*\" -destination \"${env.DATASTAGE_PROJECT}\""

                    bat label: 'Deploy DataStage config and file system assets', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\deploy.sh\""
                    
                    bat label: 'Deploy DataStage project', script: "${env.METTLE_SHELL} datastage deploy -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -assets datastage -parameter-sets \"config\\Parameter Sets\" -threads 8 -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                }
            }
            post {
                always {
                    junit testResults: 'log/**/mettleci_compilation.xml', allowEmptyResults: true
 /*                   withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                        bat label: 'Cleanup temporary files', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\cleanup.sh\""
                    }
                    deleteDir()  */
                }
            } 
        }

Test stage and parallel Static Analysis and Unit Tests substages

After a successful deployment the pipeline performs two sets of tests: compliance (static analysis) and unit tests. These can be performed in parallel as there is no interdependence between them, so we use the parallel { } construct to nest the stage invocations. As before, refer to the MettleCI CLI Command Reference for details of what each step is doing.

Compliance testing is relatively simple: a single step runs the rules (no cleanup is needed), followed by collection of the results. Unit testing is a little more involved: the test specs are uploaded, the jobs are executed in test mode, and the reports are downloaded, again followed by collection of the results.

Code Block
stage("Test") {
            parallel {
                stage('Static Analysis') {
                    
                    steps {
                        bat label: 'Perform static analysis', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report.xml\" -junit -rules compliance\\WORKBENCH -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
                    }
                    post {
                        always {
                            junit testResults: 'compliance_report.xml', allowEmptyResults: true
                            deleteDir() 
                        }
                    }
                }
                stage('Unit Tests') {
                    
                    steps {
                        withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                            bat label: 'Upload unit test specs', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"unittest\" -transferPattern \"**/*\" -destination \"/opt/dm/mci/specs/${env.DATASTAGE_PROJECT}\""

                            bat label: 'Execute unit tests for changed DataStage jobs', script: "${env.METTLE_SHELL} unittest test -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -specs unittest -reports test-reports -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                            bat label: 'Retrieve unit test results', script: "${env.METTLE_SHELL} remote download -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"/opt/dm/mci/reports\" -transferPattern \"${env.DATASTAGE_PROJECT}/**/*.xml\" -destination \"test-reports\""
                        }
                    }
                    post {
                        always {
                            junit testResults: 'test-reports/**/*.xml', allowEmptyResults: true
                            deleteDir() 
                        }
                    }
                }
            }
        }

Promotion stage with three parallel deployment substages

The sample pipeline has three promotion deployments, each of which is essentially the same as the initial Deploy stage but directed at a different project. Our sample assumes that all three deployments target the same DataStage server, but the parameter blocks are defined at the individual promotion level so you can change them as necessary, and you can reduce or increase the number of promotions by deleting or copying code blocks. The promotions can be performed in parallel as there is no interdependence, so we again use the parallel { } construct to nest the stage invocations. As before, refer to the MettleCI CLI Command Reference for details of what each step is doing. The “innards” of each promotion stage are elided in the following code snippet to focus on the structure. The input { } construct is used to gather parameters and obtain approval from the user for the promotions that are not automatic.

<fix - talk about how we achieve one running without prompting once code available>

Code goes here
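
As a structural sketch only (the stage names, approval message, and parameter below are illustrative placeholders, not taken from the sample pipeline), a promotion stage combining the parallel and input constructs might look like this:

Code Block
stage("Promote") {
    parallel {
        stage("Promote to Target 1") {          /* hypothetical target environment   */
            input {                             /* pause and wait for user approval  */
                message "Promote to Target 1?"
                parameters {
                    string(name: 'targetProject', defaultValue: 'TARGET1_PROJECT', description: 'DataStage project to promote to')
                }
            }
            steps {
                /* same deployment steps as the Deploy stage, directed at the chosen project */
                ...
            }
        }
        stage("Promote to Target 2") { ... }    /* structure identical to the above  */
        stage("Promote to Target 3") { ... }
    }
}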