...

The parameters section enumerates the parameters to the pipeline along with their default values. The four parameters used in the sample template, and their meanings, are discussed in Deploying a Jenkins Pipeline.
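As an illustrative sketch, a parameters{} block using the four parameter names that appear later on this page might look like the following (the default values here are examples only, mirroring the promotion stage below):

```groovy
pipeline {
    agent any
    parameters {
        // Parameter names mirror those used elsewhere on this page;
        // defaults are illustrative and should be set for your environment
        string(name: 'domainName', defaultValue: 'test1-svcs.datamigrators.io:59445', description: 'DataStage Service Tier')
        string(name: 'serverName', defaultValue: 'TEST1-ENGN.DATAMIGRATORS.IO', description: 'DataStage Engine Tier')
        string(name: 'projectName', defaultValue: 'jenkins', description: 'Logical Project Name')
        string(name: 'environmentId', defaultValue: 'ci', description: 'Environment Identifier')
    }
    // environment{} and stages{} sections follow
}
```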

...

The environment section allows you to derive and set further useful variables. The sample template pipeline composes the suffixed DataStage project name and copies one of the parameters into an all-uppercase variable for convenience later.
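A minimal sketch of that pattern (the same two assignments appear verbatim in the promotion stage later on this page):

```groovy
environment {
    // Compose the suffixed DataStage project name, e.g. "jenkins_ci"
    DATASTAGE_PROJECT = "${params.projectName}_${params.environmentId}"
    // Copy a parameter into an all-uppercase variable for convenient reference later
    ENVIRONMENT_ID = "${params.environmentId}"
}
```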

...

The stages section is a container for an arbitrary number of individual stages, which normally run in sequence. In the sample template pipeline there are three: “Deploy”, “Test”, and “Promote”.
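Structurally (with stage bodies elided), the template's stages section looks like this:

```groovy
stages {
    stage("Deploy")  { /* deploy to the CI project; detailed below */ }
    stage("Test")    { /* parallel static analysis and unit tests; detailed below */ }
    stage("Promote") { /* parallel promotion deployments; detailed below */ }
}
```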

...

Code Block
 stage("Deploy") {

            steps {
                withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {

                    bat label: 'Create DataStage Project', script: "${env.METTLE_SHELL} datastage create-project -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword}"
                    
                    bat label: 'Substitute parameters in DataStage config', script: "${env.METTLE_SHELL} properties config -baseDir datastage -filePattern \"*.sh\" -filePattern \"DSParams\" -filePattern \"Parameter Sets/*/*\" -filePattern \"*.apt\" -properties var.${params.environmentId} -outDir config"

                    bat label: 'Transfer DataStage config and filesystem assets', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -transferPattern \"filesystem/**/*,config/*\" -destination \"${env.DATASTAGE_PROJECT}\""

                    bat label: 'Deploy DataStage config and file system assets', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\deploy.sh\""
                    
                    bat label: 'Deploy DataStage project', script: "${env.METTLE_SHELL} datastage deploy -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -assets datastage -parameter-sets \"config\\Parameter Sets\" -threads 8 -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                }
            }
            post {
                always {
                    junit testResults: 'log/**/mettleci_compilation.xml', allowEmptyResults: true
                    /*
                    withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                        bat label: 'Cleanup temporary files', script: "${env.METTLE_SHELL} remote execute -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\cleanup.sh\""
                    }
                    deleteDir()
                    */
                }
            } 
        }

Test stage and parallel Static Analysis and Unit Tests substages

...

Code Block
stage("Test") {
            parallel {
                stage('Static Analysis') {
                    
                    steps {
                        bat label: 'Perform static analysis', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report.xml\" -junit -rules compliance\\WORKBENCH -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""
                    }
                    post {
                        always {
                            junit testResults: 'compliance_report.xml', allowEmptyResults: true
                            deleteDir() 
                        }
                    }
                }
                stage('Unit Tests') {
                    
                    steps {
                        withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                            bat label: 'Upload unit test specs', script: "${env.METTLE_SHELL} remote upload -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"unittest\" -transferPattern \"**/*\" -destination \"/opt/dm/mci/specs/${env.DATASTAGE_PROJECT}\""

                            bat label: 'Execute unit tests for changed DataStage jobs', script: "${env.METTLE_SHELL} unittest test -domain ${params.domainName} -server ${params.serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -specs unittest -reports test-reports -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                            bat label: 'Retrieve unit test results', script: "${env.METTLE_SHELL} remote download -host ${params.serverName} -username ${datastageUsername} -password ${datastagePassword} -source \"/opt/dm/mci/reports\" -transferPattern \"${env.DATASTAGE_PROJECT}/**/*.xml\" -destination \"test-reports\""
                        }
                    }
                    post {
                        always {
                            junit testResults: 'test-reports/**/*.xml', allowEmptyResults: true
                            deleteDir()
                        }
                    }
                }
            }
        }

Some notes:
The template assumes the compliance rules are in the same repository. If you wish to source the rules from a different repository and, further, to run separate warn/fail sets of tests while presenting the results in one report, the Static Analysis stage would instead look like this (refer to the Jenkins documentation to understand the parameters of the checkout step invocation):

Code Block
stage('Static Analysis') {
                    steps {

                        checkout([
                            $class: 'GitSCM',
                            branches: [[name: 'refs/heads/master']],
                            doGenerateSubmoduleConfigurations: false,
                            extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'compliance2']],
                            submoduleCfg: [],
                            userRemoteConfigs: [[credentialsId: 'lar_at_demo_mettleci_i0', url: 'http://lar@demo.mettleci.io/bitbucket/scm/cr/compliance-rules.git']]
                        ])

                        bat label: 'Perform static analysis for non fatal violations', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report_warn.xml\" -junit -test-suite \"warnings\" -ignore-test-failures  -rules compliance2\\WARN -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                        bat label: 'Perform static analysis for fatal violations', script: "${env.METTLE_SHELL} compliance test -assets datastage -report \"compliance_report_fail.xml\" -junit -test-suite \"failures\" -rules compliance2\\FAIL -project-cache \"C:\\dm\\mci\\cache\\${params.serverName}\\${env.DATASTAGE_PROJECT}\""

                    }
                    post {
                        always {
                            junit testResults: 'compliance_report_*.xml', allowEmptyResults: true
                        }
                    }
                }

Promotion stage with three parallel deployment substages

The template pipeline has three promotion deployments, each of which is essentially the same as the initial deploy but directed at a different project. Our template assumes that all three deployments target the same DataStage server, but the parameter blocks sit at the individual promotion level so you can change this as necessary. You can reduce or increase the number of deployments as needed by deleting or copying code blocks. The promotions can be performed in parallel because there is no interdependence between them, so we again use the parallel{} construct to nest the stage invocations. As before, refer to the MettleCI CLI Command Reference for details of what each step does. The “innards” of each promotion step, save the first, are elided in the following code snippet to focus on the structure.

The input{} construct is used to gather parameters and obtain approval from the user for non-automatic steps. In the template, all three promotions use input{} and therefore prompt for approval before running.

As in the Test stage, we take advantage of the ability to nest stages using the parallel{} construct. The second and third nested stages are similar to the first.

Code Block
 stage("Promote") {
            parallel {
                stage('1. Quality Assurance') {
                    
                    input {
                        message "Should we promote to QA?"
                        ok "Yes"
                        parameters {
                            string(name: 'domainName', defaultValue: 'test1-svcs.datamigrators.io:59445', description: 'DataStage Service Tier')
                            string(name: 'serverName', defaultValue: 'TEST1-ENGN.DATAMIGRATORS.IO', description: 'DataStage Engine Tier')
                            string(name: 'projectName', defaultValue: 'jenkins', description: 'Logical Project Name')
                            string(name: 'environmentId', defaultValue: 'qa', description: 'Environment Identifier')
                        }
                    }
                    environment {
                        DATASTAGE_PROJECT = "${projectName}_${environmentId}"
                        ENVIRONMENT_ID = "${environmentId}"
                    }
                    steps {

                        withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {

                            bat label: 'Create DataStage Project', script: "${env.METTLE_SHELL} datastage create-project -domain ${domainName} -server ${serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword}"

                            bat label: 'Substitute parameters in DataStage config', script: "${env.METTLE_SHELL} properties config -baseDir datastage -filePattern \"*.sh\" -filePattern \"DSParams\" -filePattern \"Parameter Sets/*/*\" -properties var.${environmentId} -outDir config"
                            bat label: 'Transfer DataStage config and filesystem assets', script: "${env.METTLE_SHELL} remote upload -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -transferPattern \"filesystem/**/*,config/*\" -destination \"${env.BUILD_TAG}\""
                            bat label: 'Deploy DataStage config and file system assets', script: "${env.METTLE_SHELL} remote execute -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\deploy.sh\""

                            bat label: 'Deploy DataStage project', script: "${env.METTLE_SHELL} datastage deploy -domain ${domainName} -server ${serverName} -project ${env.DATASTAGE_PROJECT} -username ${datastageUsername} -password ${datastagePassword} -assets datastage -parameter-sets \"config\\Parameter Sets\" -threads 8 -project-cache \"C:\\dm\\mci\\cache\\${serverName}\\${env.DATASTAGE_PROJECT}\""

                        }
                    }
                    post {
                        always {
                            withCredentials([usernamePassword(credentialsId: 'mci-user', passwordVariable: 'datastagePassword', usernameVariable: 'datastageUsername')]) {
                                bat label: 'Cleanup temporary files', script: "${env.METTLE_SHELL} remote execute -host ${serverName} -username ${datastageUsername} -password ${datastagePassword} -script \"config\\cleanup.sh\""
                            }                          
                        }
                    }
                }
                stage('2. Performance') {
                  // details omitted, see first stage
                }  
                stage('3. Production') {
                  // details omitted, see first stage
                } 
            }  // end of parallel{} stage container construct
        }  // end of "promote" stage
    } // end of stages in pipeline
// after all stages have finished do cleanup
	post {      
		cleanup {
			deleteDir()
        }
	}
}
                    

If we had wished to have the first (“testing”) stage run without prompting the user, we would change it to look as follows (steps omitted for brevity):

Code Block
   stage('1. Testing') {
                    // input{} section omitted entirely
                    environment {
                        // Since there is no user to set things, we must set them up ourselves.
                        // Assume the same server and project as set globally,
                        // but the environment ID (project suffix) varies.
                        environmentId = 'qa'
                        // Hard-coded environmentId; DATASTAGE_PROJECT calculated as before
                        DATASTAGE_PROJECT = "${projectName}_${environmentId}"
                    }
                    steps {
                      // omitted for brevity, identical to above                             
                    }
                    post {
                      // omitted for brevity, identical to above 
                    }
                }



Many other variations are possible. Consult your Jenkins experts or the Jenkins documentation for further nuances.