Jenkins Pipeline Syntax
External
- https://jenkins.io/doc/book/pipeline/syntax/
- https://jenkins.io/doc/pipeline/steps/
- https://jenkins.io/doc/pipeline/steps/core/
Internal
Scripted Pipeline
Scripted Pipeline is the classical way of declaring a Jenkins Pipeline, preceding Declarative Pipeline. Unlike Declarative Pipeline, Scripted Pipeline is a general-purpose DSL built with Groovy. The pipelines are declared in Jenkinsfiles and executed from the top of the Jenkinsfile downwards, like most traditional Groovy scripts. Groovy syntax is available directly in the Scripted Pipeline declaration. Flow control can be declared with if/else conditionals or via Groovy's exception handling support with try/catch/finally.
The simplest pipeline declaration:
echo 'pipeline started'
A more complex one:
node('some-worker-label') {
    echo 'Pipeline logic starts'
    stage('Build') {
        if (env.BRANCH_NAME == 'master') {
            echo 'this is only executed on master'
        }
        else {
            echo 'this is executed elsewhere'
        }
    }
    stage('Test') {
        // ...
    }
    stage('Deploy') {
        // ...
    }
    stage('Example') {
        try {
            sh 'exit 1'
        }
        catch(ex) {
            echo 'something failed'
            throw ex
        }
    }
}
The basic building block of the Scripted Pipeline syntax is the step. The Scripted Pipeline does not introduce any steps that are specific to its syntax. The generic pipeline steps, such as node, stage, parallel, etc. are available here: Pipeline Steps.
Scripted Pipeline at Runtime
When the Jenkins server starts to execute the pipeline, it pulls the Jenkinsfile either from a repository, following a checkout sequence similar to the one shown here, or from the pipeline configuration, if it is specified in-line. Jenkins then instantiates a WorkflowScript (org.jenkinsci.plugins.workflow.cps.CpsScript.java) instance. The "script" instance can be used to access the following state elements:
- pipeline parameters, with this.params, which is a Map.
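A minimal sketch of reading a pipeline parameter from the script instance; the parameter name is taken from the echo example further below and is only an illustration:
def memoryLimit = this.params['POD_MEMORY_LIMIT_Gi'] // assumes such a parameter is defined on the job
echo "pod memory limit: ${memoryLimit}"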
Declarative Pipeline
Declarative Pipeline is a newer way of declaring Jenkins pipelines, with a simpler and more opinionated syntax. Declarative Pipeline is an alternative to Scripted Pipeline.
pipeline {
    agent any
    options {
        skipStagesAfterUnstable()
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
        stage('Test') {
            steps {
                sh 'make check'
                junit 'reports/**/*.xml'
            }
        }
        stage('Deploy') {
            steps {
                sh 'make publish'
            }
        }
    }
}
Declarative Pipeline Directives
environment
See: Environment Variables
parameters
See: Parameters
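A minimal sketch showing both directives in a Declarative Pipeline; the variable and parameter names are made up for illustration:
pipeline {
    agent any
    environment {
        // environment variable visible to all stages; name is hypothetical
        LOG_LEVEL = 'info'
    }
    parameters {
        // build parameter with a default value; name is hypothetical
        string(name: 'TARGET_ENV', defaultValue: 'staging', description: 'Deployment target')
    }
    stages {
        stage('Report') {
            steps {
                echo "log level: ${env.LOG_LEVEL}, target: ${params.TARGET_ENV}"
            }
        }
    }
}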
Pipeline Steps
node
Allocates an executor on a node, typically a worker, and runs the enclosed code in the context of the workspace of that worker. node may take a label name, a computer name, or a label expression. The labels are declared on workers when they are defined in the master configuration, in their respective "clouds".
String NODE_LABEL = 'infra-worker'
node(NODE_LABEL) {
    sh 'uname -a'
}
stage
Creates a labeled block.
parallel
Takes a map from branch names to closures and an optional argument failFast, and executes the closure code in parallel.
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false

stage("tests") {
    parallel(
        "unit tests": {
            // run unit tests
        },
        "coverage tests": {
            // run coverage tests
        }
    )
}
Allocation to different nodes can be performed inside the closure:
def tasks = [:]
tasks["branch-1"] = {
    stage("task-1") {
        node('node_1') {
            sh 'echo $NODE_NAME'
        }
    }
}
tasks["branch-2"] = {
    stage("task-2") {
        node('node_2') {
            sh 'echo $NODE_NAME'
        }
    }
}
parallel tasks
sh
Shell Script. It can be specified in-line or it can refer to a file available on the filesystem exposed to the Jenkins node. It needs to be enclosed by a node to work.
The metacharacter $ must be escaped (\${LOGDIR}), unless it refers to a variable from the Groovy context.
Example:
stage.sh """
LOGDIR=${fileName}-logs
mkdir -p \${LOGDIR}/something
""".stripIndent()
Both """..."""
and '''...'''
Groovy constructs can be used. For more details on enclosing representing multi-line strings with """
or '''
, see
Script return status. By default, a script that exits with a non-zero return code will cause the step to fail with an exception. To prevent that, set returnStatus to true, and the step will return the exit value of the script instead of failing on a non-zero exit value. You may then compare the returned value with zero.
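A minimal sketch of using returnStatus; the script path is hypothetical:
def status = sh(returnStatus: true, script: './bin/do-something')
if (status != 0) {
    echo "script failed with exit code ${status}"
}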
Script stdout. By default, the standard output of the script is sent to the log. If returnStdout is set to true, the script's standard output is returned as a String as the step value. Call trim() to strip off the trailing newline. The script's stderr is always sent to the log.
String result = sh(returnStdout: true, script: './bin/do-something').trim()
ws
Allocates a workspace, optionally at a given path, and runs the enclosed block in it.
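A minimal sketch, assuming the enclosing node and the workspace directory name shown are placeholders:
node {
    ws('extra-workspace') {
        // commands in this block run with 'extra-workspace' as the workspace
        sh 'pwd'
    }
}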
build
This is how a main pipeline launches a subordinate (downstream) pipeline.
How the downstream build result may be returned to the caller is discussed here: https://support.cloudbees.com/hc/en-us/articles/218554077-How-to-set-current-build-result-in-Pipeline
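A minimal sketch of launching a downstream pipeline and reading its result; the job name and parameter are hypothetical:
def result = build(
    job: 'downstream-job',
    parameters: [string(name: 'TARGET_ENV', value: 'staging')],
    wait: true,
    propagate: false
)
echo "downstream build finished with result ${result.result}"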
junit
Jenkins understands the JUnit test report XML format (which is also used by TestNG). To use this feature, set up the build to run tests, which will generate their test reports into a local agent directory, then specify the path to the test reports in Ant glob syntax to the JUnit plugin pipeline step junit:
stage.junit '**/target/*-report/TEST-*.xml'
Jenkins uses this step to ingest the test results, process them and provide historical test result trends, a web UI for viewing test reports, tracking failures, etc.
Basic Steps
These basic steps are invoked by qualifying them with stage. In a Jenkinsfile, and inside a stage, invoke them on this, or simply invoke them directly, without qualifying.
dir
Change current directory.
pwd
Return the current directory path as a string.
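A minimal sketch combining dir and pwd; the subdirectory name is hypothetical:
node {
    dir('subproject') {
        // the closure body executes with 'subproject' as the current directory
        echo "current directory: ${pwd()}"
    }
}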
echo
echo "pod memory limit: ${params.POD_MEMORY_LIMIT_Gi}"
echo """
Run Configuration:
something: ${SOMETHING}
something else: ${SOMETHING_ELSE}
"""
error
This step signals an error and fails the pipeline. Alternatively, you can simply:
throw new Exception("some message")
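A minimal usage of the error step itself; the message is arbitrary:
error 'the deployment prerequisites were not met'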
readFile
Read a file from the workspace.
def versionFile = readFile("${stage.WORKSPACE}/terraform/my-module/VERSION")
stash
input
In its basic form, renders a "Proceed"/"Abort" input box with a custom message. Selecting "Proceed" passes control to the next step in the pipeline. Selecting "Abort" throws an org.jenkinsci.plugins.workflow.steps.FlowInterruptedException, which produces "gray" pipelines.
input(
    id: 'Proceed1',
    message: 'If the manual test is successful, select \'Proceed\'. Otherwise, you can abort the pipeline.'
)
timeout
Upon timeout, an org.jenkinsci.plugins.workflow.steps.FlowInterruptedException is thrown from the closure that is being executed, and not from the timeout() invocation. The code shown below prints "A", "B", "D":
timeout(time: 5, unit: 'SECONDS') {
    echo "A"
    try {
        echo "B"
        doSomething() // this step takes a very long time and will time out
        echo "C"
    }
    catch(org.jenkinsci.plugins.workflow.steps.FlowInterruptedException e) {
        // if this exception propagates up without being caught, the pipeline gets aborted
        echo "D"
    }
}
writeFile
withEnv
Core
archiveArtifacts
Archives the build artifacts (for example, distribution zip files or jar files) so that they can be downloaded later. Archived files will be accessible from the Jenkins webpage. Normally, Jenkins keeps artifacts for a build as long as a build log itself is kept. Note that the Maven job type automatically archives any produced Maven artifacts. Any artifacts configured here will be archived on top of that. Automatic artifact archiving can be disabled under the advanced Maven options.
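A minimal sketch, assuming the artifacts live under build/libs:
archiveArtifacts artifacts: 'build/libs/*.jar', fingerprint: true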
fingerprint
Obtaining the Current Pipeline Build Number
def buildNumber = currentBuild.rawBuild.getNumber()
FlowInterruptedException
import hudson.model.Result
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

throw new FlowInterruptedException(Result.ABORTED)
Obtaining the artifacts of the last successful build of a branch in a multibranch project:
import hudson.model.Run
import jenkins.model.Jenkins
import org.jenkinsci.plugins.workflow.job.WorkflowJob
import org.jenkinsci.plugins.workflow.job.WorkflowRun
import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject

String branch = "..."
String projectName = JOB_NAME.substring(0, JOB_NAME.size() - JOB_BASE_NAME.size() - 1)
WorkflowMultiBranchProject project = Jenkins.instance.getItemByFullName("${projectName}")
if (project == null) {
    ...
}
WorkflowJob job = project.getBranch(branch)
if (job == null) {
    ...
}
WorkflowRun run = job.getLastSuccessfulBuild()
if (run == null) {
    ...
}
List<Run.Artifact> artifacts = run.getArtifacts()
...
Passing an Environment Variable from Downstream Build to Upstream Build
Upstream build:
...
def result = build(job: jobName, parameters: params, quietPeriod: 0, propagate: true, wait: true);
result.getBuildVariables()["SOME_VAR"]
...
Downstream build:
env.SOME_VAR = "something"
@NonCPS
Build Summary
//
// write /tmp/summary-section-1.html
//
def summarySection1 = util.catFile('/tmp/summary-section-1.html')
if (summarySection1) {
    def summary1 = manager.createSummary('document.png')
    summary1.appendText(summarySection1, false)
}

//
// write /tmp/summary-section-2.html
//
def summarySection2 = util.catFile('/tmp/summary-section-2.html')
if (summarySection2) {
    def summary2 = manager.createSummary('document.png')
    summary2.appendText(summarySection2, false)
}