Background: Our company's system is distributed. As the business grows, the number of services expands quickly, which multiplies the release workload. We therefore turned to automated deployment, and Jenkins fits our requirements for automated deployment in a distributed environment.
1. Jenkins
What is Jenkins?
Jenkins is an open source CI/CD tool for automating a variety of tasks, including building, testing, and deploying software.
Jenkins can be installed in a variety of ways: through system packages, through Docker, or as a standalone Java program.
What is CI/CD?
Continuous integration
The goal of modern application development is to have multiple developers working on different features of the same application simultaneously. However, if an enterprise schedules the merging of all branch source code in one day (called “merge day”), it can end up being cumbersome, time-consuming, and manual. This is because when a single developer working independently makes changes to an application, it is possible to conflict with changes made at the same time by other developers. If each developer customizes his or her own local integrated development environment (IDE), rather than having the team agree on a cloud-based IDE, the problem is exacerbated.
Continuous integration (CI) helps developers merge code changes into a shared branch or “trunk” more frequently (sometimes even daily). Once the changes made by developers to the application are merged, the system verifies the changes by automatically building the application and running different levels of automated testing (usually unit and integration tests) to ensure that the changes have not broken the application. This means that the test covers everything from classes and functions to the different modules that make up the entire application. If automated tests find conflicts between new code and existing code, CI can make it easier to quickly fix those errors.
Continuous delivery
Continuous delivery automatically publishes validated code to a repository after the automated build, unit test, and integration test stages of CI complete. To make continuous delivery effective, it is important that CI is already built into the development pipeline. The goal of continuous delivery is to have a codebase that is always ready to deploy to production.
In continuous delivery, each phase, from the consolidation of code changes to the delivery of production-ready builds, involves test automation and code release automation. At the end of the process, the operations team can quickly and easily deploy the application into production.
Continuous deployment
The final stage for a mature CI/CD pipeline is continuous deployment. As an extension of continuous delivery, which automatically releases production-ready builds to code repositories, continuous deployment can automatically release applications to production. Since there is no manual gating during the pipeline phase prior to production, continuous deployment relies heavily on well-designed test automation.
In practice, continuous deployment means that a developer’s changes to an application can take effect within minutes of being written (assuming it passes automated testing). This makes it easier to continuously receive and integrate user feedback. In sum, all of these CI/CD related steps help reduce the deployment risk of the application, making it easier to release changes to the application in small pieces rather than all at once. However, because automated tests need to be written to accommodate various testing and release phases in the CI/CD pipeline, the upfront investment is still significant.
What is Jenkins Pipeline?
Jenkins Pipeline (or simply "Pipeline") is a suite of plug-ins that supports implementing and integrating continuous delivery pipelines into Jenkins. A continuous delivery pipeline is an automated expression of the process of continuously delivering versioned software to your users and consumers. Jenkins Pipeline provides an extensible set of tools for modeling delivery processes, from simple to complex, as "continuous delivery as code". A Jenkins Pipeline definition is typically written into a text file (called a Jenkinsfile) that can be committed to the project's source control repository.
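As a minimal sketch (the project layout, stage names, and Maven commands here are illustrative assumptions, not taken from this article), a declarative Jenkinsfile looks like this:

```groovy
// Minimal declarative Jenkinsfile sketch for a hypothetical Maven project
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // package without running tests; tests get their own stage
                sh 'mvn -B clean package -DskipTests'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B test'
            }
        }
    }
}
```

Committing this file to the repository root lets Jenkins pick it up when the pipeline's SCM configuration points at the project.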
2. Install Jenkins
Installing Jenkins is very simple: download the Jenkins WAR package from the official website, pick a server to host Jenkins, install Tomcat on it in advance, copy the WAR package into Tomcat's webapps folder, and start Tomcat. Jenkins is now available, for example at: localhost:8080/jenkins/
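The steps above might look like the following on the server (the Tomcat path and the download URL are assumptions; adjust them for your Tomcat location and Jenkins version):

```shell
# Sketch of the WAR-in-Tomcat installation; /opt/tomcat is an assumed path
wget -O jenkins.war https://get.jenkins.io/war-stable/latest/jenkins.war
cp jenkins.war /opt/tomcat/webapps/
/opt/tomcat/bin/startup.sh
# Jenkins should now answer at http://localhost:8080/jenkins/
```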
Installing Jenkins with Docker is not covered here.
Global configuration
Open the system configuration page and configure the Jenkins workspace directory, as shown in the figure below.
3. Jenkins Pipeline
The two steps above gave us a basic understanding of Jenkins and Jenkins Pipeline, and Jenkins is now installed on the server. So how do we deploy a project automatically with Jenkins? Here we demonstrate the Jenkins Pipeline approach; other methods are not covered.
1. Create a pipeline
First click "New Item" on the left, and you will see the interface below. The task name is mandatory; choose "Pipeline" as the type.
2. Configure the pipeline
After the pipeline is created, it needs to be configured. The main configuration steps are as follows:
Step 1: Select pipeline type.
Jenkinsfiles can be written in two syntaxes: declarative and scripted. The two are fundamentally different. Declarative pipeline is a newer feature of Jenkins Pipeline: it provides richer syntax than scripted pipeline and is designed to make pipeline code easier to write and read.
Step 2: Software Configuration Management (SCM), select Git
Software configuration management (SCM) is the practice of applying version control, change control procedures, and appropriate configuration management tooling to ensure the integrity and traceability of all configuration items. Configuration management effectively protects work products.
Step 3: Enter the address of the hosted repository where the Jenkinsfile resides
Jenkinsfile is a text file that contains Jenkins Pipeline definitions and is checked into source control.
Step 4: Add credentials
Why we need credentials: our Jenkinsfile is hosted on Alibaba Cloud, so when we point the file source at that hosting address, access control is required. Credentials can be generated and then added in the system, as shown in the figure below. Select the credential type; SSH private key is chosen here. Concretely: generate a public/private key pair on any machine, copy the public key to the Alibaba Cloud code hosting platform, paste the private key into the text box below, and click OK. The credential is now added, and we can select it.
Step 5: Specify the script path
Refer to the following figure for the whole process:
4. Jenkinsfile
Now that we've created a pipeline task and the basic configuration is almost complete, it's time to explain how to write the Jenkinsfile scripts.
Following the steps above, we created a second pipeline task: one serves as the build task and the other as the deployment task. Next we'll look at how the build script and the deployment script are written.
Before we start writing the scripts, let's take a look at the Jenkins shared library.
What is a shared library?
A shared library is Jenkins' mechanism for reusing pipeline code. Why does the concept exist? After writing many Jenkinsfiles, you find a lot of repeated script code: project build steps, deployment commands, and validation logic appear in almost every project's script. So Jenkins introduced shared libraries: extract the common parts, manage the common script code in a new hosted project, and call the shared library from each project's script file.
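As a sketch of the idea (the repository layout follows Jenkins' shared-library convention; the library and repository names are illustrative, not the author's actual code), common logic lives in a `vars/` directory of the library repository, and each project's Jenkinsfile just calls it:

```groovy
// vars/build.groovy in the shared-library repository (illustrative)
def call(Map map) {
    echo "Building ${map.REPO_URL}, branch ${map.BRANCH_NAME}"
    // ...common checkout/compile stages shared by every project...
}
```

```groovy
// A project Jenkinsfile consuming the library; 'my-shared-library' stands
// for whatever name was registered in Jenkins' global configuration
@Library('my-shared-library') _
build(REPO_URL: 'git@example.com:demo.git', BRANCH_NAME: 'QA')
```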
How to create a library?
Find the shared library configuration in Jenkins Global configuration, as shown below:
The first step is to specify the hosting address for the shared library project;
The second step is to add credentials (the credential logic was explained above and is not repeated here). Because the shared library scripts are hosted on a code hosting platform, credentials are needed to access them.
How to write a build script
Open the Jenkinsfile specified for the build task above, as shown below:
Step 1: Specify the library name
Step 2: Specify the address of the project to build, such as demo project
Step 3: Specify the branch name to build, such as QA
Step 4: Specify the location of the pom.xml module to build, such as demo-parent
Step 5: Specify the credentials, which were described in detail above. In this case they are the credentials needed to pull the demo project. If the demo project and the Jenkinsfile are hosted in the same place, such as Alibaba Cloud, the credentials above can be reused directly.
Step 6: The build step here calls the `build` method defined in the shared library, which is dedicated to building, and passes it a map parameter. Below is how the build script is written; its content is customized.
```groovy
def call(map) {
    pipeline {
        agent any
        parameters {
            string(name: 'BRANCH_NAME', defaultValue: "${map.BRANCH_NAME}", description: 'Branch name to build')
            string(name: 'POM_FOLDER', defaultValue: "${map.POM_FOLDER}", description: 'Folder containing pom.xml')
        }
        tools {
            maven 'maven 3'
        }
        // Declare global variables
        environment {
            REPO_URL = "${map.REPO_URL}"
            BRANCH_NAME = "${BRANCH_NAME}"
        }
        stages {
            stage('Get code') {
                steps {
                    echo "======================== start pulling code ========================"
                    git branch: "${BRANCH_NAME}", url: "${REPO_URL}", credentialsId: "${map.PULL_KEY}"
                    echo "Git repository address: ${REPO_URL}"
                    echo "Branch name: ${BRANCH_NAME}"
                    echo "======================== finished pulling code ========================"
                }
            }
            stage('Compile code') {
                steps {
                    dir("${map.POM_FOLDER}") {
                        echo "======================== start compiling ========================"
                        sh 'mvn clean install -Dmaven.test.skip=true'
                        echo "======================== finished compiling ========================"
                    }
                }
            }
        }
    }
}
```
At this point our build task is complete. We can go to the page and click the "Build Now" button to test it; if it succeeds, we will find that Jenkins has packaged the artifact for us automatically.
How to write deployment scripts
Now that the WAR package builds successfully, how do we deploy it to a remote server? How do we write that script?
Again, open the Jenkinsfile specified above for the deployment task, as shown below:
As with the build script above, custom parameters are passed in; the difference is that this time we call the `deploy` method in the shared library, which performs the project deployment.
The deploy script is the core this time.
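As a sketch only (the article does not show the real `deploy` implementation; the method, parameter, and credential names here are illustrative assumptions), such a step might stop the remote Tomcat, copy the WAR over, and start Tomcat again over SSH:

```groovy
// vars/deploy.groovy - illustrative sketch, not the author's actual script
def call(Map map) {
    pipeline {
        agent any
        stages {
            stage('Deploy WAR') {
                steps {
                    sshagent(credentials: ["${map.DEPLOY_KEY}"]) {
                        sh """
                            ssh -p ${map.PORT} ${map.USER}@${map.HOST} '${map.TOMCAT_PATH}/bin/shutdown.sh || true'
                            scp -P ${map.PORT} ${map.WAR_FILE} ${map.USER}@${map.HOST}:${map.TOMCAT_PATH}/webapps/
                            ssh -p ${map.PORT} ${map.USER}@${map.HOST} '${map.TOMCAT_PATH}/bin/startup.sh'
                        """
                    }
                }
            }
        }
    }
}
```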
Thought 1: How do we kill the Tomcat process on one server from another server?
Suppose Jenkins is installed on server ServerA and the project to deploy lives on server ServerB. Manually, we would log in to ServerB and run tomcat/bin/shutdown.sh to kill the Tomcat process; so how do we do that remotely? In a distributed system our projects are deployed on many different servers, and the ideal state is that ServerA can operate the other servers remotely and control their Tomcat life cycles. SSH is the first thing that comes to mind, so we tried to configure passwordless login.
The general method is to append the public key generated for user adminUser on ServerA to the authorized_keys file of user adminUser on ServerB. ServerA can then log in to ServerB successfully with `ssh -p port adminuser@ip`.
`ssh -p port adminuser@ip ${SERVER_B_TOMCAT_PATH}/bin/shutdown.sh` should therefore kill Tomcat remotely, so we boldly copied this command into the Jenkins Pipeline script, which reads as follows:
```groovy
stage('Run shutdown.sh') {
    steps {
        sh 'ssh -p port adminuser@ip ${SERVER_B_TOMCAT_PATH}/bin/shutdown.sh'
    }
}
```
Error: Host key verification failed.
Thought 2: the two servers communicate successfully, so why does it still fail in Jenkins?
By my reasoning it should have worked, yet it still failed, seemingly for lack of permission. What was the reason?
After some research, we found that all SSH operations in Jenkins go through credentials, so we generated a credential again the same way as before. Click OK to create credential X1, then use the SSH Agent plug-in provided by Jenkins. I pasted credential X1 in the officially documented way as follows, but it failed:
```groovy
stage('Run shutdown.sh') {
    steps {
        sshagent(['X1']) {
            sh 'ssh -p port adminuser@ip ${SERVER_B_TOMCAT_PATH}/bin/shutdown.sh'
        }
    }
}
```
So we followed a suggestion found online:
```groovy
stage('Run shutdown.sh') {
    steps {
        sshagent(credentials: ['X1']) {
            sh 'ssh -p port adminuser@ip ${SERVER_B_TOMCAT_PATH}/bin/shutdown.sh'
        }
    }
}
```
Convinced this would surely work, we ran it and got the same error as before:
Host key verification failed.
Want to cry!!!! In fact, Jenkins runs as the jenkins user, while we had only been operating as the adminUser user on ServerA. So we entered the jenkins user's home folder, found the .ssh folder, and generated a credential X2 the same way, this time filling in the jenkins user's private key, then tried SSH Agent again:
```groovy
stage('Run shutdown.sh') {
    steps {
        sshagent(credentials: ['X2']) {
            sh 'ssh -p port adminuser@ip ${SERVER_B_TOMCAT_PATH}/bin/shutdown.sh'
        }
    }
}
```
“Host key verification failed.”
So frustrating!!
Thought 3: Am I going in the wrong direction?
Finally, with a colleague's help, I found the problem. The root cause was that I did not understand how SSH works deeply enough, so when host key verification failed I did not realize the failure was caused by SSH itself.
There are two ways to handle host key verification when connecting to a machine over SSH. One is to set StrictHostKeyChecking to no in the SSH client configuration. The other is to save the remote server's public key signature into the known_hosts file of the local user who needs to log in remotely.
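Both options can be sketched as follows (the host name and user are placeholders); the second, registering the remote host key for the user that actually runs Jenkins, is the safer choice:

```shell
# Option 1: relax host key checking for this host only
# (in the ~/.ssh/config of the user running Jenkins; weakens MITM protection)
#   Host serverb.example.com
#       StrictHostKeyChecking no

# Option 2 (preferred): record ServerB's host key for the jenkins user
sudo -u jenkins sh -c 'ssh-keyscan serverb.example.com >> ~/.ssh/known_hosts'
```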
What does that mean?
Here’s a closer look at SSH:
SSH provides two login methods: password login and public key login.
(Source)
Password login
If this is the first time you log in to the remote server, the following message appears:
```
$ ssh user@host
The authenticity of host 'host (12.18.429.21)' can't be established.
RSA key fingerprint is 98:2e:d7:e0:de:9f:ac:67:28:c2:42:6d:66:55:4e:80.
Are you sure you want to continue connecting (yes/no)?
```
It is asking whether you want to continue connecting.
The so-called "public key fingerprint" exists because the public key is long (RSA is used here, with a length of 1024 bits), making direct comparison difficult. So the key is hashed with MD5 into a 128-bit fingerprint, such as 98:2e:d7:e0:de:9f:ac:67:28:c2:42:6d:66:55:4e:80 above, which is much easier to compare.
A natural question: how does the user know what the remote host's public key fingerprint should be? There is no good answer; the remote host has to publish its public key fingerprint on its website so users can verify it themselves.
Suppose, after weighing the risks, the user decides to accept the public key of the remote host.
```
Are you sure you want to continue connecting (yes/no)? yes
```
The system then displays a message indicating that the host has been added:
```
Warning: Permanently added 'host,12.18.429.21' (RSA) to the list of known hosts.
```
You are then asked for a password.
```
Password: (enter password)
```
If the password is correct, you can log in.
Public key login
Logging in with a password means typing it every time, which is troublesome. Fortunately, SSH also provides public key login, which eliminates the need to enter a password.
The principle of "public key login" is simple. The user stores their public key on the remote host. At login, the remote host sends the user a random string, which the user encrypts with their private key and sends back. The remote host decrypts it with the stored public key; if this succeeds, the user is proven trusted and is allowed to log in without a password.
This method requires that users must provide their own public keys. If you don’t have one, you can just use ssh-keygen to generate one:
```
$ ssh-keygen
```
After running the above command, a series of prompts appears; you can just press Enter through them. One question asks whether to set a passphrase on the private key; set one here if you are concerned about the key's security.
After the command finishes, two new files are generated in the $HOME/.ssh/ directory: id_rsa.pub and id_rsa. The former is your public key, the latter your private key.
To send the public key to the remote host, enter the following command:
```
$ ssh-copy-id user@host
```
From now on, you can log in without entering a password.
When the remote host's public key is accepted, it is saved to the file $HOME/.ssh/known_hosts. The next time you connect to that host, the system recognizes that its public key is already stored locally and skips the warning, prompting only for the password. Every SSH user has their own known_hosts file. So I looked at the jenkins user's known_hosts file and found that it contained no public key signature for ServerB. I copied ServerB's public key signature over from ServerA's adminUser user and tried again. The principle is illustrated in the figure below:
Successful screenshot:
At this point our Jenkins Pipeline journey reached a successful conclusion; the process was bumpy, but the outcome was good. The following section introduces the basic pipeline syntax, which is covered in detail on the official website; only part of it is shown here.
5. Pipeline syntax
stages
Consisting of one or more stage directives, the stages section is where most of the "work" described in a pipeline happens. It is recommended that stages contain at least one stage directive for each discrete part of the continuous delivery process, such as build, test, and deploy.
Required | Yes |
---|---|
Parameters | None |
Allowed | Only once, inside the pipeline block. |
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
steps
The steps section, nested inside a given stage directive, defines a sequence of one or more steps to execute.
Required | Yes |
---|---|
Parameters | None |
Allowed | Inside each stage block. |
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
Directives
environment
The environment directive specifies a sequence of key-value pairs that will be defined as environment variables for all steps, or for stage-specific steps, depending on where the directive is located within the pipeline. The directive supports a special helper method, credentials(), which can be used to access predefined credentials by their identifier in the Jenkins environment. For credentials of type "Secret Text", credentials() ensures that the specified environment variable contains the secret text content. For credentials of type "Standard username and password", the specified environment variable is set to username:password, and two additional environment variables are automatically defined: MYVARNAME_USR and MYVARNAME_PSW.
Required | No |
---|---|
Parameters | None |
Allowed | Inside the pipeline block, or within stage directives. |
Example

```groovy
pipeline {
    agent any
    environment {
        CC = 'clang'
    }
    stages {
        stage('Example') {
            environment {
                AN_ACCESS_KEY = credentials('my-prefined-secret-text')
            }
            steps {
                sh 'printenv'
            }
        }
    }
}
```
An environment directive used in the top-level pipeline block applies to all steps in the pipeline. A directive defined within a stage applies the given environment variables only to steps within that stage. The environment block supports the credentials() helper method, which can be used to access predefined credentials by identifier in the Jenkins environment.
options
The options directive allows configuring pipeline-specific options from within the pipeline. Pipeline itself provides a number of these options, such as buildDiscarder, but they may also be provided by plug-ins, such as timestamps.
Required | No |
---|---|
Parameters | None |
Allowed | Only once, inside the pipeline block. |
Available options
- buildDiscarder: persist artifacts and console output only for a specific number of recent pipeline runs. For example: `options { buildDiscarder(logRotator(numToKeepStr: '1')) }`
- disableConcurrentBuilds: disallow concurrent executions of the pipeline; useful for preventing simultaneous access to shared resources. For example: `options { disableConcurrentBuilds() }`
- overrideIndexTriggers: override the default processing of branch indexing triggers. If branch indexing triggers are disabled at the multibranch or organization level, `options { overrideIndexTriggers(true) }` enables them for this job only; otherwise, `options { overrideIndexTriggers(false) }` disables them for this job only.
- skipDefaultCheckout: skip the default checkout of code from source control in the agent directive. For example: `options { skipDefaultCheckout() }`
- skipStagesAfterUnstable: skip subsequent stages once the build status becomes UNSTABLE. For example: `options { skipStagesAfterUnstable() }`
- checkoutToSubdirectory: perform the automatic source control checkout in a subdirectory of the workspace. For example: `options { checkoutToSubdirectory('foo') }`
- timeout: set a timeout for the pipeline run, after which Jenkins aborts the pipeline. For example: `options { timeout(time: 1, unit: 'HOURS') }`
- retry: on failure, retry the entire pipeline the specified number of times. For example: `options { retry(3) }`
- timestamps: prepend all console output generated by the pipeline with the time the line was emitted. For example: `options { timestamps() }`
Example

```groovy
pipeline {
    agent any
    options {
        timeout(time: 1, unit: 'HOURS')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
This specifies a global execution timeout of one hour, after which Jenkins will abort the pipeline run. A complete list of available options is pending completion of INFRA-1503.
Stage options
The options directive for a stage is similar to the options directive at the pipeline root. However, stage-level options may only contain steps such as retry, timeout, or timestamps, or declarative options relevant to a stage, such as skipDefaultCheckout. Inside a stage, the options directive is invoked before entering the agent and before any when conditions are evaluated.
Available stage options
- skipDefaultCheckout: skip the default checkout of code from source control in the agent directive. For example: `options { skipDefaultCheckout() }`
- timeout: set a timeout for this stage, after which Jenkins aborts the stage. For example: `options { timeout(time: 1, unit: 'HOURS') }`
- retry: on failure, retry this stage the specified number of times. For example: `options { retry(3) }`
- timestamps: prepend all console output generated during this stage with the time the line was emitted. For example: `options { timestamps() }`
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            options {
                timeout(time: 1, unit: 'HOURS')
            }
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
This specifies an execution timeout for the Example stage, after which Jenkins will abort the pipeline run.
parameters
The parameters directive provides a list of parameters that a user should supply when triggering the pipeline. The values of these user-specified parameters are made available to pipeline steps via the params object; see the example for more information.
Required | No |
---|---|
Parameters | None |
Allowed | Only once, inside the pipeline block. |
Available parameters
- string: a parameter of type string, for example: `parameters { string(name: 'DEPLOY_ENV', defaultValue: 'staging', description: '') }`
- booleanParam: a boolean parameter, for example: `parameters { booleanParam(name: 'DEBUG_BUILD', defaultValue: true, description: '') }`
Example

```groovy
pipeline {
    agent any
    parameters {
        string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
    }
    stages {
        stage('Example') {
            steps {
                echo "Hello ${params.PERSON}"
            }
        }
    }
}
```
triggers
The triggers directive defines the automated ways in which the pipeline should be re-triggered. For pipelines integrated with a source such as GitHub or BitBucket, triggers may not be necessary because webhook-based integration likely already exists. The currently available triggers are cron, pollSCM, and upstream.
Required | No |
---|---|
Parameters | None |
Allowed | Only once, inside the pipeline block. |
cron
Accepts a cron-style string to define a regular interval at which the pipeline should be re-triggered, for example: `triggers { cron('H */4 * * 1-5') }`
pollSCM
Accepts a cron-style string to define a regular interval at which Jenkins should check for new source changes. If changes exist, the pipeline is re-triggered. For example: `triggers { pollSCM('H */4 * * 1-5') }`
upstream
Accepts a comma-separated string of jobs and a threshold. When any job in the string finishes with at least the threshold result, the pipeline is re-triggered. For example: `triggers { upstream(upstreamProjects: 'job1,job2', threshold: hudson.model.Result.SUCCESS) }`
Note: pollSCM is only available in Jenkins 2.22 and later.
Example

Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
stage
Stage directives go in the stages section and should contain a steps section, an optional agent section, or other stage-specific directives. Practically speaking, all of the real work done by a pipeline is wrapped in one or more stage directives.
Required | At least one |
---|---|
Parameters | One mandatory parameter, a string for the name of the stage. |
Allowed | Inside the stages section. |
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
```
tools
A section defining tools to auto-install and place on the PATH. This is ignored if agent none is specified.
Required | No |
---|---|
Parameters | None |
Allowed | Inside the pipeline block or a stage block. |
Supported tools
maven
jdk
gradle
Example

```groovy
pipeline {
    agent any
    tools {
        maven 'apache-maven-3.0.1'
    }
    stages {
        stage('Example') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
```
input
The input directive on a stage allows you to prompt for input using the [input] step. The stage will pause after any options have been applied, before entering the stage's agent or evaluating its when condition. If the input is approved, the stage will continue. Any parameters provided as part of the input submission will be available in the environment for the rest of the stage. (jenkins.io/doc/pipelin…)
Configuration items
- message: required; this text is presented to the user when prompting for input.
- id: an optional identifier for the input; defaults to the stage name.
- ok: optional text for the "OK" button on the input form.
- submitter: an optional comma-separated list of users or external group names allowed to submit the input. Any user is allowed by default.
- submitterParameter: an optional name of an environment variable to set with the submitter's name, if present.
- parameters: an optional list of parameters to prompt the submitter for. See [parameters] for more information.
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            input {
                message "Should we continue?"
                ok "Yes, we should."
                submitter "alice,bob"
                parameters {
                    string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
                }
            }
            steps {
                echo "Hello, ${PERSON}, nice to meet you."
            }
        }
    }
}
```
when
The when directive allows the pipeline to decide whether a stage should execute based on given conditions. The when directive must contain at least one condition. If it contains more than one condition, all of the child conditions must return true for the stage to execute. This is the same as nesting the child conditions under an allOf condition (see the examples below). More complex conditional structures can be built using the nesting conditions not, allOf, and anyOf.
Required | No |
---|---|
Parameters | None |
Allowed | Inside a stage directive |
Built-in conditions
- branch: execute the stage when the branch being built matches the given pattern, for example: `when { branch 'master' }`. Note this only works on multibranch pipelines.
- environment: execute the stage when the specified environment variable has the given value, for example: `when { environment name: 'DEPLOY_TO', value: 'production' }`
- expression: execute the stage when the specified Groovy expression evaluates to true, for example: `when { expression { return params.DEBUG_BUILD } }`
- not: execute the stage when the nested condition is false; must contain one condition, for example: `when { not { branch 'master' } }`
- allOf: execute the stage when all nested conditions are true; must contain at least one condition, for example: `when { allOf { branch 'master'; environment name: 'DEPLOY_TO', value: 'production' } }`
- anyOf: execute the stage when at least one nested condition is true; must contain at least one condition, for example: `when { anyOf { branch 'master'; branch 'staging' } }`

By default, if a stage defines an agent, the when condition for that stage is evaluated after entering that agent. This can be changed with the beforeAgent option in the when block: if beforeAgent is set to true, the when condition is evaluated first, and the agent is entered only if the condition holds.
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
                environment name: 'DEPLOY_TO', value: 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                allOf {
                    branch 'production'
                    environment name: 'DEPLOY_TO', value: 'production'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
                anyOf {
                    environment name: 'DEPLOY_TO', value: 'production'
                    environment name: 'DEPLOY_TO', value: 'staging'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                expression { BRANCH_NAME ==~ /(production|staging)/ }
                anyOf {
                    environment name: 'DEPLOY_TO', value: 'production'
                    environment name: 'DEPLOY_TO', value: 'staging'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent none
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            agent {
                label "some-label"
            }
            when {
                beforeAgent true
                branch 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
```
parallel
Stages in a declarative pipeline may declare multiple nested stages to be executed in parallel. Note that a stage must have exactly one of steps or parallel. The nested stages cannot themselves contain further parallel stages, but otherwise behave like any other stage. A stage containing parallel cannot itself declare agent or tools, since those have no meaning without steps. In addition, by adding failFast true to a stage containing parallel, you can force all parallel branches to abort when one of them fails.
Example

```groovy
pipeline {
    agent any
    stages {
        stage('Non-Parallel Stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('Parallel Stage') {
            when {
                branch 'master'
            }
            failFast true
            parallel {
                stage('Branch A') {
                    agent {
                        label "for-branch-a"
                    }
                    steps {
                        echo "On Branch A"
                    }
                }
                stage('Branch B') {
                    agent {
                        label "for-branch-b"
                    }
                    steps {
                        echo "On Branch B"
                    }
                }
            }
        }
    }
}
```
steps
Declarative pipelines may use all of the steps documented in the Pipeline Steps reference, which contains the complete list, plus the steps listed below, which are only supported in declarative pipelines.
script
The script step takes a block of scripted pipeline and executes it within a declarative pipeline. For most use cases the script step should be unnecessary in a declarative pipeline, but it can provide a useful "escape hatch". Script blocks of non-trivial size and/or complexity should be moved into shared libraries instead.
Example

Jenkinsfile (Declarative Pipeline)

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
                script {
                    def browsers = ['chrome', 'firefox']
                    for (int i = 0; i < browsers.size(); ++i) {
                        echo "Testing the ${browsers[i]} browser"
                    }
                }
            }
        }
    }
}
```
Scripted pipeline
Scripted pipeline, like declarative pipeline, is built on top of the underlying pipeline subsystem. Unlike declarative, scripted pipeline is effectively a general-purpose DSL built with Groovy. Most functionality provided by the Groovy language is available to users of scripted pipeline, making it a very expressive and flexible tool for writing continuous delivery pipelines.
Flow control
A scripted pipeline is executed serially from the top of the Jenkinsfile down, like most traditional scripts in Groovy or other languages. Flow control therefore relies on Groovy expressions, such as if/else conditionals, for example:
Jenkinsfile (Scripted Pipeline)

```groovy
node {
    stage('Example') {
        if (env.BRANCH_NAME == 'master') {
            echo 'I only execute on the master branch'
        } else {
            echo 'I execute elsewhere'
        }
    }
}
```
Another approach is to use Groovy's exception-handling support to manage scripted pipeline flow control. When a step fails, for whatever reason, it throws an exception. Errors must be handled with Groovy's try/catch/finally blocks, for example:
Jenkinsfile (Scripted Pipeline)