There are few complete write-ups of multi-branch pipeline configuration on the web, so I hope this article helps you.
This article covers:
- Build front and back end automation workflows from zero to one
- Enterprise-level practice notes
- Best practices (can be progressively improved upon)
Origin
The company's Jenkins sent no notification when a deployment succeeded, so after several days of learning Jenkins I finally took on the company's Jenkins configuration. Then, after I installed the DingTalk plugin and Jenkins restarted automatically, I found that the build jobs configured by the previous supervisor had been lost, which gave me a chance to practice. Hence the following poignant journey from zero to one.
Install and run Jenkins in Docker
This assumes that docker is already installed on your server
The image used is jenkinsci/blueocean, a stable and actively maintained Jenkins image that comes with Blue Ocean and other plugins already integrated, which is very convenient.
Pull the image
docker pull jenkinsci/blueocean
Run Jenkins
docker run -idt --name kmywjenkins -p 9090:8080 -p 60000:50000 -v jenkins-data:/var/jenkins_home -v /data/web-data/docker.sock:/var/run/docker.sock jenkinsci/blueocean
Parameter Description:
-idt runs the container in the background with an interactive pseudo-terminal
--name sets the name of the container
-p maps a container port to a host port (format: host port:container port)
-v jenkins-data:/var/jenkins_home keeps the container's data in a volume on the host, so the configuration and jobs inside are not lost even if the container dies
-v /data/web-data/docker.sock:/var/run/docker.sock connects the container to the host's Docker service
Note that inside Docker, Jenkins runs as the jenkins user by default. If you want Jenkins to run as root, add the parameter -u root; root is not used in this example.
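If you did want root, the run command above would simply gain that flag, for example:

docker run -idt -u root --name kmywjenkins -p 9090:8080 -p 60000:50000 -v jenkins-data:/var/jenkins_home -v /data/web-data/docker.sock:/var/run/docker.sock jenkinsci/blueocean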
Access the Jenkins Docker container
Sometimes you need to enter the Jenkins container to execute commands; you can do this with docker exec, for example: docker exec -it [containerid] bash
To manually restart Jenkins, run the following command: docker restart [containerid]
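With the container name used above, those two commands look like:

# open a shell inside the Jenkins container
docker exec -it kmywjenkins bash
# restart Jenkins
docker restart kmywjenkins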
Jenkins basic configuration
If the steps above went smoothly, you can now visit Jenkins at http://121.41.16.183:9090/, where the IP is the address of your server.
Unlock Jenkins
Run the following command to get the unlock token: docker exec kmywjenkins cat /var/jenkins_home/secrets/initialAdminPassword
Enter the corresponding token in the browser to unlock:
Create the credentials
You need credentials to connect to the Git repository and to the SSH servers. Create them once in credential management, and you can then simply select them wherever they are needed. The following uses the credentials for connecting to Git and SSH as examples:
- The version management tool I use is Gitee; the steps are similar for other version management tools
Select Username with password
Username and Password are the username and password used to log in to Gitee
The ID is the unique identifier of the credential and can be customized; the Jenkinsfile later references the credential by this ID
Result after configuration
- An SSH connection to the server requires a key. First generate a public/private key pair on the server, then copy the private key and paste it into the credential (a shell sketch follows after this list)
Select SSH Username with private key
Username is the Username used to connect to the server, such as Jenkins
Select Enter Directly from the Private Key field, click Add, and paste the Private Key you just copied
Result after configuration
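For reference, generating the key pair on the target server might look like the following; the key path and comment are only illustrative, the point is that the public key lands in the deploy user's authorized_keys and the private key goes into the Jenkins credential:

# run on the target server as the user Jenkins will log in as
ssh-keygen -t rsa -b 4096 -C "jenkins-deploy"      # accept the default path, e.g. ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys    # authorize logins with this key
chmod 600 ~/.ssh/authorized_keys
cat ~/.ssh/id_rsa                                  # copy this private key into the credential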
Create a multi-branch pipeline
Previous Jenkins jobs were created as FreeStyle projects, which are not flexible enough and whose interface is not clean. Here a declarative pipeline is used to create the job instead; each branch can be built independently, which makes future expansion easier.
We use Blue Ocean to do the CI/CD work here. Blue Ocean is a UI that the Jenkins team redesigned for Jenkins Pipeline from the perspective of user experience. It is still compatible with the old FreeStyle jobs, and it has the following features:
- Complex visualizations of continuous delivery (CD) pipelines allow quick and intuitive understanding of Pipeline status
- Pipelines can be created intuitively through the Pipeline editor
- Blue Ocean highlights the parts of a pipeline that need attention, which makes exception handling easier and helps you locate problems quickly when intervention is needed
- Native integration for branches and pull requests maximizes developer productivity when collaborating with other people on GitHub or Bitbucket
If you installed the jenkinsci/blueocean image, Blue Ocean is integrated by default; otherwise you can install the corresponding plugin from plugin management.
Click Open Blue Ocean, and you can see that two pipelines have already been created, one for the front end and one for the back end, which need different tools. How to create a pipeline is covered next.
Click Create Pipeline
Our company uses Gitee, so select Git and fill in the address of the repository to connect to. We already created the credential for connecting to the Git repository, so it can be selected directly; if you have not, you can create it in the form below. Finally, click Create Pipeline.
Jenkins will scan the repository and detect any branch that contains a Jenkinsfile. The Jenkinsfile is the configuration file of the multi-branch pipeline and uses Groovy syntax. You can also click Create Pipeline directly, and Jenkins will automatically create a Jenkinsfile for your project.
You can then visually edit the stages and steps you want to perform: add a package stage with a step that prints a start-packing prompt, and click Save.
The JenkinsFile will be uploaded to Git, and a build task will be performed based on the JenkinsFile. Currently, there is only one build step, which is a prompt to start packing
For some reason I got stuck at this point, so I create and edit the Jenkinsfile directly in VS Code instead. This approach is more flexible, and it is the one I prefer. Next I will briefly introduce the basic Jenkinsfile syntax, covering only what is used in this project, which is basically enough for a small or medium-sized company.
JenkinsFile basic syntax
Just understand the general syntax, and the usage will be explained later
// Front-end project JenkinsFile configuration. Back-end project configuration is slightly different, which will be explained later
pipeline {
agent any
environment {
HOST_TEST = '[email protected]'
HOST_ONLINE = '[email protected]'
SOURCE_DIR = 'dist/*'
TARGET_DIR = '/data/www/kuaimen-yunying-front'
}
parameters {
choice(
description: 'Which environment do you need to choose for deployment?', name: 'env', choices: ['Test environment', 'Online Environment']
)
string(name: 'update', defaultValue: '', description: 'What is this update?')
}
triggers {
GenericTrigger(
genericVariables: [[key: 'ref', value: '$.ref']],
causeString: 'Triggered on $ref',
token: 'runcenter-front-q1w2e3r4t5',
tokenCredentialId: '',
printContributedVariables: true,
printPostContent: true,
silentResponse: false,
regexpFilterText: '$ref',
regexpFilterExpression: 'refs/heads/' + BRANCH_NAME
)
}
stages {
stage('Get git commit message') {
steps {
script {
env.GIT_COMMIT_MSG = sh(script: 'git log -1 --pretty=%B ${GIT_COMMIT}', returnStdout: true).trim()
}
}
}
stage('pack') {
steps {
nodejs('nodejs-12.16') {
echo 'Start installing dependencies'
sh 'yarn'
echo 'Start packing'
sh 'yarn run build'
}
}
}
stage('deployment') {
when {
expression {
params.env == 'Test environment'
}
}
steps {
sshagent(credentials: ['km-test2']) {
sh "ssh -o StrictHostKeyChecking=no ${HOST_TEST} uname -a"
sh "scp -r ${SOURCE_DIR} ${HOST_TEST}:${TARGET_DIR}"
sh 'echo "Deployment successful ~"'
}
}
}
stage('release') {
when {
expression {
params.env == 'Online Environment'
}
}
steps {
sshagent(credentials: ['km-online']) {
sh "ssh -o StrictHostKeyChecking=no ${HOST_ONLINE} uname -a"
sh "scp -r ${SOURCE_DIR} ${HOST_ONLINE}:${TARGET_DIR}"
sh 'echo "Release successful ~"'
}
}
}
}
post {
success {
dingtalk (
robot: '77d4c82d-3794-4583-bc7f-556902fee6b0', type: 'MARKDOWN', atAll: true, title: 'You have new information, please check.', text: [
'# Operations management system release notice',
'---',
'#### **Owner: front end**',
"#### **Build task: ${env.BUILD_DISPLAY_NAME}**",
"#### **Git commit: ${env.GIT_COMMIT_MSG}**",
"#### **Update notes: ${params.update}**",
"#### **Deployment environment: ${params.env}**",
'#### **Build result: success**'
]
)
}
}
}
pipeline must be the outermost block
agent defines where the pipeline runs; any means any available agent
stages is the block that wraps the build process; its children are the individual stage blocks
steps contains the commands executed inside a stage
post runs some logic after all stages have completed
when controls whether a stage is executed
environment defines environment variables; variables defined here can be accessed anywhere in the Jenkinsfile
tools declares build tools already defined in the global tool configuration, such as Maven
parameters defines parameters that the user can enter or select
triggers defines how the pipeline is triggered automatically; GenericTrigger comes from the Generic Webhook Trigger plugin and is covered in the webhook section later
post supports conditions such as success, failure, and always; in this example a DingTalk notification is sent on success (when the build succeeds)
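Putting the directives together, a stripped-down skeleton (the names and values here are only illustrative) looks roughly like this:

pipeline {
    agent any                                         // run on any available agent
    environment { TARGET = '/data/www/example' }      // read elsewhere as ${TARGET} or env.TARGET
    parameters {                                      // read elsewhere as ${params.env}
        choice(name: 'env', choices: ['Test environment', 'Online Environment'], description: 'Target environment')
    }
    stages {
        stage('pack') {
            steps { echo 'packing...' }
        }
        stage('deploy') {
            when { expression { params.env == 'Test environment' } }   // stage is skipped unless this is true
            steps { echo 'deploying...' }
        }
    }
    post {
        success { echo 'send the DingTalk notification here' }
    }
}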
CI/CD process
Since our technical team is small, our CI/CD process is not that complicated: it does not include code inspection, automated testing, Code Review or similar steps. I will briefly explain the front-end and back-end CI/CD process I built and why I built it this way.
The front end
Two build methods are provided: an automatic build triggered when code is pushed, and a parameterized build, which can deploy to either the test environment or the online environment.
Automatic builds deploy to the test environment by default. Because the online environment is important and an automated build there would be risky, deploying online requires manual intervention: you select the parameters and start the build yourself.
- If it is a parameterized build, manually select the environment to build and start the build
- Install dependencies
- Package
- Upload to the server
- Send a DingTalk notification if the build succeeds
The back-end
All back-end projects live in a single Git repository, so no automatic build is configured
- Parameterized builds allow you to choose which environment to build, which projects to package, and whether full packaging is required
- Clear old data
- Package
- Upload to the server
- Kill the corresponding process
- Start the corresponding process
- Send a DingTalk notification if the build succeeds
Each step is explained in detail, along with possible pitfalls
Automatic trigger build
What is an auto-triggered build
When we commit the new code to the Git repository, Jenkins will automatically start the task of building the project that has already been configured
How it works
A webhook pointing at the Jenkins server is configured in the Git repository; whenever the repository changes, Git requests that address, Jenkins is notified, and the build task starts.
Configuration
- First install the Multibranch Scan Webhook Trigger plugin; you can search for it in plugin management and install it.
- Go to the project's configuration page, select Scan by webhook, and enter a custom token. Make sure the token is unique and does not conflict with other projects.
- Filter branches. This is a multi-branch pipeline, so by default Jenkins checks out every branch that contains a Jenkinsfile, and with the webhook configured a push to any of those branches triggers its build. Sometimes we only want a build when master changes, and that is what the branch filter is for. Go to the project configuration, find the Add button under the Git branch source tab, and click it. Select Filter by name (with wildcards), or Filter by name (with regular expression); the effect is the same, only the filter format differs. I entered master in the corresponding field, so only the master branch is discovered, which gives exactly the behavior we want.
- Go to the remote repository (Gitee in my case), click WebHooks, then click Add WebHook.
- Fill in the URL: the IP is the server where Jenkins is deployed, the token is the one we just set, and the /multibranch-webhook-trigger/invoke part of the path is fixed. Then click Add.
- When code is pushed, a request is automatically sent to the URL we just filled in. If the request looks like the following, the configuration succeeded and the corresponding build task will be executed automatically.
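If you want to check the endpoint by hand before relying on Gitee, a manual request against the same URL should also trigger a scan. A rough sketch, using the example server from earlier and a placeholder token (use whatever you entered under Scan by webhook):

# fire the webhook manually; a matching branch scan/build should start in Jenkins
curl -X POST "http://121.41.16.183:9090/multibranch-webhook-trigger/invoke?token=<your-token>"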
Automatic packing
The front end
Yarn is used for dependency installation and packaging, so configure the NodeJS environment first
- Go to plugin management, search for nodejs and install it
- Go to the global tool configuration and add a NodeJS installation. The alias can be customized; the recommended format is nodejs plus the version number. Since the project uses yarn, add yarn under the global npm packages to install so that it is installed automatically during the build; if you use npm you can skip this part
- The Jenkinsfile configuration for the front end is relatively simple:
pipeline {
stage('pack') {
steps {
// The execution environment, nodejs-12.16 is the alias we just configured. Another way is to configure the execution environment in agent, and configure the package used in Tools
nodejs('nodejs-12.16') {
echo 'Start installing dependencies'
sh 'yarn'
echo 'Start packing'
sh 'yarn run build'
}
}
}
}
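As the comment above mentions, an alternative is to declare the NodeJS installation in a tools block instead of wrapping the steps in nodejs(...); a sketch of that variant (not the configuration actually used in this article) might look like:

pipeline {
    agent any
    tools {
        nodejs 'nodejs-12.16'   // alias of the NodeJS installation from the global tool configuration
    }
    stages {
        stage('pack') {
            steps {
                sh 'yarn'            // install dependencies
                sh 'yarn run build'  // package
            }
        }
    }
}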
Back end (Java)
pipeline {
tools {
maven 'Maven3.6.3'
}
parameters {
// Provide the server option to deploy
choice(
description: 'Which environment do you need to choose for deployment?', name: 'env', choices: ['Test environment', 'Online Environment']
)
// Provide options for which module to build
choice(
description: 'Which module do you need to select to build?', name: 'moduleName', choices: ['kuaimen-contract', 'kuaimen-core', 'kuaimen-eureka-server', 'kuaimen-manage', 'kuaimen-member', 'kuaimen-order', 'kuaimen-shop', 'tiemuzhen-manage']
)
booleanParam(name: 'isAll', defaultValue: false, description: 'Do I need the full amount (including clean && build)?')
string(name: 'update', defaultValue: '', description: 'What is this update?')
}
stages {
stage('Full purge of old data... ') {
when {
expression {
params.isAll == true
}
}
steps {
echo "Begin full purge."
sh "mvn package clean -Dmaven.test.skip=true"
}
}
stage('Full package application') {
when {
expression {
params.isAll == true
}
}
steps {
echo "Start packing in full."
sh "mvn package -Dmaven.test.skip=true"
echo 'Packed successfully'
}
}
stage('Clean up old data... ') {
when {
expression {
params.isAll == false
}
}
steps {
echo "Start clearing the ${params.moduleName} module"
sh "cd ${params.moduleName} && mvn package clean -Dmaven.test.skip=true"
}
}
stage('Package your app') {
when {
expression {
params.isAll == false
}
}
steps {
echo "Start packing ${params.moduleName} module"
sh "cd ${params.moduleName} && mvn package -Dmaven.test.skip=true"
echo 'Packed successfully'
}
}
}
}
parameters
A parameter defined in parameters is referenced as ${params.xxx}; for example, the boolean above is read as ${params.isAll}
when > expression: if the expression does not evaluate to true, the stage is skipped; otherwise it runs
mvn needs a Maven environment, which is not provided by default. Go to the global tool configuration in the system settings and add a Maven installation (similar to the NodeJS setup)
It is then referenced like this:
tools {
maven 'Maven3.6.3'
}
Automated deployment
The front end
pipeline {
agent any
environment {
HOST_TEST = '[email protected]'
HOST_ONLINE = '[email protected]'
SOURCE_DIR = 'dist/*'
TARGET_DIR = '/data/www/kuaimen-yunying-front'
}
stage('deployment') {
when {
expression {
params.env == 'Test environment'
}
}
steps {
sshagent(credentials: ['km-test2']) {
sh "ssh -o StrictHostKeyChecking=no ${HOST_TEST} uname -a"
// Upload the packaged file to the server
sh "scp -r ${SOURCE_DIR} ${HOST_TEST}:${TARGET_DIR}"
sh 'echo "Deployment successful ~"'
}
}
}
stage('release') {
when {
expression {
params.env == 'Online Environment'
}
}
steps {
sshagent(credentials: ['km-online']) {
sh "ssh -o StrictHostKeyChecking=no ${HOST_ONLINE} uname -a"
sh "scp -r ${SOURCE_DIR} ${HOST_ONLINE}:${TARGET_DIR}"
sh 'echo "Release successful ~"'
}
}
}
}
Environment specifies global variables that can be referenced directly elsewhere
sshagent is used to connect to the server and requires the SSH Agent plugin to be installed. Its credentials argument is the ID of the SSH credential we created at the beginning.
The back-end
pipeline {
agent any
environment {
HOST_TEST = '[email protected]'
TARGET_DIR = '/data/www/kuaimen-auto'
HOST_ONLINE = '[email protected]'
}
tools {
maven 'Maven3.6.3'
}
stage('Deploy the application') {
when {
expression {
params.env == 'Test environment'
}
}
steps {
echo "Start deploying the ${params.modulename} module"
sshagent(credentials: ['km-test2']) {
sh "ssh -v -o StrictHostKeyChecking=no ${HOST_TEST} uname -a"
// Upload the packaged file to the server
sh "cd ${params.moduleName}/target && scp *.jar ${HOST_TEST}:${TARGET_DIR}/${params.moduleName}"
// Match the Java process and kill it
sh "ssh -o StrictHostKeyChecking=no ${HOST_TEST} \"uname; ps -ef | egrep ${params.moduleName}.*.jar | egrep -v grep | awk '{print \\\$2}' | xargs -r sudo kill -9\""
// Start the process
sh "SSH -o StrictHostKeyChecking = no ${HOST_TEST} \" nohup/data/apps/jdk1.8 / bin/Java - jar ${TARGET_DIR} / ${params. ModuleName} / ${params. ModuleName} - 0.0.1 - the SNAPSHOT. Jar -- spring. Profiles. The active = test > / dev/null 2 > &1 & \ ""
sh 'echo "Deployment successful ~"'
}
echo 'Deployment successful'
}
}
stage('Publish the app') {
when {
expression {
params.env == 'Online Environment'
}
}
steps {
echo "Start publishing the ${params.moduleName} module"
sshagent(credentials: ['km-online']) {
sh "ssh -v -o StrictHostKeyChecking=no ${HOST_ONLINE} uname -a"
sh "cd ${params.moduleName}/target && scp *.jar ${HOST_ONLINE}:${TARGET_DIR}/${params.moduleName}"
sh "ssh -o StrictHostKeyChecking=no ${HOST_ONLINE} \"uname; ps -ef | egrep ${params.moduleName}.*.jar | egrep -v grep | awk '{print \\\$2}' | xargs -r sudo kill -9\""
sh "SSH -o StrictHostKeyChecking = no ${HOST_ONLINE} \" nohup/data/apps/jdk1.8 / bin/Java - jar ${TARGET_DIR} / ${params. ModuleName} / ${params. ModuleName} - 0.0.1 - the SNAPSHOT. Jar -- spring. Profiles. The active = dev > / dev/null 2 > &1 & \ ""
sh 'echo "Release successful ~"'
}
echo 'Released successfully'
}
}
}
Note the escaping of the awk command: inside the double-quoted sh string, awk '{print $2}' has to be written as awk '{print \\\$2}' so that the $2 reaches awk on the remote host instead of being interpreted by Groovy or the local shell.
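For clarity, after Groovy and the local shell have both done their unescaping, what actually runs on the remote host is roughly the following (module name substituted by hand here as an example):

# kill the running instance of the module, if any
ps -ef | egrep kuaimen-order.*.jar | egrep -v grep | awk '{print $2}' | xargs -r sudo kill -9
# start it again in the background
nohup /data/apps/jdk1.8/bin/java -jar /data/www/kuaimen-auto/kuaimen-order/kuaimen-order-0.0.1-SNAPSHOT.jar --spring.profiles.active=test > /dev/null 2>&1 &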
Notification is initiated after deployment is complete
The main idea is to create a webhook robot in the DingTalk group, fill the webhook address into the DingTalk plugin's configuration, and finally add the following to the Jenkinsfile:
pipeline {
stage('Get git commit message') {
steps {
script {
// Assign the git commit to GIT_COMMIT_MSG
env.GIT_COMMIT_MSG = sh(script: 'git log -1 --pretty=%B ${GIT_COMMIT}', returnStdout: true).trim()
}
}
}
post {
success {
dingtalk (
robot: '77d4c82d-3794-4583-bc7f-556902fee6b0', type: 'MARKDOWN', atAll: true, title: 'You have new information, please check.', text: [
'# Operations management system release notice',
'---',
'#### **Owner: back end**',
"#### **Build task: ${env.BUILD_DISPLAY_NAME}**",
"#### **Update notes: ${params.update}**",
"#### **Deployment environment: ${params.env}**",
'#### **Build result: success**'
]
)
}
}
}
The Get git commit message stage uses git log -1 to read the message of the current commit (GIT_COMMIT is the commit ID that Jenkins checked out) and stores it in the GIT_COMMIT_MSG environment variable for use in the notification
robot is the ID of the DingTalk robot; it is added as a configuration item of the DingTalk plugin in the system configuration
The webhook address becomes available once the robot is created
How to create a stapling robot
Go to Group Settings -> Smart Group Assistant
Select Custom robot; once it is configured you can see the webhook address
Begin to build
With the configuration above, the automatic build setup for both the front end and the back end is complete. Next, how to trigger a build for each.
The front end
- Automatic build: pushing code to master automatically runs the build task and deploys to the test environment; after a successful deployment a notification is sent to the DingTalk group
- Parameterized build: click Build with Parameters and select the corresponding parameters to build. The online environment must be built this way, which provides a degree of safety
The back-end
The back end only has parameterized builds, for the reason stated earlier; you choose the environment and the module to build
Build with Blue Ocean (recommended)
Click to open Blue Ocean
Select the branch to build
This is similar to Build with Parameters, but the interface is much nicer and cleaner. After selecting the parameters, click Run to start the build
The build result is shown very intuitively by color: green means the build succeeded, red means it failed
The rollback
You can see the build history in the Blue Ocean activity list. Click the button below a historical run to rebuild it, i.e. to roll back
Closing thoughts
It took a lot of time, but improving the company's engineering process gives me a small sense of achievement. Now I can write code happily and leave the automation to Jenkins.
I am writing this down partly so I can refer back to it at any time, and partly in the hope that it saves friends from stepping into some of the same pits. That's all ~
Appendix
Jenkins Official Documentation
BlueOcean Practical Multi-branch Pipeline Construction (Jenkins)
Complete Jenkins Pipeline Tutorial for Beginners [FREE]
Jenkins builds a strong front-end automation workflow
Jenkins: Add SSH global credentials
Jenkins automatic notification after release
Use Generic Webhook Trigger to Trigger Jenkins multi-branch pipelining automation build
Jenkins Pipeline single quotes, double quotes, and escape characters
Jenkins Blue Ocean
How to Execute Linux Commands on Remote System over SSH