Preface
This is a multi-branch automatic deployment service based on Jenkins.
It is recommended to read the preparation article first:
One-stop CI/CD: building an engineering service environment for a small front-end team
1 Jenkins installation
Now we will use Docker to install Jenkins
- Let’s download the jenkinsci/blueocean image first
#Running this command lists Jenkins image resources in the registry
docker search jenkins
#When you run it, you'll see something like this
NAME                   DESCRIPTION                                      STARS  OFFICIAL  AUTOMATED
jenkins                Official Jenkins Docker image                    4863   [OK]
jenkins/jenkins        The leading open source automation server        2154
jenkinsci/blueocean    https://jenkins.io/projects/blueocean            544
jenkinsci/jenkins      Jenkins Continuous Integration and Delivery...   382
jenkins/jnlp-slave     a Jenkins agent which can connect to Jenkins...  129    [OK]
jenkinsci/jnlp-slave   A Jenkins slave using JNLP to establish conn...  126    [OK]
jenkinsci/slave        Base Jenkins slave docker image                  65     [OK]
jenkins/slave          base image for a Jenkins Agent, which includ...  43     [OK]
jenkinsci/ssh-slave    A Jenkins SSH slave docker image                 42     [OK]
···
#We choose to download jenkinsci/blueocean
docker pull jenkinsci/blueocean
#After downloading, we can run the docker images command to view local images
docker images
- Generate containers for Jenkins applications
docker run \
--name jenkins \
-d \
-it \
-p 8080:8080 \
-p 50000:50000 \
-v jenkins:/var/jenkins_home \
-v /var/run/docker.sock:/var/run/docker.sock \
-v /etc/localtime:/etc/localtime:ro \
jenkinsci/blueocean
Let’s take a look at what each part of the command means
- **docker run:** creates a new container
- **--name jenkins:** names this container jenkins
- **-d:** runs the container in the background
- **-i** runs the container in interactive mode; **-t** allocates a pseudo-terminal to the container
- **-p:** port mapping, in the format host port:container port
- **-v jenkins:/var/jenkins_home:** persists the container's /var/jenkins_home to the volume named jenkins, centrally managed by the host's Docker
- **-v /var/run/docker.sock:/var/run/docker.sock:** Jenkins will use Docker containers as agents, so the host's Docker socket must be mounted in
- **-v /etc/localtime:/etc/localtime:ro:** synchronizes the container clock with the host (ps: this one doesn't work on macOS)
- **jenkinsci/blueocean:** the image to run. Docker creates the container from this image, automatically pulling the latest version online if it doesn't exist locally
We have successfully created a container to run a Jenkins application
After running successfully, if you are on Alibaba Cloud, you need to add a security group rule for the instance
Because my Jenkins proxies port 8080 to the host, open port 8080 on the Alibaba Cloud host
Then you can see Jenkins' launch page at your host's IP:8080
Check the Jenkins container log for the initial admin password, which can be seen near the end of the run
docker logs jenkins
*************************************************************
*************************************************************
*************************************************************

Jenkins initial setup is required. An admin user has been created
and a password generated.
Please use the following password to proceed to installation:

cb619dcc57574a6592d96asdasdasjasd

This may also be found at: /var/jenkins_home/secrets/initialAdminPassword

*************************************************************
*************************************************************
*************************************************************
Wait patiently for the installation to complete
Create the first administrative user
Boss, I’ve got Jenkins in sight!
Take some time to read Jenkins’ introductory tutorial, which focuses on tasks, declarative and scripted pipelining, and pipelining steps
2 Jenkins connects to Gitee
- Install plugins
We need Jenkins to connect to a Git repository host such as Gitee, GitLab, or GitHub, so we need to install a few plugins
- Gitee
- Multibranch Scan Webhook Trigger
- Generic Webhook Trigger
Once installed, configure the Gitee plugin globally. See the official tutorial
- Configure an SSH connection to Gitee
Enter the Jenkins container and generate an SSH key pair (see figure)
Then edit the SSH config inside the container:
cd ~/.ssh/
vi config
#Copy the following
-----------------------------------------------
Host gitee.com
PreferredAuthentications publickey
IdentityFile ~/.ssh/jenkins-gitee
-----------------------------------------------
This way the SSH connection can be established successfully
Copy the generated public key into Gitee; I added it under my personal settings
Create the Jenkins credential for connecting to Gitee; it can also be added later when creating the new pipeline below
3 Multi-branch pipeline
It is better to set the number of global executors to more than 3 before creating pipeline tasks, otherwise our repository's multi-branch builds may fail due to insufficient resources
Now let’s start the new task
Set branch source and Webhook token according to the figure
Once the branch source is saved, the repository code is pulled automatically the first time to perform the build task
Click the status and you can see the branch tasks that were picked up. The feature and hotfix branches of our project are not treated as build tasks
This is the pipeline task step for the dev branch
Since we installed jenkinsci/blueocean, it is easy for us to visualize the task
Click Blue Ocean in the navigation on the left
Click on the dev, Release, and Master branches respectively and see
Note that builds on dev and release continue to be delivered automatically once started, while master gets a select button, since master is the production environment and we usually need to deploy it manually
The pipelines are generated and run based on jenkinsfiles for each branch
4 Webhook setup
Now let’s set up the WebHook trigger.
Webhook URL: http://<your-jenkins-url>/multibranch-webhook-trigger/invoke?token=vueci
Now let’s try to trigger the webhook. For convenience, we won't create a new feature branch and will push directly on the dev branch
Let’s make some changes to the Jenkinsfile
- Add new code, add, commit, push
echo "test webhook"
- Looking at our Jenkins Blue-Ocean interface, the dev branch task is triggered
After the success, click the pre-build step to see the echo information just added
Of course, you can also create a new branch feature-* and push it to the remote repository to request merging into the Release branch. After merging successfully, the release task will also be triggered
5 Introduction to pipeline stages
The above process is generally complete. Now look at the stages on the pipeline: pre-build, the parallel build stage [build-dev | build-release | build-master], artifacts-manage, deliver and deploy
These are the pipeline stages we customized in the Jenkinsfile. Oh, there seems to be no test step, because I don't do automated testing here
Before setting up the pipeline stage, what do we need to do
Pre-build: Work done before packing
- Set the cache directory and files
- If a recent build already produced a package, call the cache directly and just redeploy without rebuilding anything
- Determine whether you need to reinstall the node_modules package dependencies
build-env: packaging
- Run the corresponding build script for the branch
artifacts-manage: artifact management
- The built /dist resources are packed and compressed, stored in a cache directory or repository, and archived
Deliver: delivery
- This step is used in the development environment, test environment, and pre-release environment, where the packaged code is automatically delivered
Deploy: deploy
- This is for the production environment, and when it’s packaged, it sets up a button that you can manually click to run the deployment live script
6 Jenkinsfile
Jenkins generates the corresponding pipeline task by reading the Jenkinsfile in the root directory of our project. This content is quite large; let's use the table of contents to make it easy to read
Now we need to look at the pipeline configuration file, the Jenkinsfile. To be verbose: go through the official tutorial a few times and code along; go through the official tutorial a few times and code along; go through the official tutorial a few times and code along. Important things are said three times, and I hope you can become a boss at this
1 Jenkinsfile preview
Let’s start with Jenkinsfile
pipeline {
agent any // Agent node, any tells Jenkins that any available agent can execute
environment {
Name = 'Eric' // This is a custom top-level pipeline global variable
}
options {
disableConcurrentBuilds() // Limit concurrency in a multi-branch pipeline
}
stages {
stage('pre-build') { // Custom step pre-build
when {
anyOf { // Multiple branch conditions; passes if any one is true
branch 'dev'
branch 'release'
branch 'master'
}
}
agent { // This stage runs in a Docker Node.js agent container
docker {
image 'node:10.21.0'
// If you want to run code in a docker container, but also want to use the same nodes or workspace defined by the pipeline, you must set this
reuseNode true
}
}
steps {
sh "printenv"
echo "pre-build"
echo "test webhook"
}
}
stage('build-env') {
when {
anyOf {
branch 'dev'
branch 'release'
branch 'master'
}
}
// If failFast is set to true, when any of the parallel stages fails they all fail.
// We don't need that here, since we only use parallel to choose which branch's build runs when a branch task is triggered
failFast false
parallel {
stage('build-dev') {
when {
// beforeAgent evaluates the when condition before entering the agent; if the condition fails, the agent is never entered
// This can speed up the pipeline
beforeAgent true
branch 'dev'
}
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
steps {
echo "build-dev"
}
}
stage('build-release') {
when {
beforeAgent true
branch 'release'
}
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
steps{
echo "build-release"
}
}
stage('build-master') {
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
when {
beforeAgent true
branch 'master'
}
steps {
echo "build-master"
}
}
}
}
stage("artifacts-manage"){
steps {
echo "artifacts"
}
}
stage('deliver') {
when {
beforeAgent true
anyOf {
branch 'dev'
branch 'release'
}
}
steps {
echo "start deliver"
}
}
stage('deploy') {
when {
beforeAgent true
branch 'master'
}
steps {
// This generates a button that we use to publish manually
input message: "Should you deploy?"
echo "start deploy"
}
}
}
// Post reflects the pipeline's result state; we will set up email notification here later
post {
changed{
echo 'I changed! '
}
failure{
echo 'I failed! '
}
success{
echo 'I success'
}
always{
echo 'I always'
}
unstable{
echo "unstable"
}
aborted{
echo "aborted"
}
}
}
Our script is a declarative pipeline, which is clearer and better suited to the Blue Ocean visualization than a scripted pipeline
To read the above code, you must know the syntax of pipeline pipeline:
- **pipeline:** represents the entire pipeline and contains all of its logic
- **environment:** specifies a sequence of key-value pairs defined as environment variables, either for all steps or only for a stage's steps, depending on where the environment directive sits in the pipeline
- **stages:** a section containing at least one stage
- **stage:** a pipeline stage; it must be named, stage("name")
- **steps:** one or more steps within a stage. A stage has one and only one steps section
- **agent:** specifies where the pipeline executes. Every stage must run somewhere (physical machine, virtual machine, Docker container), and the agent section specifies that environment. The agent under pipeline specifies the execution environment for all stages
- **when:** lets the pipeline decide whether a stage should run based on a given condition
  - **beforeAgent** evaluates the when condition before entering the agent; if the condition fails, the agent is never entered, improving build speed
  - **branch** applies only to multi-branch pipelines
  - **expression** runs the stage when the given Groovy expression evaluates to true, for example:
    when { expression { return true } }
  - **anyOf** passes when at least one nested condition is true; it must contain at least one condition, for example:
    when { anyOf { branch 'master'; branch 'staging' } }
- **post:** runs on the pipeline's result state; we will set up email notification there later
- **parallel:** a stage in a declarative pipeline can declare nested stages that execute in parallel. Note that a stage must have exactly one of steps or parallel
Please refer to the official documentation for more complete information
2 pre-build stage
Next we’ll start scripting the pre-build step la la la la
Pre-build features to be implemented
- Determine and create & update cache directories & files
- Determine whether this commit was built recently and, if so, set a rollback flag file so subsequent build stages are skipped and the cache is used directly for redeployment
- By comparing package.json with the cached copy, determine whether the old node_modules needs to be deleted and npm install rerun to reinstall the package dependencies
- To start, we set several pipeline global variables
pipeline{
environment {
cacheDir = "stage" // Name of the cache directory
cachePackage = "${cacheDir}/package.json" // Cached package.json
cacheCommitIDFile = "${cacheDir}/.commitIDCache.txt" // Commit IDs of successful builds are cached here
artifactsDir = "${cacheDir}/artifacts" // Artifact cache directory; successful artifacts go here
resetFlagFile = "${cacheDir}/.resetFile" // Rollback flag file
cacheCommitIDMax = 5 // Maximum number of cached versions
}
...
}
- steps
stage('pre-build') { // Custom step pre-build
when {
anyOf { // Multiple branch conditions; passes if any one is true
branch 'dev'
branch 'release'
branch 'master'
}
}
agent { // This stage runs in a Docker Node.js agent container
docker {
image 'node:10.21.0'
// If you want to run code in a docker container, but also want to use the same nodes or workspace defined by the pipeline, you must set this
reuseNode true
}
}
steps {
sh "printenv" // Print the Jenkins global environment variables
sh './jenkins/script/pre-build.sh' // The main functionality lives in the pre-build.sh script
}
}
Then create the Jenkins /script/pre-build.sh script
After creating it, give the script execute permission
chmod +x jenkins/script/pre-build.sh
The first line #!/bin/bash tells the system to interpret the script with **/bin/bash**
#!/bin/bash
packageJsonChange=false # package.json change flag
# set -x
#First point npm at the Taobao registry to speed up installs; replace it with your own registry if that's faster for you
npm config set registry https://registry.npm.taobao.org
#Check whether the cache directory exists
if [ ! -d $cacheDir ]
then
    echo "no cache dir"
    mkdir $cacheDir
    cp package.json "$cacheDir/"
    touch $cacheCommitIDFile
    # echo $GIT_COMMIT > $cacheCommitIDFile # no need to write the ID on the first build
    echo "npm install"
    npm i || exit 1
#Check whether the cache files exist
elif [ ! -f $cachePackage ] || [ ! -f $cacheCommitIDFile ]
then
    if [ ! -f $cachePackage ]
    then
        echo "cache file package.json does not exist"
        echo "cp package.json to cache dir"
        cp package.json "$cacheDir/"
    fi
    if [ ! -f $cacheCommitIDFile ]
    then
        echo "cache file commitIDCache.txt does not exist"
        echo "create commitIDCache.txt in cache dir"
        echo "write the ID to commitIDCache.txt"
        touch $cacheCommitIDFile
        echo $GIT_COMMIT > $cacheCommitIDFile
    fi
    rm -rf node_modules
    sleep 1
    echo "npm install"
    npm i || exit 1
else
    echo "cache file exists"
    # Check whether the new commit ID already exists in the cache
    for commitId in `cat "$cacheCommitIDFile"`
    do
        if [ $commitId = $GIT_COMMIT ]
        then
            isResetID=true
        fi
    done
    # Compare package.json with the cached copy
    cmp -s package.json $cachePackage
    cmpFlag=$?
    if [ $cmpFlag != 0 ]
    then
        packageJsonChange=true
    fi
    if [ "$isResetID" = true ]
    then
        echo "reset"
        cp -f package.json "$cacheDir/"
        # Create the rollback flag file so the pipeline can skip unnecessary stages
        touch $resetFlagFile
    elif [ $packageJsonChange = true ]
    then
        echo "package.json updated"
        cp -f package.json "$cacheDir/"
        rm -rf node_modules
        npm i || exit 1
        sleep 1
    else
        echo "nothing to do at all, ha ha"
    fi
fi
# set +x
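The cache-bootstrap branch of the script can be exercised locally before wiring it into Jenkins; a minimal sketch, with /tmp paths standing in for the pipeline's environment variables:

```shell
#!/bin/bash
# Sketch of the cache-bootstrap branch above; /tmp paths stand in for the
# pipeline environment variables cacheDir / cacheCommitIDFile (assumptions).
cacheDir=/tmp/ci-stage
cacheCommitIDFile=$cacheDir/.commitIDCache.txt

rm -rf $cacheDir                       # start clean for the demo
cd /tmp && echo '{"name":"demo"}' > package.json

if [ ! -d $cacheDir ]; then
    echo "no cache dir"
    mkdir $cacheDir
    cp package.json "$cacheDir/"       # seed the cache with the current package.json
    touch $cacheCommitIDFile           # empty commit-ID history on the first build
fi
```

On a second run the `if` branch is skipped, which is exactly how the real script avoids reinstalling dependencies.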
Update, push to the remote repository, and go to Jenkins to view the build status
- Once it’s built, let’s go into the workspace, either through the Web interface or the Docker container command line
- Using the web interface is simple, just click through
- The command line
#A workspace is usually named project-name_branch-name
#SSH into the host environment, then enter the Jenkins container
docker exec -it jenkins /bin/bash
cd /var/jenkins_home/workspace
cd vue-ci-start_dev
ls
We can see the directory stage and the files stage/package.json and stage/.commitIDCache.txt that we created in the pre-build step
There are also pulled project files, installed node_modules, etc
PS: Remember this .commitIDCache.txt file, which we use to check for duplicate IDs. After we pack and compress successfully, we write the commit ID into it
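The duplicate-ID check boils down to a line lookup in that file; a small sketch with made-up commit IDs:

```shell
#!/bin/bash
# Sketch of the rollback check: a commit ID already recorded in the cache file
# means the push is a rollback to an earlier build. The IDs here are made up.
cacheCommitIDFile=/tmp/.commitIDCache.txt
printf 'aaa111\nbbb222\n' > $cacheCommitIDFile

GIT_COMMIT=bbb222                       # pretend Jenkins handed us this commit
if grep -qx "$GIT_COMMIT" $cacheCommitIDFile; then
    echo "rollback detected"
else
    echo "new build"
fi
```

`grep -qx` matches whole lines quietly, so the exit status alone tells us whether the ID was seen before.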
3 build-env stage
- Run the corresponding build script for the branch
Here we mainly build our application. Since the dev, release, and master environment code all needs to be built, we use a parallel stage,
and when the webhook triggers an SCM task, we use when to determine which branch's build step runs
stage('build-env') {
when {
anyOf {
branch 'dev'
branch 'release'
branch 'master'
}
}
// If failFast is set to true, when any of the parallel stages fails they all fail.
// We don't need that here, since we only use parallel to choose which branch's build runs when a branch task is triggered
failFast false
parallel {
stage('build-dev') {
when {
// beforeAgent evaluates the when condition before entering the agent; if the condition fails, the agent is never entered
// This can speed up the pipeline
beforeAgent true
branch 'dev'
}
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
steps {
echo "build-dev"
sh "./jenkins/script/build-dev.sh"
}
}
stage('build-release') {
when {
beforeAgent true
branch 'release'
}
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
steps{
echo "build-release"
sh "./jenkins/script/build-release.sh"
}
}
stage('build-master') {
when {
beforeAgent true
branch 'master'
}
agent {
docker {
image 'node:10.21.0'
reuseNode true
}
}
steps {
echo "build-master"
sh "./jenkins/script/build-master.sh"
}
}
}
}
Create three build scripts in Jenkins /script: build-dev.sh, build-release.sh, and build-master.sh
Corresponding to the construction of dev development environment, release test environment and master production environment respectively
#build-dev.sh
npm run build-dev || exit 1
#build-release.sh
npm run build-release || exit 1
#build-master.sh
npm run build || exit 1
PS: Remember to give chmod +x permission to execute the script
The scripts actually run are the commands defined under scripts in package.json
"scripts": {
    "serve": "vue-cli-service serve",
    "build-dev": "vue-cli-service build --mode dev",
    "build-release": "vue-cli-service build --mode prod",
    "build": "vue-cli-service build",
    "lint": "vue-cli-service lint"
},
--mode [mode] loads the corresponding .env.[mode] file: for example, --mode dev uses .env.dev in the project root
For details, see the official vue-CLI documentation
Now let’s test running packaging in each branch
4 artifacts-manage stage
- After the build, the /dist resources are packed, compressed, and stored in the cache directory or a repository; they can be downloaded for reference and make redeployment after a branch rollback fast
- We also cap the number of stored artifacts, so the disk doesn't fill up when there are many tasks and builds
Our small team uses one of the simplest archiving methods; of course nothing stops you from setting up your own Nexus or similar artifact platform
By the artifacts-manage stage, the files we packed successfully are all in the dist directory at the project root; this is controlled by webpack and can be changed
Jenkinsfile Adds execution script code
stage("artifacts-manage"){
steps {
echo "artifacts"
sh './jenkins/script/artifacts-manage.sh'
}
}
Create a new script jenkins/script/artifacts-manage.sh and chmod +x it to give execute permission
#! /bin/bash
#set -x
#Create a cache directory
if [ ! -d $artifactsDir ]
then
mkdir $artifactsDir
fi
#Package and compress dist files
tar -zcvf $artifactsDir/${GIT_COMMIT}_dist.tar.gz dist
FileNum=$(ls -l $artifactsDir | grep ^- | wc -l)
#Update the contents of the cache library
while [ $FileNum -gt $cacheCommitIDMax ]
do
OldFile=$(ls -rt $artifactsDir/* | head -1)
echo "Delete File:$OldFile"
rm -f $OldFile
let "FileNum--"
done
#Here you can save commitID in the previous cache file
CommitIDNum=`cat $cacheCommitIDFile | wc -l`
if [ $CommitIDNum -ge $cacheCommitIDMax ]
then
sed -i '1d' $cacheCommitIDFile
fi
echo
sleep 1
echo $GIT_COMMIT >> $cacheCommitIDFile
# set +x
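The FIFO pruning loop above can be tried on dummy files; a sketch with the cap set to 2 instead of $cacheCommitIDMax:

```shell
#!/bin/bash
# Sketch of the artifact-pruning loop, capped at 2 files (stand-in for cacheCommitIDMax).
artifactsDir=/tmp/ci-artifacts
rm -rf $artifactsDir && mkdir -p $artifactsDir
for f in aaa bbb ccc; do
    echo dummy > $artifactsDir/${f}_dist.tar.gz
    sleep 1                                        # make mtimes distinguishable
done

FileNum=$(ls -l $artifactsDir | grep ^- | wc -l)
while [ $FileNum -gt 2 ]; do
    OldFile=$(ls -rt $artifactsDir/* | head -1)    # oldest file by mtime
    echo "Delete File:$OldFile"
    rm -f $OldFile
    let "FileNum--"
done
ls $artifactsDir                                   # only the two newest remain
```

`ls -rt` sorts by modification time, oldest first, so `head -1` always names the next file to evict.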
Let’s do another push to trigger the Jenkins pipeline, then go into the Jenkins workspace
After packaging we see the generated dist folder in the project directory, and our cache files under stage
Viewing the .commitIDCache.txt file, we find the git commit ID 16c0451ba1368ada4e29ad6ac3ffb44f0ddb52a0 written inside
Looking at the stage/artifacts directory, we find our packed archive 16c0451ba1368ada4e29ad6ac3ffb44f0ddb52a0_dist.tar.gz
Since we cache five versions, let's test the pruning by doing five more commit pushes to see whether the original ID and file are removed from the cache, as the image above shows
The cached commit IDs and files were updated to the most recent versions; after the sixth version the first was deleted: 😝
Jenkins also provides the archiveArtifacts step to download the artifacts from the successful build
stage("artifacts-manage"){
steps {
echo "artifacts"
sh './jenkins/script/artifacts-manage.sh'
archiveArtifacts artifacts:"${artifactsDir}/${GIT_COMMIT}_dist.tar.gz" // Archive the compressed file we built
}
}
So we can download it easily from the Jenkins web page
The archived artifacts are placed here:
${JENKINS_HOME}/jobs/${JOB_NAME}/branch/${BRANCH_NAME}/builds/${BUILD_NUMBER}/stage/archive
5 reset stage
- This stage runs when a rollback is detected
- We need to add a little code to the build-env, artifacts-manage, and other stages so they are skipped during a rollback
Remember the resetFlagFile variable we set earlier in the pipeline environment? When the same commit ID shows up in the pre-build stage, the script runs:
echo "reset"
cp -f package.json "$cacheDir/"
touch ${resetFlagFile}
touch ${resetFlagFile} creates an empty file that serves as the rollback judgment flag for the pipeline
Add when in the build-env, artifacts-manage steps
when {
beforeAgent true
+ expression{
+     return !(fileExists("${resetFlagFile}"))
+ }
...
}
fileExists() is a pipeline built-in method that checks whether a file exists and returns a boolean
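The same gate can be checked in plain shell; a sketch where a /tmp flag file stands in for ${resetFlagFile}:

```shell
#!/bin/bash
# Sketch of flag-file gating: build stages run only while the rollback flag is absent.
resetFlagFile=/tmp/.resetFile
rm -f $resetFlagFile

if [ ! -f $resetFlagFile ]; then echo "build runs"; fi   # no flag yet

touch $resetFlagFile   # pre-build spotted a repeated commit ID
if [ ! -f $resetFlagFile ]; then echo "build runs"; else echo "build skipped"; fi
```

The `expression { !fileExists(...) }` in the Jenkinsfile is the same check, evaluated before each guarded stage.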
Now let’s do a normal push and a rollback push to see what happens
#Do two normal commit pushes first
git log
commit 7f31a184256b326cb353c506125d58b0efe28481 (HEAD -> dev, origin/dev)
Author: longming <[email protected]>
Date:   Thu Aug 6 13:35:32 2020 +0800

    normal 2

commit acbc3ea45dca16c82a39f2ec6e772eada57229f7
Author: longming <[email protected]>
Date:   Thu Aug 6 13:33:20 2020 +0800
Now roll back to version acbc3ea45dca16c82a39f2ec6e772eada57229f7
git reset --hard acbc3ea45dca16c82a39f2ec6e772eada57229f7
git push -f
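The reset can be rehearsed in a throwaway repo before touching the real one; a sketch with made-up commits (no remote, so the force push is omitted):

```shell
#!/bin/bash
# Rehearse the rollback in a throwaway repo; paths and messages are made up.
rm -rf /tmp/ci-demo
git init -q /tmp/ci-demo && cd /tmp/ci-demo
git config user.email ci@example.com
git config user.name ci

echo v1 > app.txt && git add app.txt && git commit -qm "normal 1"
echo v2 > app.txt && git commit -qam "normal 2"

git reset --hard HEAD~1   # back to "normal 1"; the real repo would now need git push -f
cat app.txt               # the working tree is back to v1
```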
Enter the workspace and view stage/.commitIDCache.txt and the compressed packages under the artifacts directory. You can see both the write and the cache succeeded
The files do not change after a second rollback
Let’s take a look at the pipeline diagram of Blue Ocean
See, the build-env and artifacts-manage stages are skipped during the rollback because there is no need to rebuild or re-cache, which is what makes rollback take seconds
The code for the reset stage is here
stage("reset"){
when {
beforeAgent true
expression{
return fileExists("${resetFlagFile}")
}
anyOf {
branch 'dev'
branch 'release'
branch 'master'
}
}
steps {
echo "It's rolling back aaahhhhh"
// Of course we are here for easy download
archiveArtifacts artifacts:"${artifactsDir}/${GIT_COMMIT}_dist.tar.gz"
}
}
6 deliver stage
This step is used in the development environment, test environment, and pre-release environment, where the packaged code is automatically delivered
- Install the resource server
Here we install Nginx; see the Nginx article for details
- Configure SSH
At this stage we first configure SSH from the Jenkins container to the server that hosts the resources. For details, see the SSH configuration earlier
The question is, where does the server come from? I went all in on a few months of an Alibaba Cloud 🌥 Hong Kong server ha ha 🍖 (in mainland China the website must be ICP-filed first)
We configure Jenkins' SSH connection to the resource server and set the Host name to server
To test, connect to the resource server by running ssh server inside the Jenkins application container
Ps: If the connecting user is not the owner of the /data upload directory, change its ownership
#Switch to root and give ownership of the /data directory to eric, the user we use for SSH connections to the resource server
chown -R eric /data
- Deliver script
Add the Host name of the SSH connection as an environment variable in the Jenkinsfile
pipeline{
    ...
    environment {
        ...
+       sshHostName = "server"
    }
    ...
}
Code for the Deliver Stage
stage('deliver') {
    when {
        beforeAgent true
        anyOf {
            branch 'dev'
            branch 'release'
        }
    }
    steps {
        echo "start deliver"
        sh "./jenkins/script/deliver.sh"
    }
}
// The rollback flag file needs to be deleted at the end of the pipeline run, in post
post{
    always{
    }
}
Create the new deliver script jenkins/script/deliver.sh
#!/bin/bash
# set -x
#If it is a rollback we don't need to upload the artifact again; the server already has it
if [ ! -f ${resetFlagFile} ]
then
    # scp the compressed artifact to the resource server
    scp ${artifactsDir}/${GIT_COMMIT}_dist.tar.gz ${sshHostName}:/data/vueci/${BRANCH_NAME}/ || exit 1
fi
#Remotely run the unpack-and-publish script on the server
ssh ${sshHostName} /data/vueci/deploy.sh ${BRANCH_NAME} ${GIT_COMMIT} || exit 1
# set +x
After creating it, give the script execute permission
chmod +x jenkins/script/deliver.sh
We also have to create the publish script **/data/vueci/deploy.sh** on the resource server
tar -zxvf /data/vueci/$1/$2_dist.tar.gz -C /data/vueci/$1/
After creating it, give the script execute permission
chmod +x /data/vueci/deploy.sh
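deploy.sh can be dry-run locally before putting it on the server; a sketch with /tmp/vueci standing in for /data/vueci and a made-up branch name and commit ID:

```shell
#!/bin/bash
# Dry-run of the server-side deploy.sh ($1 = branch, $2 = commit ID).
# /tmp/vueci stands in for /data/vueci; "dev" and "abc123" are made up.
base=/tmp/vueci
rm -rf $base /tmp/dist
mkdir -p $base/dev /tmp/dist
echo "<h1>hello</h1>" > /tmp/dist/index.html

# What the deliver stage would scp to the server:
tar -zcf $base/dev/abc123_dist.tar.gz -C /tmp dist

# The one-liner from deploy.sh, with the positional args filled in:
tar -zxf $base/dev/abc123_dist.tar.gz -C $base/dev/
cat $base/dev/dist/index.html
```

Unpacking into /data/vueci/<branch>/ means each branch serves its own dist directory, which is what Nginx points at.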
Now let’s test delivery to the DEV environment
git checkout dev
#Make some changes, then
git commit -am "commit to dev"
Take a look at the effect; this is, in fact, the development environment ha ha ha
7 deploy stage
It's release time; this stage adds a pre-release confirmation button for the production branch
This stage is similar to deliver, plus an input step for publishing. Take a look at the code
stage('deploy') {
when {
beforeAgent true
branch 'master'
}
steps {
// This generates a button that we use to publish manually
input message: "Should you deploy?"
echo "start deploy"
sh './jenkins/script/deploy.sh'
}
}
Create the new release script jenkins/script/deploy.sh
Give the script execute permission
chmod +x jenkins/script/deploy.sh
The deploy.sh script here is no different from the deliver script (although the scripts run in production and test environments generally differ)
#! /bin/bash
# set -x
#If it is a rollback we don't need to upload the artifact again; the server already has it
if [ ! -f $resetFlagFile ]
then
    # scp the compressed artifact to the resource server
    scp ${artifactsDir}/${GIT_COMMIT}_dist.tar.gz ${sshHostName}:/data/vueci/${BRANCH_NAME} || exit 1
fi
#Remotely run the unpack-and-publish script on the server
ssh ${sshHostName} /data/vueci/deploy.sh ${BRANCH_NAME} ${GIT_COMMIT} || exit 1
# set +x
Git Flow workflow
Ps: 🌏 **Note: dev is treated as the feature branch here** to keep the demo simple
Let’s switch to the dev branch
#Switch to local dev
git checkout dev
#Update remote dev if there are multiple people coordinating development on this branch
git pull
Let’s change what we showed before
#A series of commit operations
git status
git add .
git commit -m
#Our feature branch was checked out from the remote master. During development, master may ship other features, so we need to merge in the updates
git merge origin/master
#If the merge conflicts, we resolve the conflict
git status
git add .
git commit -m "merge origin/master"
#After resolving conflicts (or if there were none), just push
git push
Copy the code
Now we go to the remote repository and request that the feature be merged into release for testing
We see that the build task is triggered when the merge is successful
PS: If merging dev into release conflicts, you can first merge release into dev, resolve the conflict locally, and then request the merge, so there is no conflict on the Gitee side
Once testing on the merged release looks good
Now that we’re ready to go live, we need to merge release into the Master branch
After the merger, we can see
Once the merge succeeds, the pipeline pauses at the deploy stage; the production environment code has not been updated yet
To publish, click deploy above or the Continue button
8 Notifications
-
The deployment works, but we won't know when a build finishes unless we keep watching the web page or ask the Jenkins admin, which is too much trouble
If your company has DingTalk or WeChat Work, configuring their notifications is very simple; here we'll cover the most universal option, email
To configure email notifications we need to install a couple of plugins
**Email Extension:** for setting up email sending
**Config File Provider:** a file storage plugin we use to hold the email templates
Originally I configured a QQ mailbox, but notifications arrived slowly, so I switched to NetEase's 163 mail. Of course most companies use an enterprise mailbox; the configuration is almost the same
- Enable the mailbox's SMTP service
Open our 163 mailbox settings, enable the two SMTP services, then get the authorization code; copy it for use later
- Jenkins configuration mailbox
Remember to test the sending of emails after configuration
- Configuring email Templates
After installing the plugin, Config File Provider will see the template File we used to configure mail under system administration
Click to configure the mail template file
Since we want to send mail notifications after both successful and failed builds, we need two mailbox templates,
Here I designed a simple email template using Sketch
The template code for the corresponding message:
email-success.tpl
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8" />
<title>${ENV, var="JOB_NAME"}</title>
<style>
html,
body {
font-family: "SF Pro SC", "SF Pro Text", "SF Pro Icons", "PingFang SC", "Helvetica Neue", "Helvetica", "Arial", sans-serif;
}
.table.success {
border: 1px solid #ddd;
color: #4a4a4a;
}
.table.success td {
font-size: 16px;
overflow: hidden;
word-break: break-all;
}
td.info {
background-color: #dfb051;
height: 44px;
color: #fff;
}
td:first-child {
padding-left: 49px;
padding-right: 49px;
}
td.build-status {
background-color: #76ac35;
height: 44px;
color: #fff;
}
.table.success td pre {
white-space: pre-wrap;
}
.success-ico {
vertical-align: middle;
margin-left: 10px;
width: 25px;
/* height: 30px; */
/* margin-bottom: 20px; */
}
td.info-title {
color: #76ac35;
height: 36px;
font-weight: bold;
font-size: 16px;
}
.table.success ul {
list-style: none;
margin: 0;
padding: 0;
}
.table.success ul li {
list-style: none;
}
ul li {
height: 35px;
line-height: 35px;
font-size: 14px;
white-space: nowrap;
}
hr {
margin-left: 0;
}
</style>
</head>
<body leftmargin="8" marginwidth="0" topmargin="8" marginheight="4" offset="0">
<table width="95%" cellpadding="0" cellspacing="0" class="table success">
<tr>
<td class="info">This email is sent automatically by jenkins. Please do not reply!</td>
</tr>
</table>
<br />
<table width="95%" cellpadding="0" cellspacing="0" class="table success">
<tr>
<td class="build-status">
Build result
<img
class="success-ico"
src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACMAAAAjCAYAAAFp3oPPAAAAAXNSR0IArs4c6QAAA/ZJREFUWAm9mLtPVEEUxlkDAaKhQY1iKyWPAmNhYazUAmOMrZRYGQtii9FWwI4e/wA1JibamKixQChITEisqFgNaGI0FDasv2/unNm5r727sDLJt+f1nTNz53VvtqfHt0ajUTe9ZookgdeIN1ICwwiOSaBhDuRHp+O7JwX5wTm8MY7jlXOg9IO5EC1UYLgWglgDwTAFZ7/pJvFtOx1lInKOR/od6cQH9eOGhSxsliSiK4tcMGYIZhUIr0AdhG4DB2cywOBxY2k+CASbQy2Ba/jcjCETIoqbD6R7GrHQQ3fo4emTEgW/kOq9PjNey0Ct0TDWVdqtJzLXAjtWItaC/Njpp8YRdgj6AHBPHRcJuoJKEDE4vYJv0MdaFkh3n61SYFM0naNeYh62a7FPepEfX5KLkhyEbFYHtmrU+KmzPCPWYwf5PX5dNdL6MRKvo9jELnZQyHF97jWXh6GVcpsU2Quqmu1AXQv5FcS5DZrHo2R44ohbEk67IU4AO6Q6qNKrT0y6jFvqeRLXwV8v57OcQhtyH9gBN4oI8vt4X1FcSzcFVgqDGad44qfcOEbAg5SzwhBfeYGGsWkG+gnwzexYyq+4+dCTPJRH5pSUDdRGM/7RxJ3nK2kvJvtCT7O+先生/L5+d+UUK7PuXrLK21m1DCW1MRe8MuUbWTtuSL1lTE1vtFSU9lbuNP6Y7Z4644LqaGUZaR9Uf3y55G4l4BnnQ3Sy6xY16SzwDizbasEbVoy1YYTsjTY6S2PfYw0EGMm+zhqEB62ysA4XAHMKp+uKvACplkZG1fSm6jWWKZpOAQMX2BzIDT4C14D3Tf6sT/ACfBWXAOXAZXwQ54BhbZEr+RB2sMQIv2GWyCwhuz3crK93VUr3mXVRWArAX/Azq6NKvqWlx1fX07uRZqSgjaLBr5StP7/zT1A1ZB+j2EYwjonEwXdY9fHwLW9lHCeSrh6xyKZy3/MUAiQXsxak8mDeeWAmZnpU9CuLbLb3+WE9uKA/GsVdXeUr4uTL3/J9ntt+KCR6kzhuf0t6Gp0ofJ7aPsPNuX+tc49JE3Br7EBAI3sSdjX5f1DVbiZVRT/Y9pME5Bfo2C59EfRna31ey14SakcM8wM3pvfgenuj0K6u2CM8zMvtWmv2TPyIGxBVI7HlvH8xfoZlO91LWArePtTpMbHIbdM6kB+YFWfQSQ3lYLHwuu02QS8veM77TlDUx3s6DTmRJ/1jqPJf7cDZx7a0O6QNI78Jh1fRIXMB2O9tQlcBHoxaep/wn0D88q+BTvCezQyNXm1d12Bc56CLRSSIrf2tOtuFUxak0Dvf1bvrVzM1NUmCJD+OfADMh+z2g2NCuaHc3Sgb9n/gFyaAu+lH9E5AAAAABJRU5ErkJggg==" />
</td>
</tr>
<tr>
<td class="info-title">Build information<!-- <hr size="2" width="100%" align="center"/> --></td>
</tr>
<tr>
<td>
<ul>
<li>Project name: ${PROJECT_NAME}</li>
<li>Build cause: ${CAUSE}</li>
<li>Build log: <a href="${BUILD_URL}console">${BUILD_URL}console</a></li>
<li>Project address: <a href="${PROJECT_URL}">${PROJECT_URL}</a></li>
</ul>
<hr size="2" width="100%" />
</td>
</tr>
<tr>
<td class="info-title">Change information</td>
</tr>
<tr>
<td>
<ul>
<li>
${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Updated by: %c",showPaths=true,changesFormat="[%a]",pathFormat=" %p"}
</li>
<li>
${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Update info: %c",showPaths=true,changesFormat="%m",pathFormat=" %p"}
</li>
</ul>
<!-- ${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Changes for Build #%n:<br />%c<br />",showPaths=true,changesFormat="<pre>[%a]<br />%m</pre>",pathFormat=" %p"} -->
<!-- <hr size="2" width="100%"/> -->
</td>
</tr>
<!-- <tr>
<td class="info-title">Build log (last 100 lines)</td>
</tr>
<tr>
<td><p><pre> ${BUILD_LOG, maxLines=100}</pre></p></td>
</tr> -->
</table>
</body>
</html>
email-fail.tpl
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8" />
<title>${ENV, var="JOB_NAME"}</title>
<style>
html,
body {
font-family: "SF Pro SC", "SF Pro Text", "SF Pro Icons", "PingFang SC", "Helvetica Neue", "Helvetica", "Arial", sans-serif;
}
.table.success {
border: 1px solid #ddd;
color: #4a4a4a;
}
.table.success td {
font-size: 16px;
overflow: hidden;
word-break: break-all;
}
td.info {
background-color: #dfb051;
height: 44px;
color: #fff;
}
td:first-child {
padding-left: 49px;
padding-right: 49px;
}
td.build-status {
background-color: #76ac35;
height: 44px;
color: #fff;
}
.table.success td pre {
white-space: pre-wrap;
}
.success-ico {
vertical-align: middle;
margin-left: 10px;
width: 25px;
/* height: 30px; */
/* margin-bottom: 20px; */
}
td.info-title {
color: #76ac35;
height: 36px;
font-weight: bold;
font-size: 16px;
}
.table.success ul {
list-style: none;
margin: 0;
padding: 0;
}
.table.success ul li {
list-style: none;
}
ul li {
height: 35px;
line-height: 35px;
font-size: 14px;
white-space: nowrap;
}
hr {
margin-left: 0;
}
</style>
</head>
<body leftmargin="8" marginwidth="0" topmargin="8" marginheight="4" offset="0">
<table width="95%" cellpadding="0" cellspacing="0" class="table success">
<tr>
<td class="info">This email is sent automatically by jenkins. Please do not reply!</td>
</tr>
</table>
<br />
<table width="95%" cellpadding="0" cellspacing="0" class="table success">
<tr>
<td class="build-status">
Build result
<img
class="success-ico"
src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACMAAAAjCAYAAAFp3oPPAAAAAXNSR0IArs4c6QAAA/ZJREFUWAm9mLtPVEEUxlkDAaKhQY1iKyWPAmNhYazUAmOMrZRYGQtii9FWwI4e/wA1JibamKixQChITEisqFgNaGI0FDasv2/unNm5r727sDLJt+f1nTNz53VvtqfHt0ajUTe9ZookgdeIN1ICwwiOSaBhDuRHp+O7JwX5wTm8MY7jlXOg9IO5EC1UYLgWglgDwTAFZ7/pJvFtOx1lInKOR/od6cQH9eOGhSxsliSiK4tcMGYIZhUIr0AdhG4DB2cywOBxY2k+CASbQy2Ba/jcjCETIoqbD6R7GrHQQ3fo4emTEgW/kOq9PjNey0Ct0TDWVdqtJzLXAjtWItaC/Njpp8YRdgj6AHBPHRcJuoJKEDE4vYJv0MdaFkh3n61SYFM0naNeYh62a7FPepEfX5KLkhyEbFYHtmrU+KmzPCPWYwf5PX5dNdL6MRKvo9jELnZQyHF97jWXh6GVcpsU2Quqmu1AXQv5FcS5DZrHo2R44ohbEk67IU4AO6Q6qNKrT0y6jFvqeRLXwV8v57OcQhtyH9gBN4oI8vt4X1FcSzcFVgqDGad44qfcOEbAg5SzwhBfeYGGsWkG+gnwzexYyq+4+dCTPJRH5pSUDdRGM/7RxJ3nK2kvJvtCT7O+先生/L5+d+UUK7PuXrLK21m1DCW1MRe8MuUbWTtuSL1lTE1vtFSU9lbuNP6Y7Z4644LqaGUZaR9Uf3y55G4l4BnnQ3Sy6xY16SzwDizbasEbVoy1YYTsjTY6S2PfYw0EGMm+zhqEB62ysA4XAHMKp+uKvACplkZG1fSm6jWWKZpOAQMX2BzIDT4C14D3Tf6sT/ACfBWXAOXAZXwQ54BhbZEr+RB2sMQIv2GWyCwhuz3crK93VUr3mXVRWArAX/Azq6NKvqWlx1fX07uRZqSgjaLBr5StP7/zT1A1ZB+j2EYwjonEwXdY9fHwLW9lHCeSrh6xyKZy3/MUAiQXsxak8mDeeWAmZnpU9CuLbLb3+WE9uKA/GsVdXeUr4uTL3/J9ntt+KCR6kzhuf0t6Gp0ofJ7aPsPNuX+tc49JE3Br7EBAI3sSdjX5f1DVbiZVRT/Y9pME5Bfo2C59EfRna31ey14SakcM8wM3pvfgenuj0K6u2CM8zMvtWmv2TPyIGxBVI7HlvH8xfoZlO91LWArePtTpMbHIbdM6kB+YFWfQSQ3lYLHwuu02QS8veM77TlDUx3s6DTmRJ/1jqPJf7cDZx7a0O6QNI78Jh1fRIXMB2O9tQlcBHoxaep/wn0D88q+BTvCezQyNXm1d12Bc56CLRSSIrf2tOtuFUxak0Dvf1bvrVzM1NUmCJD+OfADMh+z2g2NCuaHc3Sgb9n/gFyaAu+lH9E5AAAAABJRU5ErkJggg==" />
</td>
</tr>
<tr>
<td class="info-title">Build information<!-- <hr size="2" width="100%" align="center"/> --></td>
</tr>
<tr>
<td>
<ul>
<li>Project name: ${PROJECT_NAME}</li>
<li>Build cause: ${CAUSE}</li>
<li>Build log: <a href="${BUILD_URL}console">${BUILD_URL}console</a></li>
<li>Project address: <a href="${PROJECT_URL}">${PROJECT_URL}</a></li>
</ul>
<hr size="2" width="100%" />
</td>
</tr>
<tr>
<td class="info-title">Change information</td>
</tr>
<tr>
<td>
<ul>
<li>
${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Updated by: %c",showPaths=true,changesFormat="[%a]",pathFormat=" %p"}
</li>
<li>
${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Update info: %c",showPaths=true,changesFormat="%m",pathFormat=" %p"}
</li>
</ul>
<!-- ${CHANGES_SINCE_LAST_SUCCESS,reverse=true,format="Changes for Build #%n:<br />%c<br />",showPaths=true,changesFormat="<pre>[%a]<br />%m</pre>",pathFormat=" %p"} -->
<!-- <hr size="2" width="100%"/> -->
</td>
</tr>
<!-- <tr>
<td class="info-title">Build log (last 100 lines)</td>
</tr>
<tr>
<td><p><pre> ${BUILD_LOG, maxLines=100}</pre></p></td>
</tr> -->
</table>
</body>
</html>
I keep the two files in the /jenkins directory for future changes and for copy-and-paste debugging.
The templates are set up. How do we use them? After configuring the templates, we need to reference them in the pipeline code:
post {
changed{
echo 'I changed!'
}
failure{
echo 'I failed! '
configFileProvider([configFile(fileId: 'email-tmp-fail', targetLocation: 'email-fail.html', variable: 'content')]) {
script {
template = readFile encoding: 'UTF-8', file: "${content}"
emailext(
subject: "Job [${env.JOB_NAME}] - Status: fail",
body: """${template}""",
recipientProviders: [culprits(), requestor(), developers()],
to: "[email protected]"
)
}
}
}
success{
echo 'I success'
configFileProvider([configFile(fileId: 'email-tmp-success', targetLocation: 'email-success.html', variable: 'content')]) {
script {
template = readFile encoding: 'UTF-8', file: "${content}"
emailext(
subject: "Job [${env.JOB_NAME}] - Status: Success",
body: """${template}""",
recipientProviders: [requestor(), developers()],
to: "[email protected]"
)
}
}
}
always{
echo 'I always'
script{
if(fileExists("${resetFlagFile}")){
sh "rm -r ${resetFlagFile}" // delete the rollback marker file
}
}
}
unstable{
echo "unstable"
}
aborted{
echo "aborted"
}
}
- **configFileProvider:** the wrapper step provided by the Config File Provider plugin we installed
- **fileId:** the ID of the email template file we configured above
- **targetLocation:** where the template file is written at build time; here it is email-fail.html in the workspace root
- **variable:** the name of the variable that holds the path of the generated file
- **readFile:** a built-in pipeline step that reads a file
- **emailext:** the step provided by the Email Extension plugin
- **subject:** String, the subject of the email
- **body:** String, the body of the email
- **from:** the sender's address (we configured a default in Jenkins)
- **to:** String, the recipient(s)
- **recipientProviders:** List, the recipient providers
| Type name | Helper method | Description |
| --- | --- | --- |
| Culprits | culprits() | The people who may have broken the build: committers of changes between the last successful build and the latest failed build |
| Developers | developers() | All committers of the changes involved in this build |
| Requestor | requestor() | The person who requested the build, generally whoever triggered it manually |
Now let's trigger one failed and one successful build to see the email notifications in action.
Of course, you can also configure DingTalk or WeCom (enterprise WeChat) notifications, as well as custom HTTP notifications.
The next article
It covers docker-compose.
Click here to go to the next post
Get the complete mini-manual
Follow the official account "front-end small manual" and reply "small manual"
to download a PDF version of the whole series.
Typora’s night theme is:
Thank you for your attention
Finally, I hope you will give us a follow ~~
Your small follow is great motivation for us; we will keep pushing original, quality articles your way.
This is the QR code of our official account